“Every once in a while, a new technology, an old problem, and a big idea turn into an innovation” – Dean Kamen
Success in achieving an organization’s mission requires an efficient and effective process. During my time as senior program and portfolio management lead at Johnson & Johnson, one of my primary objectives was to design, introduce, and implement new processes to improve outcomes. I spent significant time each year developing the strategic plan and aligning it to the business goals. Creating the future vision of the organization always came with the challenge of building the systems needed to drive the sought-after breakthrough results. While incremental change is often more palatable and comforting, in my experience the more fundamental and transformational change, often driven by the adoption of innovative technology solutions, is essential to achieving the desired results. Combining technology innovations with the principles of Design Excellence and thoughtful, deliberate change management can bring new perspective to workflows and processes that remain wedded to the limitations of the information flows and intelligence that preceded today’s technology.
During that time, and even now, there has been confusion about the difference between Business Process Improvement (BPI) and Business Process Re-engineering (BPR). While often used interchangeably, these two transformations are in fact very different, requiring different levels of effort and producing widely different outcomes. Understanding the difference and how it applies to your organization – especially in the public sector, which often lags commercial organizations by several years – is critical to meeting your organization’s mission.
Today, many leaders understand that improving process is critical to achieving a systemic, sustained operational advantage. So it is no surprise that Business Process Improvement (BPI) techniques are part of the fabric of many organizations, or that another term, Business Process Re-engineering (BPR), is sometimes used interchangeably with BPI.
What’s the Difference Between BPI and BPR?
While BPI and BPR are related and rely on many of the same tools and techniques, the two approaches have different objectives.
- BPI improves existing processes, removes bottlenecks, and “tweaks” the status quo.
- BPR reimagines the process from a “clean sheet” with a focus on desired outcomes, challenging long-standing beliefs and assumptions and establishing new ones.
This distinction can be seen in the two Six Sigma methodologies deployed for process improvement, DMAIC and DMADV.
- DMAIC (Define, Measure, Analyze, Improve, Control) is consistent with the goals of BPI.
- DMADV (Define, Measure, Analyze, Design, Validate) is much more akin to BPR.
For this reason, ongoing Business Process Improvement must be part of an organization’s fabric, constantly seeking ways to optimize. The last several decades have proven that processes can always stand to be more efficient – especially considering emerging technology. Improving a process is the safer route because it doesn’t disrupt the status quo. However, it also doesn’t fundamentally change a process to deliver substantially improved outcomes – often it just makes the process more efficient through the application of new software.
Business process re-engineering, by contrast, can be seen as disruptive, time consuming, uncomfortable, and just plain hard, often sending tremors down a leader’s spine. But should this be the case, and even if so, is it still worth doing?
How we got here
Many of our current processes are artifacts of the way information once flowed from one task, function, operation, or person to the next. To prove this, review almost any process flow chart created in the last 40 years. The basis of our understanding of these kinds of “physical” flows is so entrenched in the way we do things that phrases like “check all the boxes,” “handoff,” and “sign off,” and the images of the manual processes they imply, are deeply ingrained in our thoughts and language.
But when one considers how this has evolved and its implications, it becomes clear that business process improvement often doesn’t go far enough to bring an organization into the on-demand age, limiting the impact of change.
The research on this change goes back more than 60 years, and since its publication the relationship between technology adoption and process improvement has only grown more symbiotic. In 1962, Everett Rogers, a professor of rural sociology, developed a theory in “Diffusion of Innovations” to explain the dispersion of technology adoption in society and organizations. In this work he identified five categories of adopters.
- Innovators (2.5%)
- Early Adopters (13.5%)
- Early Majority (34%)
- Late Majority (34%)
- Laggards (16%)
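These percentages are not arbitrary: Rogers assumed adoption times follow a bell curve and cut that curve at whole standard deviations from the mean adoption time. A minimal sketch of that arithmetic, using only the standard normal CDF (the category names and cut points are Rogers’s; his published figures round the tails slightly):

```python
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Rogers's categories are slices of the bell curve of adoption time,
# cut at whole standard deviations around the mean adoption time
# (negative x = earlier adoption).
cut_points = {
    "Innovators":     (float("-inf"), -2.0),  # earliest 2-sigma tail
    "Early Adopters": (-2.0, -1.0),
    "Early Majority": (-1.0,  0.0),
    "Late Majority":  ( 0.0,  1.0),
    "Laggards":       ( 1.0, float("inf")),
}

shares = {name: norm_cdf(hi) - norm_cdf(lo)
          for name, (lo, hi) in cut_points.items()}

for name, share in shares.items():
    print(f"{name:>14}: {share:6.1%}")
```

The exact tail values (about 2.3% and 15.9%) are what Rogers rounded to the familiar 2.5% and 16%, and the two middle slices each cover 34.1% of the curve.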
These groups run the spectrum from “first to try” Innovators, who are risk takers and explorers, to skeptical, change-resistant, traditionalist Laggards, who are the last to adopt a technology (if they adopt it at all), after it has gone fully mainstream and is almost unavoidable.
Where the original model suggests that 50% of the targets of technology adoption are at least pragmatists, generations raised on technology are shifting this curve to the left, with younger generations more willing to integrate new technologies into their work and day-to-day lives. This is where the public sector is particularly at risk. Historically, government institutions have fallen into the Laggard category. As a result, innovative younger generations who could transform the business of government are choosing the private sector.
This, combined with the need to compete globally and locally, makes continued delayed adoption of technology untenable. The need for change must be a catalyst for organizations to begin experimenting with new technology – not just to drive business process improvement, but to fundamentally re-engineer processes by rallying around technology solutions as enablers and facilitators.
Re-engineering in Action
Imagine if your organization were still creating content the same way it did 20 years ago.
Even the process of creating this article you are reading right now would be wholly different. New processes have been built on top of a technological framework that first targeted early adopters and eventually reached nearly all users.
In the past, a paper draft of this article would have to be completed and then physically circulated to interested or advising parties for comment. Integrating their changes, redlines, and comments was a herculean effort of reorganization: several people’s thoughts had to first be deciphered (if legible) and then synthesized back into the context of the article, usually erasing the path from where it started to its new version. To crosswalk the draft and revisions, you had to lay the separate versions side by side.
Today, we can invite co-authors, editors, and contributors to collaborate and watch their changes from the outset. They can comment, ask questions, even rewrite if you permit them (for all I know, someone changed or corrected some aspect of this sentence in the final revision on my behalf). The iteration cycles are real-time and targeted, as if we were writing the paper together in a room, with a complete recording of the dialogue of its evolution from concept to finished article.
This changes the way we work; it is a different process. It is no longer a sequential and closed-off series of steps. It has become an open, dynamic, and real-time iterative creation process of ideas, comments, edits, and revisions, all tracked and logged so the path to the final state is clear and traceable.
Now think about the problem of the Laggard. What if the final decision to print this is made by someone who refuses to adopt and participate in the technological marvel of modern content creation, and is instead waiting to redline a printed copy at the 11th hour? The process can be completely subverted, potentially forcing everyone back to the long cycle of passing around revisions just to keep significant effort from being undone and wasted.
Let’s take another example. I remember as a child going on a family vacation where my father let me ride “shotgun,” equipped with handwritten directions and a map that, when unfolded, was as big as I was. To get where we wanted to go, we had to look at the geography, find our starting point, and chart a course to our destination and any waypoints. As years passed, a device mounted to the dashboard helped guide the way – essentially a screen-based version of the map, just smart enough to know where we were and indicate the upcoming turns and interchanges. Now? In everyone’s pocket is a device that can take you anywhere, depending on your mode (driving, walking, public transportation, etc.). Not only does it know where we are, it knows where other users of the same technology are, how fast they are moving, and where hazards have been reported, giving travelers the ability to dynamically alter course to avoid traffic jams – and to slow down for hidden speed traps if in too much of a hurry. The evolution of this process of navigating, from the folded map and handwritten directions to today, changes mobility. The fear of getting lost, not knowing the way, dealing with detours? These are all fears of the past. So might be the knowledge of the geography of where you live and work that once came from studying a map and charting a course – but that’s easily solved for those interested and wanting to know.
Many powerful and well-designed technology solutions have emerged and changed the way we do things, automating, simplifying, and assisting us in many activities while layering in access to new tools and competencies we didn’t know we needed. Integrating technology makes organizations more efficient: these solutions are designed based on extensive research and updated with best practices gleaned from thousands or millions of users. Re-developing processes around technology is smart. It creates a sustainable advantage, lowers development costs, and provides ongoing access to process improvements across the customer base.
While business process re-engineering can be intimidating, organizations do not need to start with a blank sheet of paper. Leveraging best practices embedded in existing technologies can provide a blueprint or accelerator for introducing new processes. While it’s attractive to want to re-engineer everything to fit your organization exactly, procuring, customizing, and maintaining software in that environment is enormously time consuming and expensive. Instead, we recommend leveraging the flexibility of existing tools and fitting your process into an already defined approach.
Where small changes and incremental improvements were once challenging in and of themselves, intertwining technology adoption with BPR can automate, streamline, and reorganize the fundamentals of how we undertake certain processes. Technology solutions that focus more on the outcome of a process than on the traditional steps taken to achieve that outcome free people and teams to pursue the higher-order activities and needs of their organizations.
In the past, organizations would spend tremendous amounts of money and time building custom solutions fit to their processes; now, with the pace of technology innovation and adoption, the time has come to organize and design our processes around existing solutions.
If you believe your process is inefficient or outmoded and the excuse or defense against change is, “Yes, but that’s the way we’ve always done it,” then that’s the way it’ll always be. Change requires bold leaders to assess new approaches, identify enabling technology, and implement change.