Archive for the ‘BPM Value Proposition’ Category
Simulation and optimization are often touted as major benefits of BPM by vendors as well as industry analysts. The idea is that BPM systems (BPMS) with these capabilities enable business analysts to simulate the performance of the process through the computing power of software, identify bottlenecks, and optimize the process to best achieve the goals of the company. Even more appealing is a feature called “round-trip optimization”, which simply means that a BPMS can capture operational metrics from actual process incidents that have been completed, and then allow business analysts to use these real metrics, instead of merely good assumptions, for the simulation and optimization of the process. It is common for industrial engineers to do time-motion studies on the factory production line to improve the efficiency of work. Round-trip optimization, often simply called “round-tripping”, is the equivalent of BPMS doing time-motion studies automatically and making the metrics available for use in simulations. And then the simulation capability enables the analysts to optimize the process.
It is not surprising that analysts and some prospects are so hot on simulation, optimization and round-tripping. These are huge benefits that appear to be waiting at the fingertips of BPMS users. For example, in a recent blog, Jim Sinur, the leading BPM analyst at Gartner, writes as follows:
“Most people think that simulation is hard and is for those gifted with deep math skills. Today nothing could be farther from the truth. Simulation is nicely embedded in process modeling and BPM engines and pretty easy to use even for business folks. Another big fallacy is that you have to set up lots of test data to make simulation work. Most of the simulators, today, will generate appropriate instances of process based on arrival rates.”
In my humble opinion and with due respect to Jim Sinur and Gartner, this is a big overstatement. Yes, BPMS can do simulation. However, the results of simulation will depend entirely on the assumptions the business analyst makes about a large number of parameters. For example, the business analyst has to make assumptions about the following for each step in the process:
- The Task Time, which is the time to actually do the task at the step
- Number of resources: how many people are available to do the task, and are they dedicated or shared with other tasks?
- The cost of each resource
- The probability that the step will be activated, in case it is conditional
- The probability that the user will send the case backwards, because in real life things often go backwards instead of always going forward as planned.
These assumptions have to be made for all the steps in the process. So if there are 30 steps, there are at least 150 assumptions! That is no small task that “business folks” will engage in lightly.
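To make the bookkeeping concrete, the per-step parameters above can be sketched as a simple record; the field names here are my own illustration, not drawn from any particular BPMS:

```python
from dataclasses import dataclass, fields

@dataclass
class StepAssumptions:
    """The estimates an analyst must supply for ONE step of the process."""
    task_time_minutes: float       # Task Time: time to actually do the task
    num_resources: int             # people available (dedicated or shared?)
    cost_per_hour: float           # cost of each resource
    activation_probability: float  # chance a conditional step is activated
    rework_probability: float      # chance the case is sent backwards

# Five assumptions per step; a 30-step process needs 30 x 5 of them.
assumptions_per_step = len(fields(StepAssumptions))
total_for_30_steps = 30 * assumptions_per_step
print(total_for_30_steps)  # 150
```

Even this sketch understates the work: each probability above is itself an estimate that must come from somewhere.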
If that is not enough, the business analyst has to make some assumptions about the overall process. For example:
- What is the rate at which new cases or incidents are being started? It is unlikely to be a constant rate. What is the distribution of the incoming rate? And what is the best way to statistically describe this distribution?
- What state of the process do we want to optimize? Most real-world processes have working gaps. People typically work during certain time periods, such as 9 to 5. When they start working, there is likely to be a backlog from cases coming in overnight. This is the “ramp-up” phase which occurs every day for most processes. Then there is the time in the middle of the day when workers have caught up and, hopefully, things are in a steady state. Optimization is totally different at the ramp-up phase as opposed to the steady state. Which state do you want to optimize?
- What is the goal of the optimization? To reduce cost? To increase throughput? To meet some service level within specific cost constraints?
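On the first question, a common modeling choice (an assumption I am making for this sketch, not something a BPMS decides for you) is to treat arrivals as a Poisson process, so the gaps between new cases follow an exponential distribution:

```python
import random

random.seed(42)  # make the illustration reproducible

cases_per_hour = 12.0  # the analyst's assumed average arrival rate
# Exponentially distributed inter-arrival gaps (Poisson arrivals):
gaps_hours = [random.expovariate(cases_per_hour) for _ in range(10_000)]

mean_gap_minutes = 60.0 * sum(gaps_hours) / len(gaps_hours)
print(round(mean_gap_minutes, 2))  # hovers near 60/12 = 5 minutes
```

Whether Poisson arrivals are even the right choice for a given process is exactly the kind of statistical judgment the analyst has to make.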
The results of the simulation will depend entirely on the quality of these assumptions. As the saying goes, “Garbage in, garbage out”. If the assumptions are erroneous, then, while one might feel good that one has optimized the process, in reality the process is sub-optimal. Making all these assumptions with reasonable accuracy is not easy, and not something that ordinary “business folks” do. It is a lot of work and requires a deep understanding of the process. And yes, some expertise in math and statistics is required.
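How sensitive are the results? A textbook single-server (M/M/1) queue, a toy model of my own choosing rather than any vendor's engine, shows that a modest error in one assumption can double the predicted time per case:

```python
# M/M/1 queue: average time a case spends in the system is W = 1 / (mu - lam),
# where lam is the arrival rate and mu the service rate (cases per hour).
lam = 10.0          # cases arriving per hour
mu_assumed = 12.0   # the analyst's assumed processing rate
mu_actual = 11.0    # the real rate: workers are only ~8% slower than assumed

w_assumed = 1.0 / (mu_assumed - lam)  # predicted: 0.5 hours per case
w_actual = 1.0 / (mu_actual - lam)    # reality: 1.0 hour per case, double
print(w_assumed, w_actual)
```

An 8% error in one of 150 assumptions produces a 100% error in the answer; that is what "garbage in, garbage out" means in practice.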
There is another important reason why simulation and optimization is not easy and cannot be done easily by “business folks.” We know that many business processes cut across departments and each department has its own goals, requirements and cost structure. In fact the cross-departmental nature of BPM is one of its major benefits. So if you have a process that runs across departments, then it cannot be simulated and optimized by the business manager of just one department. The manager will have to collaborate with other departments, and his or her vision of the optimized process may be very different from that of the heads of the other departments. Alternatively, the manager of all the departments may engage in the simulation and optimization, but he or she will need a lot of complex input from all the departments. In all the companies that I know, this simply does not happen. Business folks delegate simulation and optimization to business analysts.
Finally, optimizing a process from a true business perspective is not simply about a lot of number crunching and discovering the optimal deployment of resources. This is a small but important part. The real requirement of optimization is sound business judgment and making choices based on what is important for the success of the business. In many cases, this requires value judgments, and software is blind to value judgments. For example, if the conclusion of the optimization exercise is that resources should be removed from particular tasks in order to reduce cost, someone has to make a value judgment about what to do with the redundant resources. Or if the response time to customers gets worse because of cost optimization, what is the impact of that outcome on customer satisfaction and sales? Maybe it is significant and maybe it is not. The BPMS will not tell you that! Therefore, simply performing a simulation and optimization exercise with a BPMS will not produce the best outcome for a company. Simulation and optimization is at best one input to the business optimization effort.
Now let’s talk about round-trip optimization. Some will claim that I am wrong about the number of assumptions that have to be made in order to run a proper simulation. They will argue that modern day BPM systems are equipped with the capability to automatically measure these metrics from actual process data and quickly make them available for the simulation/optimization exercise. Not only that, these metrics are not assumptions; they are actual, measured results, which is even better.
I have a very simple answer as to why this is wishful thinking, devoid of reality. The most important parameter for simulation and optimization is Task Time, which is the actual time required to perform a task. One has to know the actual Task Time in order to run any kind of simulation. However, there is no BPM software that I know of that measures Task Time, because it is simply not possible to do it. Consider the challenge. Most user steps in a process use some type of electronic form. If the software has to measure how much time the user took to complete the task, what does it measure? It cannot simply measure the time the form was open, because the user could have the form open while having a conversation with a colleague. Or the user could open the form, understand the task, close the form, and think about or research the task with the form closed. There is no accurate way to measure Task Time. Therefore BPM systems simply do not measure Task Time because they cannot!
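A sketch with invented timestamps (purely hypothetical numbers) shows why the naive measurement fails:

```python
from datetime import datetime, timedelta

# All a BPMS can observe is when the form was opened and closed:
opened = datetime(2009, 3, 2, 9, 0)
closed = datetime(2009, 3, 2, 9, 45)
form_open_time = closed - opened  # 45 minutes: the naive "Task Time"

# What no software can observe: the user chatted with a colleague for 30 of
# those minutes, then researched the case for 20 minutes with the form closed.
idle_while_open = timedelta(minutes=30)
work_while_closed = timedelta(minutes=20)
true_task_time = form_open_time - idle_while_open + work_while_closed

print(form_open_time, true_task_time)  # 45 minutes measured vs. 35 actual
```

The audit log is complete and accurate, yet the number it yields is wrong, because the work itself happens outside the software's view.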
Most BPM systems that tout round-tripping capabilities measure one or two parameters and feed them back into the simulation. They have accomplished “round-tripping”, but it is far from complete. And they have not measured Task Time.
My bottom line on all of this is as follows:
- Simulation and optimization can be very useful if done properly
- It is not easy, especially if your processes are complex, which they generally tend to be
- You need an experienced business analyst to do it. It cannot be done by ordinary “business folks”
- The business analyst will have to closely coordinate with business folks for their judgment and priorities
- Buyers beware of the hype about round-trip optimization.
In the comments of a subsequent blog Jim Sinur claims that 25% of clients are using simulation and optimization, up from 5% a few years ago. I think that number is way too high. The true number is probably still less than 5%. I dare say that the true number of customers using round-tripping is closer to zero. I have not come across any in my many years in the industry. However, if any of you knows of such a customer, I would like to hear about them and be proven wrong.
To illustrate how overly enthusiastic some analysts are about simulation, I quote again from Jim Sinur’s subsequent blog, where he extols the benefits of simulation and writes:
“For instance, Arizona was forced to cut 1.2 billion dollars out of its state budget. These kinds of cuts are also mandated in the private sector, but are hidden from public view. Quite often the cuts are made arbitrarily based on large numbers and what seem to be discretionary spends. Some of these cuts have downstream effects that are never considered or underestimated. Simulation can be used to try different options of cost cutting and resource deployment to minimize long term damage thus encouraging intelligent cuts without randomness.”
This is indeed a noble thought. But I have to ask Jim, how does one go about making a socio-economic-political model of the State of Arizona and its budgets, priorities, goals and commitments to the citizens? Surely, without such a model, one cannot do simulations. What will be the cost of making such a model, and how long will it take to develop and make sure it is accurate? And which software in the world makes such models of complex entities like an entire state with millions of people? I do not know, but I can tell you that even making a model of a small company is no simple task.
Note: This blog is adapted from my recent column in BP Trends (www.bptrends.com )
Since the global financial crisis started dominating the news in the fall of 2008, I have read several press releases, blogs and statements from BPM vendors claiming that the BPM industry is counter-cyclical. The claim is that when times are tough and companies are forced to streamline or cut back their operations, the demand for BPM increases (or at least does not fall as dramatically as the rest of the market) because companies see BPM as a way to help reduce cost and become more efficient.
On the surface this optimistic argument appears to make sense. Tough economic times demand that companies cut cost while continuing to deliver adequate amounts of goods or services. BPM provides the means for companies to optimize their processes, which can help achieve this goal. Beyond optimization, BPM also provides the means to automate processes, which can greatly reduce cost while at the same time making the processes more efficient. It can enable companies to do more with less, which is exactly the medicine companies need during tough economic times.
However, nothing is ever as simple as it appears on the surface. The economic crisis that the world faces is going to impact the BPM industry negatively, and BPM vendors are not immune to the severe pain that the rest of the market is going through. There are several reasons for my pessimistic view that the BPM industry, especially in its current state of development, is not counter-cyclical.
First, BPM is not simply a technology that can be purchased and deployed so that companies can suddenly start reaping its benefits. Instead, BPM is a discipline: a way of conducting business and a cultural mindset. Technology is only a part of BPM, albeit an important part. Success with BPM is not the quick short-term fix that companies seek in times of economic crisis. BPM requires cultural change, which takes time. Facing a crisis, management generally does not have the luxury of time needed for BPM initiatives to bear fruit. Indeed, in a period of economic crisis, when employees are concerned about job security and their personal well-being, BPM is more likely to be perceived as a threat than as an opportunity. This perception is not conducive to the kind of cultural change that BPM needs in order to thrive. Therefore, faced with the need for immediate action and reduced demand for goods and services, management is unlikely to invest in BPM, whose promise is long-term. BPM projects that are already well in progress are likely to be continued if their results are visible and significant. Mediocre projects are more likely to be canned, and new projects are less likely to be funded.
Second, BPM projects still rely heavily on professional services for the discovery, design, development, testing and deployment of processes. Because of this reliance on professional services, the deployment time for meaningful BPM projects ranges from two months to over a year, depending on the complexity of the processes and the amount of integration that is needed. This lag between the decision to invest in a BPM project and the point at which results can be ascertained is another reason why management, facing the dire need to reduce cost today in the face of dramatically reduced demand, is less inclined to invest in new BPM projects whose payoff is months away.
Third, most organizations have complex processes. They all look simple on the surface, but as one starts peeling the layers one finds the ever-present exceptions. The number of exceptions is generally proportional to the size of the organization. These exceptions are what make seemingly simple processes complex. And in many cases the processes are interlinked with each other. Simply automating one or two processes is unlikely to produce a major impact on the bottom line of a company. For tangible bottom-line impact a company has to automate many processes, which takes a lot of time. And management simply does not have the time or the patience when facing a dire financial crisis like the one we face today.
BPM cannot be rushed, for the three reasons that I have listed. The financial crisis that the world is experiencing today will force management to make decisions that have a quick, short-term impact. This does not bode well for increased investment in BPM and the prospects for the industry. Equally troublesome is the current state of the pure-play BPM vendors, who are the driving force for innovation in the industry. Most of the pure-play BPM vendors are relatively small companies that do not have the deep pockets to survive a major downturn. In almost all cases, these companies have raised a substantial amount of venture capital with plans to either go public or be acquired by larger software companies seeking a slice of the BPM market. Prior to the current financial crisis, Metastorm had already filed with the SEC, and Lombardi and Savvion had made noises in the past about going public. Most of these companies have already tried the M&A route and have been unable to find suitable acquirers willing to pay what they expect. Now, given the state of the capital markets, the prospects of a public offering are nil. The pure-play BPM vendors will face tough times in 2009, with reduced demand for the reasons stated above, no prospect of going public and reduced prospects of acquisition. Like many other companies, the BPM vendors will have to take drastic actions to reduce cost and survive. Innovation will suffer, and there will be further pressure for consolidation on unfavorable terms. The larger software vendors eyeing the BPM space, who have the financial wherewithal to ride out the crisis, will become even more dominant. While some may argue that consolidation is probably good for the industry, I have doubts about what it will do to innovation and agility in the industry.
Note: This blog is adapted from my recent column in BP Trends (www.bptrends.com )
The sub-prime-driven economic meltdown has become a global crisis that is on the minds of everyone, including those of us in the business software community. Some of us, enamored by the potential and logic of software, strongly believe that the way to prevent such crises in the future is through greater investment in software and automation that not only improves productivity but also provides a consistent application of rules. I recently heard the CEO of a major BPM vendor wishfully proclaim that if only the mortgage companies had had more and better automated mortgage processes, the world would have been saved from the sub-prime meltdown. At first blush this sounded reasonable, and a good rationale for more investment in BPM. Mortgage applications are excellent candidates for automation, as we all know.
Sit back and think about this, however. The sub-prime mortgage problem was created neither because the mortgage processes were inefficient nor because the rules were not consistently applied. The problem was that the rules were bad, and no BPM software that I know of has the capability to detect or correct bad rules. Bad rules, dictated by greed, produced a large number of bad mortgages. If there had been more automated and efficient mortgage processes, the result would have been even more bad mortgages and the problem would have been far worse! In fact, in this case, less efficient processes would have been better in retrospect. If you automate something bad, you simply get more of it, more efficiently!
This is not a slap at BPM. It is the classic problem of technology. Technology is a double-edged sword. It can do wonderful things if the rules that drive it are designed for a wonderful outcome. However, if the rules are screwed up or are trying to optimize the wrong thing, the consequence is wrong. In the case of the sub-prime mess, the rules were relaxed to encourage more and riskier mortgages, and that is what we ended up with. Some of you will argue: fine, but if the processes had been agile and the bad rules could have been changed early on, we could have avoided the crisis. The problem is that the sub-prime crisis was years in the making, much longer than the lifecycle of a typical three-to-four-week mortgage process. By the time the world woke up to the fact that the rules needed to be changed, the damage was done. The mortgage companies had sold millions of sub-prime mortgages to the banks, who had sold them to Wall Street, who in turn had sold them to the Chinese and Saudis!
My bottom line is that the world cannot depend solely on great technologies such as BPM and BI. Technology must be used wisely, with proper governance and good rules that optimize the outcome for all the stakeholders. The world we live in is very complex. Technologies such as BPM are relatively new. So let us use BPM and related technologies wisely on problems that we can grasp and comprehend before we use it to solve the more complex problems. And when we do solve the more complex problems, the world will have moved on to even more complex problems!
An increasing number of BPM vendors are starting to talk about offering BPM software-as-a-service (BPM SaaS). These include Appian, Lombardi, Savvion and Ultimus. Given the buzz around SaaS, this is understandable: these vendors are trying to position themselves for a growing opportunity, despite the fact that the SaaS model has serious challenges from a business perspective. So the question for users of BPM is: what does it really mean to have BPM SaaS?
Let us first make clear what BPM SaaS is not. First, the ability to run a hosted version of one aspect of BPM is not BPM SaaS. BPM is a combination of applications used by different stakeholders. Offering only one of these in a hosted, subscription-based model is not SaaS. For example, offering process modeling tools in a subscription-based hosted model is “process modeling software-as-a-service”; it is not “BPM software-as-a-service.” However this would be a very good first step towards BPM SaaS which can offer customers as well as vendors not only the benefits of SaaS but also the experience necessary to move towards full-fledged BPM SaaS. Second, creating customer-specific automated processes and then enabling the end-users of the customer to participate in the process using a browser/Internet is also not BPM SaaS. There is nothing new about this and customers and vendors have been doing this ever since the early days of the Internet.
In my judgment BPM SaaS has to have the following characteristics as a minimum. First, it must have the ability to model and modify executable processes in a hosted application. The ability to design executable processes, in contrast to simple flow diagrams, is quite challenging. Many vendors will start by offering pre-designed process templates and then allowing users to modify them in a hosted model. This is a good way to start, and over time an increasing number of parameters (rules, flows, user interfaces and integrations) can be exposed for modification by users. Second, BPM SaaS must have the ability to allow customers to integrate with their inside-the-firewall data and other applications. This is crucial because BPM deals with a company’s data and interacts with other applications. Without effective integration, only the most simplistic BPM processes are candidates for SaaS, and CxOs are reluctant to invest money or mindshare in simplistic processes. Integration is the Achilles’ heel of BPM SaaS, and solutions for it will evolve only gradually. Perhaps the best approach for BPM vendors is the emerging class of “application appliances” that leverage virtualization technology to deliver inside-the-firewall solutions on a SaaS basis. This has the potential to solve the integration problem; I will discuss it in another blog, as it is a topic of its own. Third, and easiest, is the ability for end-users to participate in business processes using a browser. This is easily accomplished by most vendors, and the growing use of AJAX and Rich Internet Application (RIA) technologies will make it even easier and richer for end-users. Fourth, BPM SaaS must provide a means for customers to monitor and administer their processes over the Internet. Again, with the emergence of AJAX and RIAs, this is not a challenging obstacle. And fifth, BPM SaaS must provide some web-based reporting, BI and BAM capabilities.
With these five capabilities, BPM SaaS can empower customers to design, integrate, deploy, use, administer, monitor and measure their business processes. It provides the full value-proposition of BPM in a SaaS model. Vendors will move towards this in small steps and the more agile ones, who adopt SaaS and new technologies such as RIAs and application appliances, will have the competitive and time-to-market advantage. The key challenges are modeling/design of executable processes and integration. I will use other blog posts to elaborate on likely approaches to tackle these challenges.