Archive for the ‘BPA’ Category

The Hype about Simulation and Optimization

Simulation and optimization are often touted as major benefits of BPM by vendors as well as industry analysts. The idea is that BPM systems (BPMS) with these capabilities enable business analysts to simulate the performance of the process through the computing power of software, identify bottlenecks, and optimize the process to best achieve the goals of the company. Even more appealing is a feature called “round-trip optimization”, which simply means that a BPMS can capture operational metrics from actual process incidents that have been completed, and then allow business analysts to use these real metrics, instead of merely good assumptions, for the simulation and optimization of the process. It is common for industrial engineers to do time-motion studies on the factory production line to improve the efficiency of work. Round-trip optimization, often simply called “round-tripping”, is the equivalent of BPMS doing time-motion studies automatically and making the metrics available for use in simulations. And then the simulation capability enables the analysts to optimize the process.

 

It is not surprising that analysts and some prospects are so hot on simulation, optimization and round-tripping. These are huge benefits that appear to be waiting at the fingertips of BPMS users. For example, in a recent blog, Jim Sinur, the leading BPM analyst at Gartner, writes as follows:

 

“Most people think that simulation is hard and is for those gifted with deep math skills. Today nothing could be farther from the truth. Simulation is nicely embedded in process modeling and BPM engines and pretty easy to use even for business folks. Another big fallacy is that you have to set up lots of test data to make simulation work. Most of the simulators, today, will generate appropriate instances of process based on arrival rates.”

 

In my humble opinion, and with due respect to Jim Sinur and Gartner, this is a big overstatement. Yes, a BPMS can do simulation. However, the results of the simulation will depend entirely on the assumptions the business analyst makes about a large number of parameters. For example, the business analyst has to make assumptions about the following for each step in the process:

 

  • Task Time: the time to actually do the task at the step
  • Number of resources: how many people are available to do the task, and whether they are dedicated or shared with other tasks
  • Cost: the cost of each resource
  • Activation probability: the chance that the step will be activated, in case it is conditional
  • Send-back probability: the chance that the user will send the case backwards, because in real life things often go backwards instead of always going forward as planned

 

These assumptions have to be made for all the steps in the process. So if there are 30 steps, there are at least 150 assumptions! That is no small task, and not one that “business folks” will take on lightly.
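To make the scale of this concrete, here is a minimal sketch, in Python with purely hypothetical names and numbers, of the per-step inputs a simulator would need. Five assumptions per step across 30 steps really does mean 150 separate inputs:

```python
from dataclasses import dataclass

@dataclass
class StepAssumptions:
    """The five per-step assumptions a simulator needs (hypothetical names)."""
    task_time_min: float   # time to actually do the task, in minutes
    num_resources: int     # people available (dedicated or shared with other tasks)
    cost_per_hour: float   # cost of each resource
    p_activation: float    # probability the step runs at all, if conditional
    p_sendback: float      # probability the user sends the case backwards

# A hypothetical 30-step process: the analyst must justify every field below.
steps = {
    f"step_{i}": StepAssumptions(task_time_min=15.0, num_resources=2,
                                 cost_per_hour=40.0, p_activation=1.0,
                                 p_sendback=0.1)
    for i in range(1, 31)
}

n_assumptions = sum(len(vars(s)) for s in steps.values())
print(n_assumptions)  # 30 steps x 5 fields = 150 separate inputs
```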

 

If that is not enough, the business analyst has to make some assumptions about the overall process. For example:

 

  • What is the rate at which new cases or incidents are being started? It is unlikely to be a constant rate. What is the distribution of the incoming rate? And what is the best way to statistically describe this distribution?

 

  • What state of the process do we want to optimize? Most real-world processes have working gaps. People typically work during certain time periods, such as 9 to 5. When they start working, there is likely to be a backlog from cases coming in overnight. This is the “ramp-up” phase which occurs every day for most processes. Then there is the time in the middle of the day when workers have caught up and, hopefully, things are in a steady state. Optimization is totally different at the ramp-up phase as opposed to the steady state. Which state do you want to optimize?

 

  • What is the goal of the optimization? To reduce cost? To increase throughput? To meet some service level within specific cost constraints?
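The ramp-up point above shows up even in a toy model. The sketch below is a deliberately simplified single-queue simulation with entirely hypothetical numbers: work starts at 9:00 with an overnight backlog, and the queue an analyst would want to "optimize" looks completely different in the first hour than at midday:

```python
import random

random.seed(7)

WORKERS = 2          # hypothetical: two people on this task
SERVICE_MIN = 10.0   # assumed average task time, in minutes
P_ARRIVAL = 0.1      # assumed chance of a new case each minute (~6 per hour)
BACKLOG = 20         # cases that came in overnight, waiting at 9:00

def queue_by_hour(minutes=480):
    """Minute-by-minute toy model of one 9-to-5 day with an overnight backlog."""
    queue = BACKLOG
    busy = []                        # remaining minutes of each in-progress task
    samples = []
    for minute in range(minutes):
        if random.random() < P_ARRIVAL:           # a new case arrives
            queue += 1
        busy = [m - 1 for m in busy if m > 1]     # work progresses; finished tasks drop out
        while len(busy) < WORKERS and queue > 0:  # free workers pick up waiting cases
            queue -= 1
            busy.append(random.expovariate(1 / SERVICE_MIN))
        if minute % 60 == 0:
            samples.append(queue)                 # queue length at the top of each hour
    return samples

hourly = queue_by_hour()
print(hourly)  # long queue during ramp-up, much shorter once the backlog is cleared
```

Optimizing staffing for the 9:00 backlog and optimizing it for the midday steady state give very different answers, which is exactly the question the analyst must settle before simulating.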

 

The results of the simulation will depend entirely on the quality of these assumptions. As the saying goes, “Garbage in, garbage out”. If the assumptions are erroneous, then, while one might feel good that one has optimized the process, in reality the process is sub-optimal. Making all these assumptions with reasonable accuracy is not easy, and not something that ordinary “business folks” do. It is a lot of work and requires a deep understanding of the process. And yes, some expertise in math and statistics is required.
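To see how sensitive the output is to these assumptions, consider the standard M/M/1 queueing formula (textbook queueing theory, not anything specific to any BPMS; the numbers are hypothetical). A 10% error in the assumed task time more than doubles the predicted time a case spends in the system:

```python
def mean_time_in_system(arrivals_per_hour, task_time_min):
    """Textbook M/M/1 queue: average time a case spends waiting plus in work."""
    service_rate = 60.0 / task_time_min        # cases one worker finishes per hour
    rho = arrivals_per_hour / service_rate     # utilization
    if rho >= 1.0:
        return float("inf")                    # the queue grows without bound
    return task_time_min / (1.0 - rho)

# Two analysts, same process, task-time assumptions just 10% apart:
w_a = mean_time_in_system(5.0, 10.0)   # assumes 10-minute tasks -> about 60 minutes in system
w_b = mean_time_in_system(5.0, 11.0)   # assumes 11-minute tasks -> about 132 minutes in system
print(w_a, w_b)
```

Near full utilization, small errors in the inputs blow up in the outputs, so an "optimized" process built on slightly wrong assumptions can be badly sub-optimal in practice.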

 

There is another important reason why simulation and optimization are not easy and cannot be done by “business folks” alone. We know that many business processes cut across departments, and each department has its own goals, requirements and cost structure. In fact, the cross-departmental nature of BPM is one of its major benefits. So if you have a process that runs across departments, it cannot be simulated and optimized by the business manager of just one department. That manager will have to collaborate with other departments, and his or her vision of the optimized process may be very different from that of the heads of the other departments. Alternatively, the manager over all the departments may engage in the simulation and optimization, but he or she will need a lot of complex input from all the departments. In all the companies that I know, this simply does not happen. Business folks delegate simulation and optimization to business analysts.

 

Finally, optimizing a process from a true business perspective is not simply a matter of number crunching and discovering the optimal deployment of resources. That is a small but important part. The real requirement of optimization is sound business judgment: making choices based on what is important for the success of the business. In many cases this requires value judgments, and software is blind to value judgments. For example, if the conclusion of the optimization exercise is that resources should be removed from particular tasks in order to reduce cost, someone has to make a value judgment about what to do with the redundant resources. Or if response times to customers get longer because cost was optimized, what is the impact of that outcome on customer satisfaction and sales? Maybe it is significant and maybe it is not. The BPMS will not tell you that! Therefore, simply running a simulation and optimization exercise in a BPMS will not produce the best outcome for a company. Simulation and optimization is at best one input to the business optimization effort.

 

Now let’s talk about round-trip optimization. Some will claim that I am wrong about the number of assumptions that have to be made in order to run a proper simulation. They will argue that modern-day BPM systems are equipped with the capability to automatically measure these metrics from actual process data and quickly make them available for the simulation/optimization exercise. Not only that, these metrics are not assumptions; they are actual, measured results, which is even better.

 

I have a very simple answer as to why this is wishful thinking, devoid of reality. The most important parameter for simulation and optimization is Task Time, the actual time required to perform a task. One has to know the actual Task Time in order to run any kind of simulation. However, there is no BPM software that I know of that measures Task Time, because it is simply not possible to do so. Consider the challenge. Most user steps in a process use some type of electronic form. If the software has to measure how much time the user took to complete the task, what does it measure? It cannot simply measure the time the form was open, because the user could have the form open while having a conversation with a colleague. Or the user could open the form, understand the task, close the form, and then think about or research the task with the form closed. There is no accurate way to measure Task Time. Therefore BPM systems simply do not measure it, because they cannot!

 

Most BPM systems that tout round-tripping capabilities measure one or two parameters and feed them back into the simulation. They have accomplished “round-tripping”, but it is far from complete. And they have not measured Task Time.

 

My bottom line on all of this is as follows:

  • Simulation and optimization can be very useful if done properly
  • It is not easy, especially if your processes are complex, which they generally tend to be
  • You need an experienced business analyst to do it; it cannot be done by ordinary “business folks”
  • The business analyst will have to coordinate closely with business folks for their judgment and priorities
  • Buyers should beware of the hype about round-trip optimization

 

In the comments of a subsequent blog, Jim Sinur claims that 25% of clients are using simulation and optimization, up from 5% a few years ago. I think that number is way too high. The true number is probably still less than 5%. I dare say that the true number of customers using round-tripping is closer to zero. I have not come across any in my many years in the industry. However, if any of you knows of such a customer, I would like to hear about them and be proven wrong.

 

To illustrate how overly enthusiastic some analysts are about simulation, I quote again from Jim Sinur’s subsequent blog, where he extols the benefits of simulation and writes:

 

“For instance, Arizona was forced to cut 1.2 billion dollars out of its state budget. These kinds of cuts are also mandated in the private sector, but are hidden from public view. Quite often the cuts are made arbitrarily based on large numbers and what seem to be discretionary spends. Some of these cuts have downstream effects that are never considered or underestimated. Simulation can be used to try different options of cost cutting and resource deployment to minimize long term damage thus encouraging intelligent cuts without randomness.”

 

This is indeed a noble thought. But I have to ask Jim, how does one go about making a socio-economic-political model of the State of Arizona and its budgets, priorities, goals and commitments to the citizens? Surely, without such a model, one cannot do simulations. What will be the cost of making such a model, and how long will it take to develop and make sure it is accurate? And which software in the world makes such models of complex entities like an entire state with millions of people? I do not know, but I can tell you that even making a model of a small company is no simple task.

 

 

Note: This blog is adapted from my recent column in BP Trends (www.bptrends.com )


Business Process Management needs Business Process Analysts rather than a Chief Process Officer

About four years ago, when the importance of BPM and process management spread to a wider group of companies seeking process excellence, the role of the Chief Process Officer (CPO) emerged as an important senior position in organizations. I was among the many proponents of this new role. The rationale was simple. Processes are key to the performance of organizations. BPM enables companies to capture the best practices for processes and use the power of software to execute them consistently, while at the same time providing transparency, accountability and visibility. However, BPM technology is still relatively complex, and its adoption brings cultural changes that can have a major impact on its success or failure; this change needs to be marshaled from a high level in the organization. The position of the CPO, reporting to the CEO, was designed to accomplish this.

At first blush, the role of a CPO looks like a good idea. However, some further thinking, a look back at my own experiences, and recent discussions with BPM customers have made me rethink this approach. My thinking has evolved because of the following considerations:

i. If BPM is truly important to an organization, the top management of the company, including the CEO and functional managers, must take the lead and become champions and true believers in a process-focused organization. This important responsibility should not be delegated to a new role of a CPO.

 

ii. Functional managers must become owners of the processes in their departments. Their performance and the performance of their departments must be measured by the effectiveness of their processes. Functional managers have the most domain expertise in their areas and are in the best position to know the business process requirements that will lead to success.

 

iii. If a new role of a CPO is created, it will add another department to the organization. This will not only add cost; there is also a strong probability that friction will develop between the office of the CPO and other departments. While the CPO owns processes, or facilitates their development, the BPM system (BPMS) still relies on infrastructure owned by IT, and much of the technical expertise needed to make the BPMS successful also most likely resides in IT. Likewise, functional managers have the responsibility for processes in their areas and the domain expertise to make those processes effective. It does not make sense to make the office of the CPO responsible for processes in different functional areas, and yet have the management of those functional areas be measured and rewarded by processes they do not fully control.

For these reasons I believe that functional managers must own the processes in their departments and must be evaluated and rewarded based on the effectiveness of these processes, as measured by KPIs that the organization agrees upon. However, while BPM systems are becoming more powerful, their underlying technology is also becoming more complex. Functional managers need help from IT to cope with the technical complexity. IT, on the other hand, does not have the business knowledge necessary to understand the complexities of the business. So what the organization needs is a new breed of business analysts who also have strong process analysis skills. Let’s call them Business Process Analysts, or BPAs. The BPAs should have the following skills and attributes:

i. BPAs should belong to the IT organization and report to the CIO, but they should be assigned to work with specific functional managers to automate business processes that are vital to those managers’ departments.

 

ii. BPAs must become the bridge between business (functional managers) and IT. They should know enough about business to understand and empathize with the process requirements of the functional areas they are assigned to. They must also know enough about IT to understand the benefits and drawbacks of various technological choices. Empowered with the knowledge of business as well as IT, the BPAs become facilitators of the business processes owned by the functional managers. They should be rewarded for the success and effectiveness of the processes they facilitate.

 

iii. The BPAs serve as the bridge between the functional process owners and IT. As the bridge, they must diplomatically champion each side and broker compromise when the two conflict.

 

iv. BPAs must be trained not only in the modeling of business processes, which is really the activity of documenting business requirements, but also in working with functional managers to optimize resource allocation, and in the more complex art of process optimization using modern BPM tools.

 

The Business Process Analyst will become a vital role in an organization and in the success of BPM. An educational background that combines business and IT skills will provide an excellent foundation for Business Process Analysts to be successful.