The Hype about Simulation and Optimization

Simulation and optimization are often touted as major benefits of BPM by vendors as well as industry analysts. The idea is that BPM systems (BPMS) with these capabilities enable business analysts to simulate the performance of the process through the computing power of software, identify bottlenecks, and optimize the process to best achieve the goals of the company. Even more appealing is a feature called “round-trip optimization”, which simply means that a BPMS can capture operational metrics from actual process incidents that have been completed, and then allow business analysts to use these real metrics, instead of merely good assumptions, for the simulation and optimization of the process. It is common for industrial engineers to do time-motion studies on the factory production line to improve the efficiency of work. Round-trip optimization, often simply called “round-tripping”, is the equivalent of BPMS doing time-motion studies automatically and making the metrics available for use in simulations. And then the simulation capability enables the analysts to optimize the process.


It is not surprising that analysts and some prospects are so hot on simulation, optimization and round-tripping. These are huge benefits that appear to be waiting at the fingertips of BPMS users. For example, in a recent blog, Jim Sinur, the leading BPM analyst at Gartner, writes as follows:


“Most people think that simulation is hard and is for those gifted with deep math skills. Today nothing could be farther from the truth. Simulation is nicely embedded in process modeling and BPM engines and pretty easy to use even for business folks. Another big fallacy is that you have to set up lots of test data to make simulation work. Most of the simulators, today, will generate appropriate instances of process based on arrival rates.”


In my humble opinion, and with due respect to Jim Sinur and Gartner, this is a big overstatement. Yes, a BPMS can do simulation. However, the results of the simulation will depend entirely on the assumptions the business analyst makes about a large number of parameters. For example, the business analyst has to make assumptions about the following for each step in the process:


  • Task Time: the time actually required to perform the task at that step
  • Number of resources: how many people are available to do the task, and whether they are dedicated or shared with other tasks
  • The cost of each resource
  • The probability that the step will be activated, if it is conditional
  • The probability that the user will send the case backwards, because in real life cases often move backwards instead of always going forward as planned


These assumptions have to be made for all the steps in the process. So if there are 30 steps, there are at least 150 assumptions! That is no small task, and not one that “business folks” will engage in lightly.
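To make the scale of this concrete, here is a minimal Python sketch of the per-step assumption table a simulation model needs. All names and value ranges are illustrative assumptions for the sake of the example, not taken from any vendor's product:

```python
import random
from dataclasses import dataclass

# Each step needs at least five assumed parameters. The field names are
# hypothetical; they just mirror the five assumptions listed above.
@dataclass
class StepAssumptions:
    task_time_minutes: float       # assumed time to actually do the task
    num_resources: int             # people available to do the task
    cost_per_hour: float           # cost of each resource
    activation_probability: float  # chance a conditional step runs
    rework_probability: float      # chance the case is sent backwards

# A 30-step process: 30 steps x 5 parameters = 150 assumptions to justify.
steps = [
    StepAssumptions(
        task_time_minutes=random.uniform(5, 60),
        num_resources=random.randint(1, 5),
        cost_per_hour=random.uniform(20, 80),
        activation_probability=random.uniform(0.5, 1.0),
        rework_probability=random.uniform(0.0, 0.3),
    )
    for _ in range(30)
]

num_assumptions = len(steps) * 5
print(num_assumptions)  # 150
```

Every one of those 150 values has to be defended before anyone should trust the simulation built on top of them.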


If that is not enough, the business analyst has to make some assumptions about the overall process. For example:


  • What is the rate at which new cases or incidents are being started? It is unlikely to be a constant rate. What is the distribution of the incoming rate? And what is the best way to statistically describe this distribution?


  • What state of the process do we want to optimize? Most real-world processes have working gaps. People typically work during certain time periods, such as 9 to 5. When they start working, there is likely to be a backlog from cases coming in overnight. This is the “ramp-up” phase which occurs every day for most processes. Then there is the time in the middle of the day when workers have caught up and, hopefully, things are in a steady state. Optimization is totally different at the ramp-up phase as opposed to the steady state. Which state do you want to optimize?


  • What is the goal of the optimization? To reduce cost? To increase throughput? To meet some service level within specific cost constraints?
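The ramp-up versus steady-state point can be illustrated with a toy single-queue simulation. All the numbers here (overnight backlog, arrival rate, service rate) are made-up assumptions purely for illustration:

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

OVERNIGHT_BACKLOG = 40   # assumed cases waiting when work starts at 9:00
SERVICE_PER_HOUR = 12    # assumed cases the team can clear per hour

backlog = OVERNIGHT_BACKLOG
backlog_by_hour = []
for hour in range(8):  # a 9-to-5 working day
    # Noisy arrivals with a mean of 10 per hour (sum of 20 coin flips).
    arrivals = sum(1 for _ in range(20) if random.random() < 0.5)
    served = min(backlog + arrivals, SERVICE_PER_HOUR)
    backlog = backlog + arrivals - served
    backlog_by_hour.append(backlog)

# Early hours are dominated by draining the overnight backlog (ramp-up);
# later hours by the arrival/service balance (steady state). A staffing
# level optimized for one regime says little about the other.
print(backlog_by_hour)
```

Even in this trivial model, the "right" number of resources depends on whether you are optimizing the 9:00 ramp-up or the mid-day steady state, which is exactly the question the analyst must answer before simulating.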


The results of the simulation will depend entirely on the quality of these assumptions. As the saying goes, “Garbage in, garbage out”. If the assumptions are erroneous, then, while one might feel good that one has optimized the process, in reality the process is sub-optimal. Making all these assumptions with reasonable accuracy is not easy, and not something that ordinary “business folks” do. It is a lot of work and requires a deep understanding of the process. And yes, some expertise in math and statistics is required.


There is another important reason why simulation and optimization cannot easily be done by “business folks.” We know that many business processes cut across departments, and each department has its own goals, requirements and cost structure. In fact, the cross-departmental nature of BPM is one of its major benefits. So if a process runs across departments, it cannot be simulated and optimized by the business manager of just one department. That manager will have to collaborate with other departments, and his or her vision of the optimized process may be very different from that of the heads of the other departments. Alternatively, the manager over all the departments may engage in the simulation and optimization, but he or she will need a lot of complex input from all the departments. In all the companies that I know, this simply does not happen. Business folks delegate simulation and optimization to business analysts.


Finally, optimizing a process from a true business perspective is not simply about number crunching and discovering the optimal deployment of resources. That is a small but important part. The real requirement of optimization is sound business judgment: making choices based on what is important for the success of the business. In many cases this requires value judgments, and software is blind to value judgments. For example, if the conclusion of the optimization exercise is that resources should be removed from particular tasks in order to reduce cost, someone has to make a value judgment about what to do with the redundant resources. Or if response times to customers increase because cost has been optimized, what is the impact of that outcome on customer satisfaction and sales? Maybe it is significant and maybe it is not. The BPMS will not tell you that! Therefore, simply doing a simulation and optimization exercise with a BPMS will not produce the best outcome for companies. Simulation and optimization is at best one input to the business optimization effort.


Now let’s talk about round-trip optimization. Some will claim that I am wrong about the number of assumptions that have to be made in order to run a proper simulation. They will argue that modern-day BPM systems are equipped with the capability to automatically measure these metrics from actual process data and quickly make them available for the simulation/optimization exercise. Not only that, these metrics are not assumptions; they are actual, measured results, which is even better.


I have a very simple answer as to why this is wishful thinking rather than reality. The most important parameter for simulation and optimization is Task Time, the actual time required to perform a task. One has to know the actual Task Time in order to run any kind of simulation. However, there is no BPM software that I know of that measures Task Time, because it is simply not possible to do so. Consider the challenge. Most user steps in a process use some type of electronic form. If the software has to measure how much time the user took to complete the task, what does it measure? It cannot simply measure the time the form was open, because the user could have the form open while having a conversation with a colleague. Or the user could open the form, understand the task, close the form, and then think about or research the task with the form closed. There is no accurate way to measure Task Time. Therefore BPM systems simply do not measure Task Time, because they cannot!
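A tiny hypothetical event log makes the point. Even if a BPMS records every form open and close (the timestamps and scenario below are invented for illustration), the total form-open time says almost nothing about the actual working time:

```python
from datetime import datetime

# Hypothetical event log: the user opens the form, spends most of the
# time chatting with a colleague, closes it to research the task, then
# reopens the form and does 10 minutes of actual data entry.
events = [
    ("open",  datetime(2009, 3, 3, 9, 0)),
    ("close", datetime(2009, 3, 3, 9, 45)),   # 45 min open, mostly a chat
    ("open",  datetime(2009, 3, 3, 11, 0)),
    ("close", datetime(2009, 3, 3, 11, 10)),  # 10 min of real data entry
]

form_open_minutes = 0.0
opened_at = None
for kind, ts in events:
    if kind == "open":
        opened_at = ts
    elif kind == "close" and opened_at is not None:
        form_open_minutes += (ts - opened_at).total_seconds() / 60
        opened_at = None

print(form_open_minutes)  # 55.0 -- but the true Task Time is unknowable
```

The system dutifully reports 55 minutes, yet the real Task Time might be 10 minutes or several hours of off-screen research; no instrumentation of the form can tell the difference.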


Most BPM systems that tout round-tripping capabilities measure one or two parameters and feed them back into the simulation. They have accomplished “round-tripping”, but it is far from complete. And they have not measured Task Time.


My bottom line on all of this is as follows:

  • Simulation and optimization can be very useful if done properly.
  • It is not easy, especially if your processes are complex, which they generally tend to be.
  • You need an experienced business analyst to do it; it cannot be done by ordinary “business folks”.
  • The business analyst will have to coordinate closely with business folks for their judgment and priorities.
  • Buyers should beware of the hype about round-trip optimization.


In the comments of a subsequent blog, Jim Sinur claims that 25% of clients are using simulation and optimization, up from 5% a few years ago. I think that number is way too high. The true number is probably still less than 5%. I dare say that the true number of customers using round-tripping is closer to zero. I have not come across any in my many years in the industry. However, if any of you knows of such a customer, I would like to hear about them and be proven wrong.


To illustrate how overly enthusiastic some analysts are about simulation, I quote again from Jim Sinur’s subsequent blog, where he extols the benefits of simulation and writes:


For instance, Arizona was forced to cut 1.2 billion dollars out of its state budget. These kinds of cuts are also mandated in the private sector, but are hidden from public view. Quite often the cuts are made arbitrarily based on large numbers and what seem to be discretionary spends. Some of these cuts have downstream effects that are never considered or underestimated. Simulation can be used to try different options of cost cutting and resource deployment to minimize long term damage thus encouraging intelligent cuts without randomness.


This is indeed a noble thought. But I have to ask Jim, how does one go about making a socio-economic-political model of the State of Arizona and its budgets, priorities, goals and commitments to the citizens? Surely, without such a model, one cannot do simulations. What will be the cost of making such a model, and how long will it take to develop and make sure it is accurate? And which software in the world makes such models of complex entities like an entire state with millions of people? I do not know, but I can tell you that even making a model of a small company is no simple task.



Note: This blog is adapted from my recent column in BP Trends.










16 comments so far

  1. Jim Sinur on

    I think you might be surprised at some of the case studies that are available through Simudyne and IBM. They think the simulation market is over a billion in size, and the case studies they have are quite impressive. If you discount the fact that they are in the vendor community, there is still something worth noting going on here.

    I think there is a shift going on in the application of optimization and simulation that you and I are only seeing a bit of at the moment. We shall see going forward 🙂

    • Rashid Khan on

      Thanks Jim for your comments. As you know I was talking specifically about BPM vendors and the use of simulation and optimization relating specifically to BPM. I do not know about Simudyne; Gartner does not include them in the BPM MQ. About IBM, I would be happy to see their case studies and see how they address the issues/problems that I specifically mention in my blog. Furthermore, I was mostly concerned about the hype associated with simulation/optimization with regards to the leading BPM vendors that you include in the BPM MQ.

  2. […] The Hype about Simulation and Optimization « Leadership BPM Rashid Khan on how simulation is being oversold by vendors and analysts. I agree with him on this for the most part; his discussion of how BPMS do not accurately measure task time is identical to a conversation that I was having with a client yesterday about how their new BPMS is not going to be a personnel time-tracking system. (tags: bpm simulation) Posted by Sandy Kemsley on Tuesday, March 3, 2009. […]

  3. John Hoogland on

    I would like to add a little more to the discussion, because I like this critical approach.
    First of all, there are more reasons why simulation is hard to implement.
    1. Round-trip optimization in the current BPMS tools is always restricted to one process type. You lose the effect that one process type has on others. Resources often operate in multiple process types.
    2. The process model in execution these days is more agile than in the past. In modern BPMS tools, exceptions can be supported that are not explicitly modelled. As a consequence, you cannot use the execution model as the basis for the simulation model.
    3. Most tools are not able to detect the stochastic functions that adequately define the statistical variance in process time, arrival patterns and throughput time.

    Many of these limitations, by the way, can be overcome by process mining, a technology developed by Prof. van der Aalst of the University of Eindhoven. Process mining detects both the actually executed process models (regardless of the released process design) and the statistical data and functions as well. It makes simulation a lot more realistic.

  4. Scott Francis on

    @Hoogland: w.r.t. #1: Lombardi purports to do this roundtripping across processes (you can do an organizational heat map showing how much particular people in the organization are working on processes that are modeled in Lombardi’s software). However, even so you only have data about the processes that are modeled in a BPMS! The water-cooler process is not yet modeled 🙂
    w.r.t. #2: Agree with the premise, but not sure the conclusion follows; at least in my experience the exceptions that are handled in a way that is not modeled are rare and wouldn’t be statistically significant…
    w.r.t. #3: I couldn’t agree more. Not only are the supplied functions inadequate, the vendors don’t provide hooks for third parties to develop their own stochastic functions.

    re: process mining – is there a product on the market that does this or is it mainly research and academic study at this point? (I’ve read some of the background on it, but I haven’t seen a product I can install yet)

  5. Scott Francis on

    Mr. Khan – very interesting post, appreciate your sharing these thoughts and also not taking the politically correct viewpoint (the PC viewpoint in our market, I think, is to say that simulation is wonderful 😉

    I would point out though that I don’t think task time is more important than process throughput, quality, or process execution time. I think we’re past the old industrial engineering / operations management approach of improving a single worker’s efficiency – that tends to squeeze the balloon from one worker or team to another. A great example of this is how companies handle expense accounting. At some point, the finance group will decide that employees need to tape each receipt to a piece of paper just so, paper clip them together, fill out a form, print a bunch of stuff out, and get some signatures. All of this, so that all accounting has to do is press a button rather than actually examine the expenses to approve them or note discrepancies. However, we’ve now moved a relatively inexpensive task (the folks who do data entry for accounting) out to the people who sell product, deploy product, and service customers (the rest of the company), and thereby we’ve actually increased our total cost, while reducing our measurable hard cost. We have to be careful to avoid just squeezing the balloon in the one place, and look at the goals of the process for purposes of simulation, optimization, and round-tripping…

    (i have some more thoughts, and will put those in a blog entry)

  6. john hoogland on

    Scott, there is a commercial product available, even in a SaaS model where you can pay per analysis. Please contact me for details; I don’t want to use this platform for commercial purposes. Mail me at

  7. […] Khan, formerly of Ultimus, recently posted complaining about “The Hype about Simulation and Optimization“.  He has a clear vendor-perspective on this subject – he understands why prospects and […]

  8. Alfonso Marbot on

    Dear Mr Khan,
    I agree with most of your conclusions but in some areas I see other premises than listed causing the problem. Staying in the context of BPMS I don’t know any BMPS vendor understanding simulation properly and implementing it correctly. Unfortunately I have a tendency to believe that this is also true for the analyst world and even worse the eco system “vendor analyst” is in a race deploying and justifying a set of useless features and functions.
    In your blog you mention “garbage in … garbage out” and you argue why in most of the cases the input of these simulation models is garbage. I think we can get reasonable good (OK task time is always a problem) data out of existing systems but the real issue is complexity. Everybody with serious mathematical background knows in the world of simulation the quality of output is questionable if the number of input parameters is big. From my experience and trying to be specific I question every outcome of any BPMS vendor simulation model if the process has more than 5 instances (even with good data).
    It appears to me that many analysts and vendors product managers are like big kids. It is fun playing with these simulation gadgets and being a little bit sarcastic the BPM vendor might have realized that it is easier to sell toys than good BPM functions.
    At the end the customer has the final call. He has to pay for the feature in the BPM product and for the learning or damage caused by deployments using improper simulation assumptions.
    Having said all above I’m a great fan of simulation if applied properly.
    Thanks for your thoughts …..Alfonso

  9. […] « Leadership BPM March 10, 2009 Just read Sandy Kemsley’s comments about Rashid Khan’s blog about simulation & optimization […]

  10. Neeli Basanth on

    Dear Khan,
    I would like to add another dimension to simulation: how will the results of simulation be used? The result could be the identification of a resource bottleneck or a change in the process. The question is: can these two decisions be taken at the level of an individual process? If so, how does the resource allocation affect the other processes in the enterprise? Certainly the enterprise will not have just one process.

    For effective use of simulation, it should be possible to simulate a set (group) of processes, taking into consideration a balanced-scorecard type of approach.

  11. workflow on

    I too would like to add another aspect of the simulation model. For each step, if user X has N things in his or her task queue, in which order does he or she process them? Last in, first out? First in, first out? Or the scenario closest to real work, whereby they perform a very quick analysis of the tasks they have and apply some sort of heuristic to decide what they should be working on.

    Furthermore, each step could have a distribution (Chi, Poisson, Binomial, who knows) for how quickly it can be completed. It is very difficult for any set of logic to determine, given a data set, which of these distribution models would provide the best fit.

  12. […] Management, Optimization, Simulation Rashid N. Khan has a post on the use of simulation in BPM here.  Khan raises valid concerns on the usage and results of simulation. I tend to agree with his […]


  14. Gustavo Gomez on

    Great post; I couldn’t agree more with you on what it really takes to deliver useful simulation capabilities.
    There’s only one thing I’d like to add: the best simulation capabilities are in the human brain, not in a software program.
    What I mean is that in 99.99% of BPM projects, if you provide smart people with high-quality data (although imperfect, as your post correctly states), these people tend to make good decisions.
    Only rarely, and in very sophisticated/odd scenarios, would people rely on simulation to replace their data/business analysis capabilities. Use humans for what they are good at.
    Leave the rest to computers.

  15. […] Currently, simulation requires a skilled simulation guru to be able to set it up and analyze the simulation module correctly. Most customers don’t use simulation or don’t use it correctly (see The Hype about Simulation and Optimization) […]
