BPM

Can your processes be too good?

An interesting point from this Keith Swenson blog, which concludes, "Too narrow a focus on efficiency will make an organization fragile."  What do you think?

19 Replies

  • I can't even agree with the first sentence of the article: 'The goal of process management is to improve process'.

    And Process Improvement is about Managing your processes?

  • Like I wrote in a similar article about standardization:

    http://bpmredux.wordpress.com/2013/04/15/what-goes-around-comes-around-part-2-is-standardization-still-a-valid-strategy/

    Creating standard processes also removes the ability to innovate because the goal for most is to seek and destroy variance in favour of the majority. And when an organization reaches the plateau of process improvement and has created volumes of standard processes which can be reused everywhere, what then?

    Processes aren't just driven for efficiency; more often than not they're driven to release people and capacity elsewhere. If this, coupled with efficiency, becomes the singular goal for creating processes then yes, it'll make the enterprise fragile and unable to respond to rapid change.

    You can have too much of a good thing and it's certainly applicable to process as well.

    There’s a tendency to rush into building 'perfect' and efficient processes without actually understanding everything as a whole; as a result, process improvement just becomes another silo in itself.

    We have a fear in this socially-connected world of anything that appears to be rigorous or structured. But there are times when structure makes sense, especially when it leads to truly measurable efficiency. Efficiency as an end goal is not a bad thing, since it can provide spare resources to focus on innovation. Being rigid and unwilling to change to meet new market conditions is one thing, and obviously bad; having an efficient, quality process to get necessary work done is quite another.

    For example, what benefit is there in assessing an insurance claim differently every time? Very little, especially if you still need the end result of all the boxes checked and internal controls satisfied. The corporate world is regulated, and structured processes allow regulations to be met consistently and efficiently.

    Apply strict processes in parts of the business that don't need them, though, and you are likely to see the company and its aspiring employees stifled. The independent parts will start to work together badly, and the organization will indeed become fragile. We have seen this time and again with old IT, where enterprise software cannot adapt and ends up dragging the company down with it.

    As with anything, a narrow focus on one process in an organization can mean you miss the bigger picture. It can also help to make a broken piece of the puzzle fit back into the organizational picture better than before.

  • This question, like Keith's post, covers a lot of ground -- so much so that I'm afraid some of the key points are at risk of being lost. So here are a few, hopefully clarifying, perspectives from where I sit (and remember, you asked!):

    1) it is possible to work to improve your processes to a point where you begin to experience diminishing returns -- in other words, you spend more to further the improvements than you save by making those improvements themselves.

    2) it is possible to focus so intently on the process that you forget why -- that is to say, for what business reason -- you have embarked on that path to begin with.

    3) it is possible to focus so intently on improving the process at hand that you forget to look at that process in the overall business context and miss opportunities to innovate on a grander scale.

    I'm not sure that these scenarios lead to processes being "too good," but they do at least speak to some of the key factors I regularly encounter when it comes to figuring out where to focus and how to determine when "good enough is good enough."

  • Indeed, only focusing on efficiency doesn't make sense.

    You might end up being very efficient at doing useless things. So efficiency is only useful when it comes after effectiveness (as in the dictionary).

    So every process initiative should start with 'what does this process promise?' And that makes clear that a process is just a means.

    A means to deliver a result (a service, a product, a solved problem). So it is important to know what the result and the attached goals are. And then you can decide how this process must be managed to deliver that promise.

    So, how can you do better if you don't know what is good? Start with communicating 'the good' so everybody can contribute.

    And if you can create more efficiency without becoming less good, I won't complain. But in the end I don't want to improve, I want to do things well.

    And I've seen a lot of projects where 'well' isn't made clear.

    And yes, then processes can become too good ... at useless things.

    Steve, those three points are very helpful. I would summarize them as (1) diminishing returns, (2) losing sight of the overall goal, and (3) missing the chance to expand to other, newer goals.

    The issue to me is that it is impossible to directly measure what you want: "potential for future business success and satisfied customers." We make do with proxies for it that are imperfect at best.

    I am reminded again of help desks that measure duration of incident as an indication of efficiency. If you slavishly try to reduce the time spent on each call, you end up with the support team closing calls prematurely if it looks like the caller is taking a long time, and opening a new incident later when that customer calls back. This effectively reduces the average duration of an incident, and increases the number of incidents, but it does not make the customer happier.

    That example is pretty transparent, and most support organizations avoid this problem today, but there are similar but less obvious problems happening at all levels of the organization. The problem is simply that it is impossible to measure directly what you want, like the future potential for business.
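
    A toy sketch of that help-desk proxy problem (illustrative numbers only, in Python; nothing here is measured from a real support team): splitting one long call into several short incidents lowers the average-duration metric and raises the incident count, while the customer's total time with support stays exactly the same.

    # Same underlying work, recorded two different ways.
    long_calls = [45, 50, 40]               # minutes per incident, each handled to completion
    split_calls = [20, 25, 25, 25, 20, 20]  # calls closed early, reopened when the customer calls back

    def average(durations):
        return sum(durations) / len(durations)

    for label, calls in [("handled to completion", long_calls),
                         ("closed early, reopened", split_calls)]:
        print(label, "->", len(calls), "incidents, avg", average(calls),
              "min, total customer time", sum(calls), "min")
    # The 'efficiency' metric (average duration) improves and incident volume rises,
    # but the customer spends just as long getting an answer.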

    Phil, good reply, but I seriously hope this is not viewed as fear of anything that appears to be rigorous or structured. Instead, it is fear of inappropriate structure, which is often justified. And structure does not equal rigor. I like your analogy to the puzzle, but imagine a situation where two puzzles get jumbled together: pieces might fit perfectly, but the picture is nonsense.

  • Keith's discussion of the dangers of a narrow focus on efficiency is worth reading. I'm reminded of the aphorism about "speeding up without being sure of the direction", which captures the issue of context. But what about "brittleness"?

    There are really two problems with a narrow focus on efficiency.

    First: the question of direction. Since the re-engineering revolutions of the '90s, efficiency gains are now only to be had in small increments. The big wins are about adjusting business models -- in other words, changes in direction.

    Second: What about "brittleness", which can be a characteristic of a system? Brittleness is the opposite of "robustness". A system is robust if its states and outputs vary proportionally with the violation of its assumptions. So a brittle system is one whose response to shocks is unexpectedly large -- likely the system collapses.
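
    A toy numerical sketch of that distinction (my own illustration in Python; the functions and numbers are invented, not from Keith's post): a robust system's output degrades roughly in proportion to how far its assumptions are violated, while a brittle one delivers full output until the shock exceeds its design assumptions, then collapses.

    # Illustrative only: 'shock' measures how far the system's design
    # assumptions are violated; the return value is what the system still delivers.
    def robust_output(shock):
        # degrades proportionally with the size of the shock
        return max(0.0, 1.0 - 0.1 * shock)

    def brittle_output(shock):
        # fully efficient within its assumptions, collapses once they are exceeded
        return 1.0 if shock <= 3 else 0.0

    for shock in range(7):
        print(shock, round(robust_output(shock), 2), brittle_output(shock))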

    Why should a narrow focus on efficiency produce brittleness? Because the construction of a non-brittle, i.e. robust, system requires sufficient modeling and research of non-standard conditions, and the construction of system artifacts sufficient to accommodate those conditions. This challenge is in part a budgeting problem. It's easy to build a straight-through workflow; it's more expensive in money and expertise to build a robust system.

    In summary, a narrow focus on process efficiency is risky because (1) one must ensure the process is evolving in the right direction and (2) one must spend the resources necessary to build a robust system. Failure on either count will ensure delivery of the wrong system: one headed in the wrong direction, or one characterized by brittleness.

  • The thing about process is that it isn't the solution to every problem. Of course, here in the BPM community, the domain of problems we focus on is exactly that set which is addressable via process. But within our own companies, I'd be willing to bet that process generally takes a back seat to innovation. Software development is a lot more about creativity than it is about process (with apologies to those of you who have spent too much of your life focused on development "methodologies").

    Of course, often enough the innovation is in the process itself. For example, few things are more procedure-bound than preparing an aircraft for departure. And yet, Southwest Airlines turns around planes between flights faster than anybody else: that's innovation wrapped in process.

    So in the end, it's not that your processes can be "too good", but rather, that your focus on process can steal attention from other important factors in running your business.

  • What this article speaks to is how bad management by metrics is for firms. With all the data provided by process automation and BI tools, the tendency for managers to misuse this data as a management crutch is pretty widespread. Management needs to have standardization, efficiency, and metrics available at all levels, but they also have to actually manage their people and responsibilities and not just treat 'throughput' or 'billable hours' or 'incidents closed' as their heavily weighted decision inputs. Processes can't be too good, but they can definitely be really bad. Management's implementation and use of processes is the differentiator.

  • My thoughts on Keith's blog were recently (last night) published on our blog:
    http://www.bp-3.com/blogs/2013/05/a-process-too-good-or-too-good-to-be-true/

    The key part of my thoughts:

    Eliminating truly wasteful effort is typically a good thing. But which things are waste, and which things are “cost of doing business”, and which things are “the secret sauce” – these are the insights that separate good from bad, and great from good.

    Were the processes too good? Not in my book. In my book, the efforts they took to cut costs optimized on the wrong metric. Maybe a better label would be – “Processes too good to be true.”

    Short answer: this isn't a failure of process, it is a failure of leadership/management - the humans, the knowledge workers (the C-level/management teams)

  • As always, the discussion about process is stuck at too low a level. No one asks: 'How do we know if a process is good?' Good processes affect a multi-dimensional space, both short-term and long-term. Plus there is a people dimension in skill and motivation. It is a matter of fact that BPM and BPMS mostly focus on the short-term cost aspects today. This is, I think, Keith's core message, and I can't count how often I have written about it in the last ten years.

    Process quality and cost are opposed and the ideal is to find the sweet spot where the customer leaves happy with the minimal effort necessary. That is obviously different for each customer and therefore standardization creates as much waste as a lack of process.

    Eliminating waste for efficiency should thus be a one-off decision that happens at process execution time, made by the service staff. When you make these decisions beforehand you create as many negative impacts as positive ones. You can't make a process more efficient and more effective at the same time upfront because of the multi-dimensional space. Efficiency, or rather operational targets (like cost and SLAs), is just one dimension, as are 'outside-in' or rather customer outcomes. But there are also process goals and handovers, rather than process flows.

    Clearly process standardization is an oversimplification of management principles and points to a lack of real management skill. The claim that there is a business benefit to rigorous process structure is totally unproven. It is purely justifiable by short-term cost reduction and nothing else. Quality improvement is usually measured by process variation, not by how satisfied customers are with the process. Standard processes dissatisfy the majority of customers because they can't be treated the way they expect. Internal standard processes can be forced upon employees, but because of the negative emotional impact this causes a disconnect that kills the initiative to improve processes.

    I have discussed Nassim Taleb's book Antifragile, and its relationship to BPM, in a three-post series on my blog, as another perspective on my previous (2010) posts on adaptive complex systems. http://isismjpucher.wordpress.com/2013/01/05/naive-intervention-part-1-from-antifragile-to-models-behaving-badly/

    I pointed out in 2009 that natural dynamics can't be controlled unless one exerts substantial effort. Rigidizing and standardizing processes creates operational stresses that eventually cause the system to fail, hence they can be seen as brittle or fragile. What it really means is that, before the processes actually fail, the business has to spend money and resources (governance) to keep them stable against the natural pressure of change. Rigid buildings will break in an earthquake, but flexible ones bend with the outside energy and absorb it.

    What has to be maintained by management is not the flow diagrams, but the definition of customer outcomes, process goals and handovers, and operational targets. That is four dimensions plus the added time dimension of necessary alignment with outside change. The force that keeps the process in place and makes it work is the skill and attitude of the process performer (who should be a knowledge worker). It is quite clear that a standard process structure between the strategy and the outcome makes a business very fragile. Cheaper and fewer staff create more fragility.

    Adaptive processes are therefore not about being chaotic, creative, unstructured or non-compliant, but about performers creating a process that fulfills all goals, follows all compliance rules and can change at the drop of a hat — and can be reused for a better outcome NEXT TIME. A process, or any part of it, might turn out to be quite rigid for one reason or another. Performers can be restricted from modifying it. Adaptive is more about how the process is created than how much people can change it at runtime.

    These are the definitions that make up an adaptive process:

    1) goals (business view)
    2) outcomes (customer view)
    3) skill or resources (capability view)
    4) work (task types, dependencies, checklists)
    5) data (forms and silo interface view)
    6) rules (for data and content, resources, compliance)
    7) content (inbound, outbound, social, email, rules)

    Orthodox BPM flow diagrams, BPMN, or even the CMMN case management notation describe at best 20% of that. BPM as a methodology has likewise failed to consider these.
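
    A minimal sketch in Python (the field names are just my shorthand, not taken from any BPMS product) of how those seven definitions might be captured as data owned by the performers, rather than as a flow diagram:

    from dataclasses import dataclass, field

    @dataclass
    class AdaptiveProcess:
        goals: list[str]                                    # 1) business view
        outcomes: list[str]                                 # 2) customer view
        skills: list[str]                                   # 3) capability view: skills or resources
        work: list[str]                                     # 4) task types, dependencies, checklists
        data: dict[str, str] = field(default_factory=dict)  # 5) forms and silo interfaces
        rules: list[str] = field(default_factory=list)      # 6) data/content, resources, compliance
        content: list[str] = field(default_factory=list)    # 7) inbound, outbound, social, email

    # Hypothetical instance: an insurance-claim process described by its goals and
    # outcomes, leaving the performers free to shape the flow at execution time.
    claim = AdaptiveProcess(
        goals=["settle the claim within the SLA"],
        outcomes=["customer feels fairly treated"],
        skills=["claims adjuster"],
        work=["assess damage", "check policy", "approve or escalate"],
    )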

  • Good processes are ones that can change as required...?

    Max, you're right that those 7 definitions are a must, and you're likely right that BPMN and CMMN are not up to the job. BUT there are graphical representation capabilities that reflect these requirements for such BPM thinking. The fact that most BPM-supporting technologies hyped by vendors can't deliver in no way diminishes the broad spectrum of the BPM discipline, which started long before the BPM tag... It is the vendors that need to "catch up"...?

