
Should processes be fixed before implementing a BPMS?

As Niel Nickolaisen writes at SearchCIO in "The best BPM implementations focus on keeping things simple": "My 'fix processes before implementing technology' attitude has influenced how I view such things as BPM systems." So do you think processes should be fixed before implementing a BPM system?

21 Replies

  • When you define BPMS as workflow, definitely yes.

    If you see a BPMS as a set of supporting tools, then no. Then a BPMS might contain a lot of nice things to help you understand your processes and find ways to improve them. And I think the solution will not be the BPMS of type 1 most of the time.

    Maybe it's about the "s" here. Does it stand for System or Suite?

  • There are some parallel opportunities here for a BPMS to help with the fix up front, BUT there's a faster way to get to the end game for immediate ROI:

    1. Using Desktop Analytics to capture the full workflow of user activity MUST be the first thing to start with. Why would you not want to know what your users are doing, 24x7?
    2. Use Desktop Automation to improve the actual users' processes at the coalface (the desktop). You see an immediate ROI and also have the connection to the BPM system from the real-time actions.

    Both of these become massive contributors to successful BPM because the heart of success is what the user does and User Process Improvement MUST COME FIRST!

  • Garbage in :: Garbage out

    Why implement a BPMS and populate it with broken processes and inefficiency?

    Sure, a BPM tool can help you identify where things are going wrong, but from a pragmatic point of view the best practice is to improve what you have within your own capabilities and then use the system to spot further opportunities.

    And if you haven't or can't improve what you have, and you know the processes are broken or inefficient, then you shouldn't be wasting your money on automation and a BPMS until your own BPM and process understanding matures.

    Keep it simple like Niel says....

  • Without reading the post, I would say yes.

    Otherwise you are "paving the cow path," which may lead to automated sub-optimization. A business process full of "non-value added" activities (think Six Sigma) still has waste. Automating it just hides the waste, which arguably makes it more difficult to repair later.

    This principle applies to all layers of an EA. I've been involved with efforts to port/re-write systems onto new technology. Unless requirements are met and designs are optimized, you will end up with the same architectural challenges.

    Said differently: pick the movie you hate the most. Whether you watch it on a 20 year old TV, a shiny new HDTV, or in theater with 3D, the movie still sucks.

  • I would start with the existing process - remove non value add activities from your process; have a computer perform an activity that a human doesn’t need to perform; and have a person do what a computer cannot. Perfection is not a realistic goal.

  • Right, this definitely demands that I put my Process TestLab hat on.

    Fixing processes comes in two parts. One deals with logical errors, in the sense of processes or parts of them not working, not delivering the required results or functionality. The other deals with the improvement of existing and basically usable processes.

    What we tend to see is that broken or unusable processes are fixed to the extent that they can be implemented. Curiously, this usually comes under the headline of business process improvement even though you're not really improving the business process but only making the BPMS deployable.

    The downside is that (technical) BPMS restrictions either lead to a change in the minimally repaired business processes (resulting in BPMS being blamed for performing below expectations) or in perpetuating the quick-fixes.

    Unfortunately [and contrary to the expectations the BPMS industry has created], we are still unable to conduct the process equivalent of open heart surgery. Once implemented, most processes have a smaller change and update rate than processes not tied to a system. This may in part be down to the BPMS but it's also down to the mindset and methodology employed.

    In practice, most still regard the 'process' from design to implementation as a one way street with a definite and defined ending: The release of the BPMS supported process.

    Were you to regard it as a life CYCLE and continuous task instead you could probably improve/fix a lot of process aspects after an initial implementation.

    But again, looking at real life what we see is that the technical aspects (need a new button, can you change the layout of the screen) are changed and tend to get confused with the business process requirements.

    The long and short of it all is that the garbage-in garbage-out argument may in some cases be valid and in others less so. This unpredictable state requires a thorough assessment of the risks and consequences involved in fixing problems before or after process implementation.

    Did I mention the Process TestLab? Oops. Visit taraneon.com for more, as this forum question is exactly what we deal with on a daily basis.


  • My answer is "no".

    There are basically two options: either we optimize the process first and then implement it in a BPMS, or we gain control over the process first with the help of the BPMS and then improve it as long and as frequently as we wish.

    From my experience the first approach doesn't work well because end-to-end process visibility is too poor. Besides, it tends to create the expectation that one heroic effort will do the job.

    We follow the second approach most of the time, meaning that process optimization is not considered a prerequisite. Yet we do optimize the process here and there during implementation, on the condition that all stakeholders agree and there is no significant risk on the technology or human side.

    Is it "paving the cow path"? I don't think so - the path we pave isn't made of concrete; it's a lightweight structure that can be rearranged with little effort.

    • Anatoly,
      just to put a philosophical touch to that: Few processes can actually be optimized in an absolute sense. The more dynamic the process environment, the more the need to continuously re-balance the processes.

      The only reason I mention this is that optimization is often regarded as something you achieve and will stay with you. Not so.


      • Thomas

        I meant exactly this - "improve it as long and as frequently as we wish".

        The same with "automation" - few processes can be automated once and forever so I hate the word "automation" applied to processes.

        Processes and/or our view of processes are volatile by nature - after all, this is why we differentiate them from algorithms. Remember Prof. Wirth's classic "Algorithms and Data Structures"? We have moved further since then, to "Processes, Algorithms and Data Structures".

        If It Isn't Agile It Ain't BPM.

  • This is a great question. If you don't start with what you have, it is tough to create a to-be picture from scratch and at the same time understand how to get there. If processes are not captured into a centralized database, there will be no way to validate them, assign ownership, and make the changes that 'fix' them.

    Looking back at how we treat data, it was easy to move from written ledgers to centralized accounting data, and from the rolodex to centralized CRM data. Before automation, process was drawn by hand, stored in written SOPs and static diagrams, and never made the switch to centralized data. That's a shame, because business process is, essentially, critical enterprise data that absolutely can and must be centralized and treated the same way as customer, employee, financial and other data.

    After data on existing processes is captured in a centralized, governed way, processes can be 'fixed'. It can be done as evolution to the right answers that is non-disruptive (and lower risk) and makes sense for all stakeholders.
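The idea above - process definitions treated as governed, centralized data rather than static diagrams - can be sketched in code. This is a minimal hypothetical illustration (all class and field names are invented for the example, not from any particular BPMS): a process stored as structured data can carry an owner and a version, and non-value-added steps can be queried and flagged for fixing.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a process definition stored as structured data,
# so it can be owned, versioned, and queried like any other enterprise data.
@dataclass
class Step:
    name: str
    performer: str           # role responsible for the step
    adds_value: bool = True  # flag non-value-added steps for later review

@dataclass
class ProcessDefinition:
    name: str
    owner: str               # assigned ownership, as the reply suggests
    version: int = 1
    steps: list = field(default_factory=list)

    def waste(self):
        """Return the non-value-added steps - the candidates to 'fix'."""
        return [s for s in self.steps if not s.adds_value]

onboarding = ProcessDefinition("customer-onboarding", owner="ops-team")
onboarding.steps += [
    Step("collect documents", "clerk"),
    Step("re-key data into ERP", "clerk", adds_value=False),
    Step("approve account", "manager"),
]
print([s.name for s in onboarding.waste()])  # → ['re-key data into ERP']
```

Once definitions live in a structure like this, "fixing" a process is an ordinary, auditable data change (bump the version, edit the steps) rather than redrawing a diagram.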

  • It seems to me that the question takes a very techie point of view. If I need to "fix" the process - then it must be broken - does that mean that business goal of the process isn't being met? If that is the case - fix the process ASAP whether or not you are planning to use a BPM system.

    If "fix" means the process is getting the job done, but in a less than optimal way, then why not implement it as-is in a BPM system? If the BPMS isn't agile enough to allow you to easily modify the implementation as you learn more about the real needs of the process - you chose the wrong BPMS.

    • Very well said, Jacob.

    • Jacob,
      you raise an important point. The timing of the 'problem fixing' has an influence on who performs the fixing. If it's during the business process design phase, problem solving will usually be done by the business people. But if a problem or an improvement opportunity presents itself after implementation, it's usually regarded as something for the IT people - even though it may 'only' be a pure business problem.

      The pressure points for the two differ. A process that works technically but doesn't perform (business-wise) exerts less pressure on the IT department than on the business people, just as a technical change will get you a quicker response from IT than from the business.

      Not passing judgement on whether it's right or not, just saying how it's usually perceived and acted upon.

      Otherwise, I'm all with you. BPM is a continuous learning curve that should let you start with a 70% solution and allow you to adjust and work your way upward.


  • Organizational change is difficult: getting people to adopt a new tool and a new process can be a lot to take on.

    Process analysis and optimization is great on paper but we don't really know if it is going to work until we get it into production.

    Oh, and analysis paralysis is an all too common malaise as we seek to tune the most minute inefficiency out of the process.

    Automate and be damned!

    It is infinitely easier to improve an automated process than a manual one. Just make sure you choose an automation tool that makes it easy - 1980's architectures and script driven tools don't cut it - pick a modern, web-services-based tool designed with end-user interfaces in mind and executive dashboards built-in. Don't re-invent the wheel.

  • Optimally, you should document the existing process, identify opportunities for improvement, and design the system to meet your new process specifications prior to implementing a BPMS. From a practical standpoint, however, depending on the number of stakeholders and processes, and to ensure you release something in a timely manner, it may be better to attack the implementation in stages. Users may also be more accepting of the changes if they are not dramatic and you use a phased approach. I suggest you target the process improvements where you are experiencing the most pain, then prioritize other problem areas after the initial implementation.
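The phased approach described above amounts to a simple prioritization: score each candidate process by the pain it causes, take the worst one first, and defer the rest. A hypothetical sketch (the process names and scores are invented for illustration):

```python
# Hypothetical sketch of a pain-driven phased rollout: tackle the
# highest-pain process in phase 1, keep the rest as a backlog.
processes = [
    {"name": "invoice approval", "pain": 8, "stakeholders": 3},
    {"name": "expense reporting", "pain": 5, "stakeholders": 12},
    {"name": "vacation requests", "pain": 2, "stakeholders": 40},
]

# Sort by pain, descending; phase 1 takes the first entry.
phased = sorted(processes, key=lambda p: p["pain"], reverse=True)
phase_1, backlog = phased[0], phased[1:]
print(phase_1["name"])  # → invoice approval
```

In practice the scoring would weigh more than pain (stakeholder count, risk, expected ROI), but the mechanism - rank, release one phase, re-rank after learning from production - stays the same.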

  • Simple answer:

    The only place Automation is ahead of Optimisation is in the dictionary.

    What was the next question?

  • What if AUTOMATION were actually equal to OPTIMIZATION? I guess it's semantics, but if I can automate steps in a manual user task (on the desktop) to, say, cut a job time down from 20 minutes to 5 minutes, is that an automation success or an optimization success?

    The problem with our thinking is that we are all mired in the technology we already know. Until something new comes along, our thinking is constrained. Enter Desktop Analytics: the ability to watch EVERYTHING a user does on their desktop to complete a task. It's the missing piece of the puzzle for soup-to-nuts BPM and task-oriented work, and a real and growing market. The idea is that you can now see and compare every user task: what a user did, where they spent their time, what held them up, what errors they deal with.

    The advantage of Desktop Analytics is that it is agnostic to the application, the servers, cloud or on-premise, legacy or new - it just "sees all". Once you have visibility, you have the ability to automate the obvious manual work, and you keep that visibility (Continuous User Process Improvement) to feed BPM, BAM, and even help IT and the business prioritize projects based upon IMPACT.

  • If processes have to be 'IMPLEMENTED', meaning that quite a lot of techie work is necessary (i.e. drawing flowcharts, coding forms, mapping data into the process, coding interfaces to content, linking rule engines) to make them work, then it is better if the processes are already optimized before the implementation is performed. Note that this will only work for maybe 20% of the processes a business does.

    So what about all the other processes that are dynamic/ad-hoc/adaptive/knowledge work? Can they be discovered by social networking, process mining, and desktop analytics? Nonsense, because process discovery does not mean letting people do whatever they want and then figuring out whether any of it makes sense! What do you want to mine when people execute a rigid flow? There is no new information to be found. Mine the desktop arbitrarily? What nonsense!

    Before you can even communicate about something (i.e. the business), you need an agreed-upon terminology that, as a first step, defines the objectives and outcomes, the value streams or capabilities, the organizational assets (people), the resources (data and content), and then obviously goals and constraints (boundary rules). That is the non-flow part of BPM that has to be done to be able TO DO BPM in the first place.

    Once you have this business architecture definition, then you can assign process owners and their teams to goals and let them loose. Ideally all this happens through technology empowerment which ensures transparency.

    Clearly, to get the necessary adoption, the tool should be perceived as so easy and appealing that people WANT TO USE it. If the target of BPM is to reduce manpower/cost only, then you can just as well do orthodox hardcoded BPM, because you won't get broad support for it anyway. So make your choice ...
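The ordering this reply argues for - define the business vocabulary first, then assign process owners to goals - can be made concrete with a small sketch. This is a hypothetical illustration (the architecture entries, function name, and team names are invented for the example): the architecture definition is plain data, and an owner can only be assigned to a goal that the definition actually contains.

```python
# Hypothetical sketch: an agreed-upon business vocabulary captured as data
# before any flow is modelled, roughly following the reply's ordering.
architecture = {
    "objectives": ["reduce order-to-cash time"],
    "value_streams": ["order fulfilment"],
    "assets": ["sales team", "warehouse team"],
    "resources": ["order data", "invoice content"],
    "goals": {"ship within 48h": "order fulfilment"},    # goal -> value stream
    "constraints": ["no shipment without payment check"],  # boundary rules
}

def assign_owner(owner, goal, arch):
    """Only let an owner loose on a goal the architecture actually defines."""
    if goal not in arch["goals"]:
        raise ValueError(f"undefined goal: {goal}")
    return {"owner": owner, "goal": goal, "value_stream": arch["goals"][goal]}

assignment = assign_owner("fulfilment-team", "ship within 48h", architecture)
print(assignment["value_stream"])  # → order fulfilment
```

The point of the guard is the reply's point: you cannot assign owners and "let them loose" on goals that were never agreed and written down in the first place.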

  • I think there is not necessarily a perfect right or wrong answer on this one; "it depends" really fits. The hardest part of process automation is getting people to agree not only on what a process looks like currently but on what it should look like to be more efficient. Implementing a process as-is can be quicker. Then you can use the BPMS platform and the reporting insights it gives you to re-work the process based on real, valid statistics and measurable insight. To me that makes the most sense most of the time.

  • My initial instinct is to agree, but I have to think about the wide variety of organizations, departments, industries and the like. If a company is in dire need of organization, the daunting task of first configuring everything to utilize a BPM system might turn them off. In moving towards a more streamlined business, a proper BPMS and the staff supporting it should be able and willing to work within the confines of where the organization is, even if there is a jumbled mess of papers filling a room. It comes down to identifying the company's needs and moving forward with the best choices.

