BPM.com
  1. Peter Schooff
  2. BPM Discussions
  3. Tuesday, 11 September 2018
A question Caspar Jans brought up in a discussion: what would you say are the top causes of sub-optimal process performance in a company today?
Among other causes, the main one is the lack of an end-to-end process vision.
Many companies still have business processes broken up by department, following an internal orientation rather than a customer orientation.
Without that vision, companies cannot take full advantage of potential process-performance improvements. In the extreme, each part of the process can be optimized by itself, department by department, while the process as a whole continues to perform badly, due to the poor links between the parts.
  1. more than a month ago
  2. BPM Discussions
  3. # 1
I think Jose touches on the core issue. I would state it somewhat differently. To me, the top cause of sub-optimal process performance is the subordination of the process to local interests. To focus on processes is by definition to take a cross-functional view of things. To optimize the process requires authority over the portions of those local interests that are also part of the process. The failure to do that leads inevitably to sub-optimal process performance.
# 2
I agree with the need for an 'end-to-end' vision but not an end-to-end vision of an inventory of "processes".

The end-to-end vision has to be on 'initiatives' or "incidents", or, perhaps, to generalize, Cases (not Use Cases).

The thing is, corporations do not manage by process (unless we are talking about a totally automated cement manufacturing process).

They manage by objectives, and the only place to park objectives is at a Case, run-time side (technically at a cursor position in an RDBMS, or in Windows File Folders, for those who like to manage and control work this way).

So, given a Case with objectives, you can make the entire corporate process inventory available at the Case as templates; the user threads together what are essentially "process fragments", and the Case Manager periodically assesses progress toward meeting pre-defined Case objectives. You cannot have, plan side, an objective for a process fragment other than to reach an exit node or "the" end node.

What is fascinating is that data collected across Cases can be mined and used by predictive analytics algorithms to suggest, at steps along any process-fragment template instance, the best possible choices for the Case (go this way, go that way).

It follows that repeated acceptance of the algorithm's suggestions gives the process designer a notion of how, when, and where their inventory of processes can be improved.
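As a rough illustration of the idea (step names and functions are hypothetical, not any particular product): mine completed Case traces into transition counts, then surface the most frequently chosen next step at any point in a process fragment.

```python
from collections import Counter, defaultdict

def build_model(case_traces):
    """Count observed step-to-step transitions across completed Case traces."""
    model = defaultdict(Counter)
    for trace in case_traces:
        for current, nxt in zip(trace, trace[1:]):
            model[current][nxt] += 1
    return model

def suggest_next(model, current_step):
    """Return the most frequently observed next step, or None if no history."""
    followers = model.get(current_step)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Three historical Case traces (hypothetical step names)
traces = [
    ["intake", "assess", "plan", "close"],
    ["intake", "assess", "escalate", "close"],
    ["intake", "assess", "plan", "close"],
]
model = build_model(traces)
print(suggest_next(model, "assess"))  # "plan" - the majority choice in history
```

Repeatedly overridden suggestions are exactly the signal described above: they mark the places where the template inventory no longer matches how work is actually done.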

The caveat is that predictive analytics are not a substitute for "eurekas" - some of the best ideas for improving a process can come from searching Kbases of "resources".

In Police Department applications I am involved in, we provide our clients with a means to host several thousand "resource" links - the user can research a topic related to Crime Reduction for a proposed initiative, make a selection, and then implement it. The next step is to issue a policy narrative (no field work ever gets done in the absence of written policy). The final step is to flowgraph the policy narrative as a BPM process that gets added to the process inventory.

I get a lot of backtalk about statements that "the only place to park objectives is at Cases".

The rationale, for me, is solid - corporations fund work by way of annual operating budgets or ROI submission authorizations. You cannot just decide to spend $2MM without top management consent. No need to worry about getting top management "on board" - when you are spending $2MM of their hard-earned money, they are, by definition, on board (or too busy playing golf to care).

We encourage our clients to never approve any ROI submission that does not indicate how, and to what extent, it will support or augment competitive advantage.
# 3
How about 50 years of IT that has failed to recognise the power of people and their processes! Jose articulates the consequences.

IT basically took over control of data processing, even the bookkeeping, as ERP was created claiming to be the one system that does it all… which it clearly does not! Then IT vendors began selling hard-coded, inflexible silo systems addressing specific functions such as Customer Relationship Management, Supplier Case Management, Case Management, Event Management, etc. They are all what is described as "inside out", where users had to adapt to doing what the systems required, and so a "gap" emerged between users and such systems, which resulted in "off-line" activity with spreadsheets, databases and other inventive ways outside the eyes of "assurance".

The concept of the end-to-end process just was not on the IT agenda, or even understood! It was the BPM movement which recognised the need, but it has taken decades for supporting software to emerge, be understood, and be driven by business rather than old IT. The challenge for many remains the "mess" of legacy and the associated fear of failure in trying to change to the new way, which reflects the real world of work!
Comment
@David. Good points but you are preaching to the choir.

It's the fault of top management for boxing in IT (i.e. failing to staff it with a reasonable number of "business analysts" rather than, as we had for one period in time, "digital plumbers").

David Chassels, 3 months ago, #5692
@Karl I think the accounting profession has much to answer for in allowing it to get to this mess .... they should be helping their business colleagues refocus on the core, where data is used and created: people and their processes.....
@David, Yes, indeed

Accounting / Finance has to take some of the responsibility, especially when IT used to be under Accounting.

Truth be told, many traditional Accounting Departments knew about as much about IT as a pig knows about ice-skating.

Today, you typically find IT Departments reporting directly to top management level.

The residual role today for Accounting is in the validation of ROI submissions by operations and, following authorization of an ROI submission, shadowing Case Managers' tracking of expenditure / progress at Cases.
# 4
Given this is my own question, I better also answer it (from my perspective) ;)

To me, one of the main root causes is that most companies either have no process governance in place at all, or have a functionally divided one. If a company does not have process governance in place, optimizing processes becomes a matter of local initiatives combined with functional ownership of processes, and this will almost always lead to sub-optimization because the end-to-end perspective, as Jose Camacho already mentioned, is not taken into account.
Companies in the early stages of process governance typically divide their process collection into functional sub-sets and assign owners to each sub-set. When one process owner wants to optimize his or her processes, they tend to stick to the limits of their own ownership, again leading to sub-optimization.

I think @Karl Walter Keirstead mentioned the word Case, and I find this interesting. If a company takes on an end-to-end perspective, these end-to-end processes tend to be defined as a set of processes that are carried out in sequence (or in parallel) to support one type of situation (or case type). For example, the purchase-to-pay end-to-end process is defined and configured to handle the entire chain of activities for processing a single purchase (all the way up to the payment of the invoice).

Another root cause for sub-optimization, I think, can be found in the misalignment of MBOs (or personal targets) between top management and middle management. Especially the MBOs defined for middle management can evoke behavior that leads to sub-optimization. If you give an Inventory Manager an MBO to reduce inventory by 40%, then this manager could be motivated to take measures that damage the ATP check or the customer engagement score through more frequent stock-outs. The Inventory Manager is, in that case, sub-optimizing his part of the program, but damaging the overall end-to-end process of Order to Cash.
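The trade-off can be made concrete with toy numbers (all figures and function names hypothetical): cutting inventory improves the Inventory Manager's local KPI while worsening the end-to-end Order-to-Cash cost once stock-out penalties are counted.

```python
def local_kpi(inventory_units, holding_cost=2.0):
    """The Inventory Manager's MBO view: holding cost only (lower is better)."""
    return inventory_units * holding_cost

def end_to_end_cost(inventory_units, holding_cost=2.0,
                    demand=100, stockout_penalty=15.0):
    """Order-to-Cash view: holding cost plus a penalty per unit of unmet demand."""
    stockouts = max(0, demand - inventory_units)
    return inventory_units * holding_cost + stockouts * stockout_penalty

# Cutting stock from 100 to 60 units looks great locally, bad end to end
for units in (100, 60):
    print(units, local_kpi(units), end_to_end_cost(units))
```

With these numbers, the 40% cut halves the local KPI but more than triples the end-to-end cost: the MBO rewards exactly the behavior that damages the chain.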
BPM is all about mindset first and toolset later....much later
Comment
@Caspar ... Sure, many purchase-to-pay processes can be one process.

But no two "cases" for customizing the interior of private jet orders are likely to be the same. And no two homicide investigation cases are the same.

I have issues with the word "Case" - we can have a Case for an insurance claim, or for a homicide investigation, but the term "incident" is more appropriate for a call to a police station advising that a traffic light is out of service.

In medicine, Case typically refers to what used to be, for each patient, a manila file folder containing routine visit notes.

I think "customer record" works better than Case for the history of purchases by a customer.

Any better idea for a "run time workspace that hosts process templates, accommodates task execution and auto-builds a history of interventions"?

Caspar Jans, 3 months ago, #5695
I agree with your comments on the usage of the word Case. I typically don't use it; I prefer "process instance" (a term often used in process mining). And yes, two cases for the interior of a private jet are not likely to be the same, but the order in which the decorator will work is often 80-90% the same (though obviously this depends on the level of detail you're looking at it from).
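A small sketch of the process-instance view (toy event log, hypothetical case ids and activity names): group events by case id into one trace per instance, then compare two traces to see how much of their order they actually share.

```python
from collections import defaultdict

# Toy event log: (case_id, activity), already in execution order
event_log = [
    ("jet-001", "design"), ("jet-001", "upholstery"), ("jet-001", "inspect"),
    ("jet-002", "design"), ("jet-002", "cabinetry"), ("jet-002", "inspect"),
]

def traces_by_case(events):
    """Group the event log into one ordered trace per process instance."""
    traces = defaultdict(list)
    for case_id, activity in events:
        traces[case_id].append(activity)
    return dict(traces)

def shared_prefix_ratio(a, b):
    """Fraction of steps two traces share from the start before diverging."""
    shared = 0
    for x, y in zip(a, b):
        if x != y:
            break
        shared += 1
    return shared / max(len(a), len(b))

t = traces_by_case(event_log)
print(t["jet-001"], t["jet-002"])
print(shared_prefix_ratio(t["jet-001"], t["jet-002"]))
```

No two instances are identical, yet much of the structure repeats - which is exactly what makes mining them as variants of one process useful.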

And for the better term: absolutely. I call it an EMS (Enterprise Management System), and it also incorporates process performance measurement and dashboarding if you ask me. For me, an EMS spans the entire spectrum of describing (or modeling, if you will) the company's strategy, org structure, product portfolio, and E2E landscape, all the way down to the individual SOPs, links to automated process execution (if feasible), and more. If you choose the right tooling to support all this, it can all be connected and aligned. #notadream

The "Adaptive Case Management" folks seem to be happy with "ACM" but that term refers to an approach to the management of work as opposed to the name of a workspace and repository for something that needs a focus.

Clearly, one needs different "entities" if you want an EMS where you are, say, managing manufacturing orders and simultaneously managing employee data.

We did a mockup several years ago of all orbiting satellites that included the satellites, launch pads, and launch rockets. Each of these entities had its own RDBMS because of the very different data elements needed per entity. We consolidated data from all three entities into a graphic 3D Kbase so that users could run free-form searches across the entire space.

You can view a screenshot of the Kbase under "Satellite Inventory" at http://civerex.com/pages/civman.html
# 5
Unfortunately, end-to-end process vision is already sub-optimisation in comparison with an enterprise as a system-of-processes. See ref [1]

Thanks,
AS
References
  1. https://improving-bpm-systems.blogspot.com/2014/03/enterprise-as-system-of-processes.html
# 6
One of the most evident but widely neglected reasons for poor process performance in a company is a failure to recognize a process as such. Systematic process mapping is an essential factor for efficient process governance and optimization of business performance. However, organizations often fail to distinguish the processes they run and to capture them as business models.

Another, even more typical and closely related, reason is that a company creates abstract to-be processes and strives to run them while ignoring objective reality. Again, essential mapping of as-is processes is neglected for the sake of an imaginary abstraction, often borrowed from a book or a ready-made framework, disregarding crucial aspects of the specific organization.

Not seldom, workers even artificially simulate externally enforced processes to satisfy inadequate management requirements. This creates additional inefficiency through the friction and time workers lose observing such fake compliance. Fictitious processes are no less destructive for an enterprise than double books.

To avoid such situations, companies must always begin their BPM journey with scrupulous mapping of the existing process landscape and take care to keep the business model current throughout its whole life cycle. This can be achieved only through wide collaboration of analysts, managers and process owners in a contemporary, transparent BPM environment.
# 7
Organizations have processes; people have jobs that pay their bills and feed their kids. So the main cause is that people act based on what they get paid for.

You can set process norms and KPIs, but if they do not match employee goals, they are useless.
Sharing my adventures in Process World via Procesje.nl
Comment
@Emiel, I agree that organizations have processes and people have jobs.

My view is that processes and people come together at Cases where

1) Cases have goals,
2) Case Managers are expected to manage cases (i.e. plot the course/monitor up an "S" curve to the Case goals), and
3) Case Workers are expected to focus on tasks orchestrated by BPM template instances.

I suppose rules help with the interpretation of tasks so that all work contributes to advancement toward Case goals.

Of course, not everything people do at work takes place within or requires a Case.
# 8
Another cause I've seen is low-code. The PowerPoints promise you it's easy to make 16 awesome apps. And that's nice, but 16 awesome apps in a row do not automatically make an awesome process.

To make a process, those apps need to be connected, which often means connecting data sources. And that's the harder part - harder than hacking those awesome apps together.
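The connection problem can be sketched in a few lines (all record and field names hypothetical): two apps each keep their own store, and the process only exists once their records are joined on a shared key - with the unmatched rows showing exactly where the process breaks.

```python
# Two "awesome apps", two data stores, one shared key (field names hypothetical)
crm_orders = [
    {"order_id": "A-17", "customer": "Acme"},
    {"order_id": "A-18", "customer": "Globex"},
]
billing_records = [
    {"ref": "A-17", "invoiced": True},
]

def join_on_order_id(crm_rows, billing_rows):
    """Left-join CRM orders with billing records; unmatched orders show a gap."""
    by_ref = {row["ref"]: row for row in billing_rows}
    joined = []
    for order in crm_rows:
        bill = by_ref.get(order["order_id"])
        joined.append({**order, "invoiced": bill["invoiced"] if bill else None})
    return joined

for row in join_on_order_id(crm_orders, billing_records):
    print(row)  # A-18 surfaces with invoiced=None: the gap between the two apps
```

In real integrations the keys rarely line up this cleanly, which is precisely why connecting the apps is harder than building them.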
Sharing my adventures in Process World via Procesje.nl
# 9
When measured in terms of improved response times, reduced printing, manual errors, and case reprocessing, automated processes can fall short, often because the BPM team fails to convince the stakeholders to "hand over" manual controls and double-control mechanisms to the BPMS. A usual indicator is an insufficient reduction in process touch points when comparing the pre- vs. post-BPM-optimized process. In that sense, "optimization" is more an antonym than anything else, as, very likely, the manual redundancies have been carried over into the automated process model.
There are other culprits as well - like trying to replicate ERP functionality within the BPM layer because it seems more intuitive at first glance.
NSI Soluciones - ABPMP PTY
# 10