Peter Schooff
Sherlock Holmes
BPM Discussions
Tuesday, 15 January 2019
As BPM still means so many things, what parts of BPM would you say are no longer relevant to enterprises today?
A lot of people consider ISO-9000 to be dead in the BPM context.

IDEF0 is probably more dead than alive.
# 1
Jim Sinur
Blog Writer
How about rewording to "What Should Die in BPM"?

1. The attitude that processes are the center of the universe, rather than a great approach to achieving end business outcomes while dealing with change.
2. Processes that provide little or no visibility to process participants, especially clients
3. Processes that are only reactive in nature; they should become proactive
4. Processes with hard-coded rules and sequences
5. Processes that are not guided by dynamic governance
6. Processes that do not include other important automation capabilities like RPA, Process Mining and Machine Learning
7. Processes that are not linked to customer journeys
https://jimsinur.blogspot.com/2017/09/a-you-proud-of-your-customer-experience.html
Caspar Jans, 1 year ago, #5884
Generally I agree with much of what you write about BPM, but this list made me raise my eyebrows slightly.
If you rigorously apply your list to a generic set of business processes from any random company, you end up with only a handful of processes. What would be the reason not to document processes that have no other important automation capabilities in them? RPA, ML and AI are nothing but tools that automate process execution. I have yet to encounter the first RPA implementation that actually automated (through robotics) an entire process; it's typically focused on tasks, and as a result it's a rather irrelevant deciding factor in whether a process should be included in BPM or not.
The same goes for processes that are not linked to customer journeys. I would say that especially some of those processes (indirect procurement, HR, finance, IT in many cases) are candidates to be included in your BPM environment, because these processes need to be standardized as much as possible (and automated if applicable).

Processes might not be the center of the universe, but they are right in the center of the way of working of any business. A process, in my opinion, is one of the only enterprise assets capable of linking together almost all other enterprise resources, and as a consequence it finds itself right in the middle of the sphere of influence. So, from my perspective, there is nothing wrong with this attitude; in fact, more companies should think about this attitude and adopt it (not permanently, but certainly until they have their internal alignment of enterprise resources up and running).
# 2
Agree with Jim on "what should die in BPM" - here's my list:

1. BPM vs ACM debates - the world is far more complex than those two answers would suggest
2. Endless queries about the definition of a business process - the only definition that matters is the one you and your customer agree upon.
3. Religious definitions of the best customer experience solely as "service excellence". The best customer experience differs by customer - for some it is "excellent service", for others it's "fast", for others it's "simple", for others it's "thanks for taking it slow with us", for others it's "the cheapest we can get", for others it's "if you're not expensive, you're not serious". Again, the best definition is what your customer tells you it is, not what you've been preparing your company for. Uncomfortable, right?
CEO, Co-founder, Profluo
# 3
I would still argue against any notion of morbidity in BPM. From a purely technical perspective, I don't really consider any specific component obsolete. On the contrary, as stated in posts toward the end of 2018, many ramifications in the fields of robotics and data science are in fact building on the mature foundation that modern BPMS bring to the table nowadays.
Perhaps specific to emerging markets, I do think that ACM as an evolutionary next step of BPM has lost some traction. As has the idea of BPM being a stand-alone approach for continued operational improvement. It's far more common to see BPM paired with ERP, ECM, BI and even AI initiatives than at any time in the past.
NSI Soluciones - ABPMP PTY
# 4
Why is "process" not the center of the universe, and why should such an attitude fade away?

I propose the opposite, that winners will increase their bets on the discipline and technology of process and process automation.


Because the work of business is work.

And BPM is the technology of the work of business.

There are other unique or irreducible technologies which are complementary to process technology, such as database, UX, business rules, analytics algorithms, interfacing etc. But by definition, BPM technology and only BPM technology is the technology of the work of business.

Every software product can be abstracted for the work that it performs. For example, take an ERP package on one end of the scale or a desktop graphics editor on the other. We use these items of software to perform work. Often the steps of the work are "in our heads". But "in our heads" is not scalable. And often the steps of the work are "implicit in our software". But "implicit in our software" is not scalable either.

So at some point, we make work processes a first-class citizen of our software technology product. We can now "manufacture" the artefacts of work process automation an order of magnitude faster than would be possible if we constructed our implicit workflows in code. And further, change is easier too! Because BPM-based work process artefacts ("workflows") are much more easily evolved. By definition.
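The difference between a workflow "implicit in our software" and a process as a first-class citizen can be sketched in a few lines. The step names and the tiny "engine" below are purely illustrative, not any particular BPMS:

```python
# Hypothetical sketch: a workflow captured as data rather than buried in code.
# Step names and the minimal engine are invented for illustration.

def approve(order):
    order["approved"] = True
    return order

def invoice(order):
    order["invoiced"] = True
    return order

# Implicit workflow: the sequence is hard-coded; changing it means changing code.
def process_order_implicit(order):
    return invoice(approve(order))

# Explicit workflow: the sequence is data; changing it means editing a list.
WORKFLOW = [approve, invoice]

def run_workflow(steps, order):
    for step in steps:
        order = step(order)
    return order

order = run_workflow(WORKFLOW, {"id": 42})
print(order)  # {'id': 42, 'approved': True, 'invoiced': True}
```

Reordering, inserting, or removing a step in `WORKFLOW` needs no code change to the engine, which is the point of making process explicit.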

To say that process is not at the center of the universe of business would be comparable to saying that accounting is also not at the center of the universe of business. They both are. That's just how it is. Accounting is a more mature technology. Work process automation technology is a few hundred years behind accounting.

Want to win in 2019? Double down on mastering BPM software technology.

Here's my multi-part article series on BPM.com which explores these ideas:
BPM Technology As Revolutionary Enabler
A multi-part series presented by BPM.com exploring the reasons why BPM software technology is the most important technology for business transformation.
# 5
What is fading or should fade away where BPM software technology is concerned?

I propose BPM products where process modeling and process execution are deployed via two separate notations on two separate engines should fade away.

This sounds like an esoteric, "techy" topic. But in fact this topic is at the core of the realization of the promise of BPM software.

The problem is this: BPM software demos work great! Executives love them! They imagine accelerating their deployment of great new business ideas. Especially exciting is when you "push the button" and voila, you have a new software application, that would have taken weeks or months otherwise!

If the actual application you've built is deployed on the same engine using the same notation (e.g. executable BPMN), then your inevitable desire to evolve your process -- how about tomorrow! -- will also be very easy to satisfy!

BUT if in fact the deployed software is executing on a different engine, using a different notation, you are now faced with a swamp of problems. You have in fact bought "two-for-one", but in this case, it's no bargain. Because inevitably, your BPM software will need to be fine-tuned for technical details in your existing environment. Your developers (i.e. not your business analysts, unfortunately) will have to fine-tune the usable deployed BPM process. Maybe to fix up database connections, APIs, details of processes etc. And they will, of necessity, make these changes in the deployed process.

Now of course it's easy to see that you have two process models. One model deployed and functioning. And the other original model in the modeling tool itself. The two models are now out of sync. Propagating changes backwards from the deployed model to the original model is technically very difficult. And it is also a cost. This is the so-called "synchronization problem" or the "roundtrip problem". Practically speaking what happens is that the changes rarely get propagated backwards. The original modeling tool which executives loved might as well be an expensive static clone of Visio. And all future changes -- including real business changes -- are now done in the "programmers-only" executable.
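To make the roundtrip problem concrete, here is a minimal, hypothetical sketch of detecting that the deployed model has drifted from the design-time model by fingerprinting both. The XML snippets and function names are invented for illustration:

```python
# Hypothetical sketch: detecting the "roundtrip problem" by fingerprinting
# the design-time model against the deployed, developer-tuned copy.
import hashlib

def fingerprint(model_xml: str) -> str:
    # Normalize whitespace so purely cosmetic edits don't count as drift.
    normalized = " ".join(model_xml.split())
    return hashlib.sha256(normalized.encode()).hexdigest()

design_time = "<process id='claims'><task name='Review'/></process>"
deployed = "<process id='claims'><task name='Review'/><task name='FixDB'/></process>"

if fingerprint(design_time) != fingerprint(deployed):
    print("Models out of sync: deployed changes were never propagated back")
```

A check like this only detects drift; propagating the changes backwards remains the hard (and usually skipped) part.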

I mentioned that this is an esoteric technical problem. It is also the reason that many BPM programs lost momentum. Because over time, the roundtrip issue guarantees that executives will see less and less of the promised productivity advantages of BPM. (It's worth noting that even native executable BPM products often still have to be "fixed up" by developers; in such cases however, the fact that the team is still working on a common process model means that roundtrip issues are vastly diminished.)

The work of Volker Stiehl should be mentioned here. Whatever generation of BPM software you are using, the practice of strictly segregating business logic from technical deployment artefacts is very important: Challenges of Being a BPM Pioneer: Part 4 - Building Work: Deploying Process Automation Artefacts

BPM software technology continues to improve. The math behind BPM engines (which is apparently very difficult) continues to improve. And the result is that the promise of "imagine -- draw -- execute" for process automation is closer and closer. Business executives and business analysts immediately grok the value of new-generation BPM. (It would be a great research note for analysts to do an inventory of the BPM software installed base as to what percentage of deployments fall into which modeling/execution category.) For those who have had disappointing experiences with BPM, chances are you are using older technology, now superseded. Time to try BPM again. It's quite likely your experience will be much better.

I agree “winners will increase their bets on the discipline and technology of process and process automation”

I have a problem though with ". . . the actual application you've built is deployed on the same engine using the same notation" - I can't see the benefit of a vendor putting out only a mapping engine, and I can't see the benefit of a vendor putting out only a run-time process execution engine.

So, the only “winners” are those who roll out plan side AND run-time side products and these folks are perfectly capable of deciding to their advantage or disadvantage whether one engine is needed or whether two engines are needed. User communities need not worry about such details.

As for "notations", the best implementations are the ones that make minimal use of notations - we have itemized flowgraphing needs many times here at this forum, and the essentials, in my view, remain a) nodes, b) directional arcs that link nodes, c) branching decision boxes, d) loopbacks, e) the ability to embed pre-/post-processing rule sets.
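As a rough illustration of those essentials, here is a minimal, hypothetical flowgraph in code, with nodes, directional arcs, a branching decision box, a loopback, and attachment points for pre-/post-processing rule sets. All names are invented, not drawn from any product:

```python
# Hypothetical sketch of the flowgraph essentials: nodes, directional arcs,
# a decision box, a loopback, and pre/post rule-set hooks per node.

graph = {
    "start":   {"next": ["check"],          "pre": [], "post": []},
    "check":   {"next": ["approve", "fix"], "pre": ["validate_input"], "post": []},
    "fix":     {"next": ["check"],          "pre": [], "post": []},   # loopback arc
    "approve": {"next": [],                 "pre": [], "post": ["notify"]},
}

def branch(node, case):
    # Decision box: route based on case data; other nodes follow their arc.
    if node == "check":
        return "approve" if case.get("valid") else "fix"
    targets = graph[node]["next"]
    return targets[0] if targets else None

def run(case):
    node, path = "start", []
    while node:
        path.append(node)
        if node == "fix":
            case["valid"] = True  # the fix step repairs the case
        node = branch(node, case)
    return path

print(run({"valid": False}))  # ['start', 'check', 'fix', 'check', 'approve']
```

An invalid case loops back through "fix" before reaching "approve"; a valid one goes straight through, which covers items a) through e) in a dozen lines.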

The problem that ". . . the two models are now out of sync" has been real, but we now see data mining / predictive analytics being able to overlay a plan-side flowgraph such that the user can now get prompts like ". . . 40% went this way, 60% went that way", which helps dynamic decision-making; nor is it a great problem to auto-generate suggested changes to plan-side flowgraphs.

Probabilistic branching was part of GERT, a network analysis technique used in project management that allows probabilistic treatment of both network logic and the estimation of activity duration. The technique was first described in 1966 by Dr. Alan B. Pritsker of Purdue University and W. W. Happ.
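A GERT-style probabilistic branch can be illustrated with a tiny Monte Carlo simulation; the 40/60 split below echoes the prompt example above and is invented for illustration:

```python
# Hypothetical sketch: estimating branch frequencies at a probabilistic
# decision point by simulation (GERT-style probabilistic branching).
import random

random.seed(7)  # fixed seed so the run is repeatable

def simulate_one():
    # At the decision point, ~40% of cases go "this way", ~60% "that way".
    return "this_way" if random.random() < 0.4 else "that_way"

counts = {"this_way": 0, "that_way": 0}
for _ in range(10_000):
    counts[simulate_one()] += 1

share = counts["this_way"] / 10_000
print(f"{share:.0%} went this way, {1 - share:.0%} went that way")
```

In practice the branch probabilities would come from process mining of historical cases rather than being hard-coded as here.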

RPA is going to dramatically reduce building links to / from APIs.

I have no problem with “BPM technology and only BPM technology is the technology of the work of business” so long as “BPM technology” picks up Resource Allocation, Leveling and Balancing (RALB).
John Morris, 1 year ago, #5883
@Walter -- for sure adding RALB is a major addition to value -- it mashes into simulation as well.
# 6
Interesting discussion here.

I find this to be a difficult question because I don't think there are that many things in BPM that should die in the first place. I always compare BPM to building and maintaining a house. Yes, we do things differently nowadays when building a house, but the main concept is and will remain largely the same. You make a drawing of the house (or your process architecture), and you break down this design into more detailed drawings on plumbing, electricity, walls, floors, windows etc. until you reach the level of detail the contractor needs to actually start building the house (or your employees need to actually start putting a sales order into your ERP system).

The fact that robots will start helping us build a house, does not imply we no longer need a foundation in the ground, or we no longer need support columns left and right. Simply because something has gone out of fashion, doesn't mean it should die.

The fact that the bricks in the walls of your house are in the end no longer visible to the customer (because they have been plastered over) does not mean that we should no longer lay bricks.

BPM is one of the most comprehensive and encompassing concepts out there for a business. It should never dictate the strategy of the company, but it certainly should direct many of the other dynamics going on in a business, such as translating the strategy into an operating model, into end-to-end processes and all the way down to detailed SOPs. It aligns the internal resources and provides governance (static and dynamic) to make sure the aligned state remains aligned, certainly in the light of changing environments (legal, fiscal, logistics, environmental etc).

Alright, I've spoken my mind, back to work now...doing BPM stuff
BPM is all about mindset first and toolset later....much later
@Caspar +1
# 7
BPM is a discipline, a way of thinking about how people work to create new information/data. There is no limitation in this, nor will there be in future, so nothing really "dies". The real issue is how the supporting software delivers, and here is where the evolution inevitably sees dying aspects, much as already described. Low code no code ("LCNC") is where the new begins and the old will be cremated!

John is right that the future is "Imagine-Draw-Execute" (IDE?) with one engine doing it all, including RPA, in the hands of the business, allowing a click and ready to run. This has to be at enterprise level and must always be under the scrutiny of accountability. This already exists and, as many have articulated, is very disruptive, which in effect means any software that does not reach that "dream", as Bill Gates articulated some 10 years ago, will have no future. This is a huge challenge for existing large vendors, as pricing becomes a fraction of the old ways. BPM is and always will be the driver, and it looks like this year could be "interesting" as more supporting LCNC arrives on the scene....?
"Imagine-Draw-Execute" sums it up nicely, unfortunately "IDE" is taken i.e. ".. a source code editor, build automation tools, and debugger."

"This already exists" is accurate.

I agree that in a perfect world of enlightened customers, vendors selling yesterday's methods/tools would need to pull up their socks.

The problem is not all customers are "enlightened" and the vendors (they should by now know who they are) have no incentive to help such customers to become enlightened.

So, for 2019, by all means, bring on LCNC but my view is that the current state of affairs will continue for another few years.

LCNC needs to be supplemented by data mining, predictive analytics and process map overlays to 1) improve user-level run-time workflow/workload decision-making and 2) make life easier for folks whose job it is to engage in "continuous process improvement".
# 8
Karl I think Map is a better description than Draw so "IME" ?

You hit on a real issue about enlightened customers, and sadly the accounting profession over the past 50 years has lost to "IT" the role of understanding and ensuring accountability for data creation. The industry analysts are "too close" to vendors, so we have this real gap in helping the enlightenment of the business customer. The arrival of Low Code No Code (LCNC) opens the door for a professional approach by accountants to take on this important role.....I am working on it!!

LCNC does need to use data from any source, and built-in RPA is essential, which may be direct or plug into a data mining tool. It's not only predictive analytics; I also see more intelligent processes which can dynamically present relevant options based upon previous decisions etc; all of which helps the user experience and productivity.

It is vital that any LCNC delivers the ability to readily change the business process as circumstances inevitably change. This must be in the hands of the business, not IT, who of course remain in support for secure delivery etc. The effect of the LCNC movement is that the business buyer makes a future-proof investment and empowers its workforce, who can now directly input new ideas on how best to achieve the outcomes. One of our early adopters is now in its 18th year with basically the same application, but it looks quite different, as it has been able to readily change as required.

As for 2019 well I am an optimist ……. !
@David... Interesting . . . . "LCNC does need to use data from any source"

Agree. Important that any/all such data arrive at the LCNC platform via "official" interfaces to avoid bypass of security AND to allow rule sets to operate on incoming data.

As for data exports, many industries need to operate under "need-to-know" rules because a data exchanger can have hundreds of subscribers, most of which do not need all of the data elements that a publisher may be willing to share.

The other thing is that no incoming data should arrive at the LCNC platform without a date/timestamp/'signature' recording in the Case History, with easy access to what data was imported. It's good practice not to confirm receipt of any data from the outside world until your LCNC confirms that the date/timestamp/signature/data processing is complete.

Some data exchangers report "done" on "read" which I find unacceptable - in medicine, a primary rule is that if an intervention is not in the chart, it did not take place.
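The record-then-acknowledge practice described above can be sketched as follows. This is a hypothetical, in-memory stand-in for a Case History, not any particular platform's API:

```python
# Hypothetical sketch: record incoming data in the Case History
# (timestamp + content signature) BEFORE confirming receipt.
import hashlib
from datetime import datetime, timezone

case_history = []  # in-memory stand-in for a persistent Case History

def receive(payload: bytes) -> str:
    entry = {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "signature": hashlib.sha256(payload).hexdigest(),
        "data": payload,
    }
    case_history.append(entry)          # record first...
    return "ACK " + entry["signature"]  # ...acknowledge only after recording

ack = receive(b'{"patient": 123, "intervention": "dose given"}')
```

The acknowledgement carries the content signature, so sender and receiver can later agree on exactly what was received and when; "done on read" offers no such evidence.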

The only problem with shipping out data for remote processing and getting back processing results is that, whereas you can snapshot the export and the import, unless you are the author of the engine that did the processing and you retain copies of back releases of the executables, you may never be able to re-run the export data and get back the same processing results. This presents a challenge to blockchain.
# 9