Peter Schooff
BPM Discussions
Tuesday, 04 April 2017
With bpmNEXT coming up in two weeks in Santa Barbara it seemed a good time to ask: What do you think will be the Top 3 most disruptive technologies to impact BPM (specifically BPMS platforms and BPM vendors) over the next 12-18 months, and why?
I think the relatively new Decision Model and Notation (DMN) will make an important contribution to the world of BPM, bringing the concepts and technologies of BPM suites and business rules engines closer together. In that sense, with DMN a more complete, standardized and actionable blueprint of business processes can be achieved within a single framework – and yes – within a single suite (e.g. Camunda).
In my mind, that naturally extends to the Case Management Model and Notation (CMMN) as well, for BPMS with an "ACM flavor" or for pure ACM players.
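Purely to illustrate the decision-table idea DMN standardizes (real DMN models are XML documents with a "first" hit policy among others, executed by engines such as Camunda, not Python), here is a rough sketch with invented rules and figures:

```python
# Minimal sketch of a DMN-style decision table with a "first" hit policy:
# the output of the first rule whose conditions all match wins.
# Rules and values below are hypothetical, for illustration only.

def evaluate(rules, **inputs):
    """Return the output of the first rule whose conditions all match."""
    for conditions, output in rules:
        if all(cond(inputs[name]) for name, cond in conditions.items()):
            return output
    return None  # no rule matched

# Hypothetical discount decision: inputs are order total and customer type.
discount_rules = [
    ({"total": lambda t: t >= 1000, "customer": lambda c: c == "gold"}, 0.15),
    ({"total": lambda t: t >= 1000}, 0.10),
    ({"total": lambda t: t < 1000}, 0.0),
]

print(evaluate(discount_rules, total=1200, customer="gold"))  # 0.15
print(evaluate(discount_rules, total=500, customer="new"))    # 0.0
```

The point of pulling such tables out of process models is exactly the one made above: the decision logic becomes a standardized, inspectable artifact instead of being buried in proprietary rule code.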
From a BPMS market point of view, I think we will continue to see a strong trend towards cloud-based SaaS offerings from all the important players as the go-to standard. Niche players will increasingly focus on vertical, process-based solutions (as opposed to selling BPMS platforms) – as we do at NSI with our human resource management platform (GAP).
I strongly believe that BPM solutions, frameworks and platforms will become progressively more "mainstream-accessible", which should provide fertile ground for additional innovations (such as we saw at the form level with the wizard-driven, WYSIWYG, low-code form builders many platforms offer). Maybe also more detailed and fully realized process templates (not just the old "hire and P/O" process demos).
NSI Soluciones - ABPMP PTY
1. Process Mining: Leverage system data to get insights into what is actually happening, speeding up change initiatives dramatically and helping to prioritize improvement / transformation opportunities.
2. Combining Customer Journey maps with the "classic" process/decision/KPI/org/application perspective: This helps to focus on the right things in operations to deliver on the promise of better customer experience.
3. "Less technical" productivity enhancers, like cloud workflow and RPA: Sometimes less rigorous solutions, or even a simple band-aid, are a valid alternative to 12-24 months of implementing the perfectly engineered process.
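As a rough sketch of what point 1 means in practice (toy event log with invented case IDs and activity names; real process mining tools work from similar but much larger logs), variants can be mined by grouping a log's events into per-case traces and counting the distinct sequences actually executed:

```python
from collections import Counter

# Toy event log: (case_id, activity) pairs, assumed already in time order.
event_log = [
    ("c1", "receive"), ("c1", "check"), ("c1", "approve"),
    ("c2", "receive"), ("c2", "check"), ("c2", "reject"),
    ("c3", "receive"), ("c3", "check"), ("c3", "approve"),
]

# Group events into per-case traces.
traces = {}
for case_id, activity in event_log:
    traces.setdefault(case_id, []).append(activity)

# Count process variants: distinct activity sequences that actually happened.
variants = Counter(tuple(t) for t in traces.values())
for variant, count in variants.most_common():
    print(" -> ".join(variant), count)
```

Even this trivial count shows "what is actually happening" (two approvals, one rejection) rather than what the designed process diagram claims.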
Co-founder & CEO of Signavio
Let's find the technologies that are likely to have a "disruptive impact" on the BPMS market over the next 18 months. This question is important for many reasons, including the question of which technologies are a good investment, or a risk-mitigating investment, now.

(For this question, we should define "disruptive". Disruption can be defined as change likely to result in changed user or vendor business models and/or changed technology or business market structure, as opposed to incremental change that does not result in clear business model or market structure changes. Geoffrey Moore, among others, has written extensively on this topic, labeling these two types of change as "sustaining" and "disruptive". In either case, such change is very important to business people; disruptive change simply comes with a much higher level of risk.)

So, to answer the original question, let's start from the set of current technologies with the following characteristics:


Disruptive technologies should first of all be (A) relevant to BPM, AND (B) in play now (i.e. on the Hype Cycle), AND (C) sufficiently mature to be adoptable by "early majority" users, not just "innovators" (i.e. the technology has moved along the Technology Adoption Curve).

Then we apply the following three questions:

1. Does the candidate technology have a major impact as technology? (Use case)
2. Does the candidate technology have a major impact on business? (Business case)
3. Is the cost-to-implement low? (Manufacturability)

Any technology that meets these first three criteria is a compelling technology for today, especially as a "sustaining innovation". However, if a technology is to be defined as truly "disruptive", it has to meet one more criterion:

4. Does the technology impact business models and/or market structure? (Disruptive impact)


On this basis, I see one strong candidate and one possible candidate, at least as sustaining innovation technology investments:

a) DMN technology is a strong candidate as a sustaining technology innovation: pulling decisioning out of proprietary BPMS rule kludges can make process models much easier to wrangle, the business impact of codifying decisioning can be significant, and the technology is comparatively easy to implement.

b) Event Stream Processing is also a possible candidate. IoT momentum, with cheap sensors and devices, is rapidly escalating the volume of events pumped into systems, and without proper handling the inevitable result is alarm fatigue, which shows up in many sectors. The technology is available (and related to DMN, ML and AI) to handle this use case, and the business case is pressing. Whether the market comes together with end users to take advantage is an open question, because event stream processing (a.k.a. codified business analysis for the information flood) is not quite as easy as DMN and decisioning.

Even if both DMN and ESP are attractive and timely technology investments now, are they only so as sustaining investments? Can we see them as truly disruptive? I would say "yes", but the argument as to why goes beyond this reply. And any such disruption will surely take longer than 18 months.

That's it: just two technologies, in terms of making a significant impact even just as sustaining innovation technology investments. Many other promising technologies are either not mature, not so easy to implement, or without much of a business case yet.

I would really like to see process mining as having major short-term impact, but I fear the organizational challenges ("cost-to-implement") make only incremental change possible with the technology. Also, process mining is likely earlier in the Hype Cycle, i.e. prior to mass adoption by early-majority TAC segment users.
@John, Civerex is getting into infrastructure protection, where you may have 10 technology feeds (e.g. sonar, radar, facial recognition, vibration detection, heat, proximity, 360 video, incoming airborne), some sending out a continuous stream of data (4K video), others only sending out a test signal every 10 seconds that is acknowledged so both sender and receiver know they are still talking to each other.

There are several reasons "event stream processing (a.k.a. codified business analysis for the information flood) is not quite as easy as . . ."

One of these is the time needed to process the signals/data streams. Facial recognition, for example, is a challenge when you have 100 people in a crowd, all moving around: the bad actors are likely to be out of range by the time you figure out who they are. It's like starting to load your shotgun when you spook a bird.

Since the various technology feeds overlap, you certainly need to avoid alarm fatigue, because at time "t+5" you can easily get an alarm you already received at "t+2" from a different technology feed and have already taken action on.

So we need fairly smart rules – i.e. don't act on the rule that says "call city hall to complain about siren noise" when you earlier called the fire department to report a fire.
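A minimal sketch of that consolidation idea, deduplicating same-incident alarms across feeds within a time window and suppressing incidents already acted on (incident keys, feed names and the window size are all hypothetical):

```python
# Sketch: consolidate alarms from overlapping sensor feeds so one incident
# triggers action only once. Names and thresholds are invented.

DEDUP_WINDOW = 5  # seconds: same-incident alarms within 5s count as one report

class AlarmConsolidator:
    def __init__(self):
        self.last_seen = {}   # incident key -> time of most recent alarm
        self.handled = set()  # incidents already acted on

    def accept(self, incident, feed, t):
        """Return True if this alarm should trigger action, False if suppressed."""
        last = self.last_seen.get(incident)
        if last is not None and t - last <= DEDUP_WINDOW:
            return False  # duplicate report from another feed within the window
        self.last_seen[incident] = t
        if incident in self.handled:
            return False  # smart-rule suppression: action was already taken
        self.handled.add(incident)
        return True

c = AlarmConsolidator()
print(c.accept("perimeter-breach", feed="radar", t=2))  # True: act on it
print(c.accept("perimeter-breach", feed="video", t=5))  # False: duplicate at t+3
```

The two suppression paths correspond to the two cases above: the same alarm arriving at "t+5" from a different feed, and a rule that should not fire because an earlier action already covered the incident.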

Some of our apps have infrastructures where threats can come from water, land, underground and from the air. No one technology feed is adequate.

You may have seen the USAF experiment where they dropped 150 micro-drones that have the smarts to swarm like bees, going off into clusters when under attack and then huddling when they are on the attack. That puts to rest the folks who went on record saying drones cannot deliver enough of a payload to cause major harm: if I can launch 150, why not 300, or 600?

Anyway, we are planning to have BPM decision trees as the main processing capability for consolidating status in the command center software we plan to put on the market.

Seems to me that if you draw a process map horizontally, where time is an implied variable, then drawing the same map vertically gives you a decision tree with nodes/arcs/attached forms/data/rules, except that most of the steps have a routing of "system" instead of humans.
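That vertical view might be sketched roughly like this (the node structure, rules and routing labels are invented for illustration):

```python
# Sketch of a process step as a decision-tree node with a guard rule and a
# routing that is usually "system" rather than a human. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    routing: str = "system"   # "system" or "human"
    rule: callable = None     # guard deciding whether this branch fires
    children: list = field(default_factory=list)

def walk(node, data, path=None):
    """Follow the first child whose rule accepts the data; return visited names."""
    path = (path or []) + [node.name]
    for child in node.children:
        if child.rule is None or child.rule(data):
            return walk(child, data, path)
    return path

tree = Node("sensor event", children=[
    Node("raise alarm", rule=lambda d: d["severity"] >= 7),
    Node("log and park", rule=lambda d: d["severity"] < 7),
])
print(walk(tree, {"severity": 8}))  # ['sensor event', 'raise alarm']
```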

We figure this project is going to hit up against a few of the traditional BPM boundaries.
John Morris
@Karl -- Wow, your Civerex project covering infrastructure protection and multiple sensor feeds is on the cusp of market need and technology!
You may have seen my article on alarm fatigue from 2014: https://www.linkedin.com/pulse/20140730013817-1524359-what-manufacturing-iot-project-leaders-can-learn-from-healthcare -- interestingly it starts from the world of healthcare with which you are already familiar. My analysis is this: "data is cheap and business analysis is expensive" -- and the almost inevitable result is project failure through alarm fatigue. It very much seems that you are tackling this head on!
Many healthcare alarms can simply be detected and parked for future use IF you have pre-processing at BPM steps.

An incoming BP reading of 160/100 might require immediate action for a patient with a history of 110/70, but not for one with a history of 150/100 readings.
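A minimal sketch of that baseline idea, alarming on deviation from the patient's own history rather than on a fixed threshold (the delta thresholds and readings are invented):

```python
# Sketch: flag a blood-pressure reading only when it jumps well above the
# patient's own baseline. Thresholds and data are hypothetical.

def needs_action(reading, history, systolic_delta=30, diastolic_delta=15):
    """Compare a (systolic, diastolic) reading against the mean of history."""
    base_sys = sum(s for s, d in history) / len(history)
    base_dia = sum(d for s, d in history) / len(history)
    sys_, dia = reading
    return sys_ - base_sys >= systolic_delta or dia - base_dia >= diastolic_delta

# Patient with a 110/70 history: 160/100 is a large deviation, so act.
print(needs_action((160, 100), [(110, 70), (112, 72), (108, 68)]))   # True
# Patient who habitually reads near 150/100: same reading, park the alarm.
print(needs_action((160, 100), [(150, 100), (152, 98), (148, 102)])) # False
```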

On the admin side of any industry app, another advantage of pre-processing is this: where a less sophisticated rule set would declare all of the data elements in an address block "mandatory", a smarter rule set can respond to a missing zip code with a simple prompt: "zip code missing - would you like to input it now?" If the user responds NO, later on, when processing gets to a step called "mail letter", pre-processing will cause a hard stop; many times the missing info will have been entered in the meantime, so the hard stop is averted.

The option where you hold the user hostage at the form, waiting for the missing zip code, or hold up the rest of the address data from being recorded, is impractical.
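The soft-prompt-then-hard-stop behaviour described above might be sketched like this (the field names, step name and rule set are hypothetical):

```python
# Sketch: "soft" validation at data entry, deferred hard stop at execution.
# All names below are invented for illustration.

SOFT_AT_ENTRY = {"zip_code"}  # fields the user may skip on the form itself

def validate_entry(form):
    """At data entry: warn about skipped soft fields, never block the user."""
    return [f"{f} missing - would you like to input it now?"
            for f in sorted(SOFT_AT_ENTRY) if not form.get(f)]

def validate_step(step, form):
    """At execution time: hard-stop a step whose required data is still missing."""
    required = {"mail_letter": {"street", "city", "zip_code"}}.get(step, set())
    missing = sorted(f for f in required if not form.get(f))
    if missing:
        raise ValueError(f"hard stop at '{step}': missing {missing}")

form = {"street": "123 Main St", "city": "Panama City"}
print(validate_entry(form))         # soft prompt only; processing continues
form["zip_code"] = "0801"           # often filled in before the step runs
validate_step("mail_letter", form)  # no hard stop needed now
```

The same required-field list is enforced twice, but only the step that actually consumes the data can block on it.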
"Disruptive" is a great word, but so often misused that it could be misunderstood.
I will use it here as Wikipedia defines a Disruptive Innovation:

A disruptive innovation is an innovation that creates a new market and value network and eventually disrupts an existing market and value network, displacing established market leading firms, products and alliances.

In this context, I think that:

  • Low-code (or no-code) BPM suites are creating a new market for small and medium businesses that couldn't afford traditional enterprise BPM solutions and now have an affordable option (in time and money).
  • IFTTT technologies are a threat to established BPM vendors; if their massive adoption continues, they may grow in features and functionality, displacing established market-leading firms.
  • Decision Model and Notation (DMN): I agree with Kay Winkler and John Morris – it is creating a new market, and if it gains momentum it could become very disruptive.

Best for all!
CEO at Flokzu Cloud BPM Suite
The majority of BPM-suite tools try to be leaders in many dimensions at the same time, e.g. application development environment, data integration, AI, process modelling and process execution. Because some of these dimensions are not the "core business" of BPM, there is certainly a "loss-of-focus" situation. Other negative factors are the lack of proper standards in the BPM domain, monolithic application architectures and artificial barriers between BPM and business architecture. All of these make BPM-suite tools/platforms rather vulnerable.

Potential directions for disruptive technologies:

- Better application architecture via the use of events, microservices, serverless computing, microflows, etc. See ref. 1, point #5.

- Platformisation of enterprise tools for data modelling, UX, visualization, etc. makes it unnecessary to have them in each BPM-suite tool.

- A boom of coordination techniques other than the "flow-chart", such as IFTTT, RPA and event streams.

- Bridging the gap between illustrative and machine-executable models (DMN is an example) also removes some "non-core" load from BPM-suite tools and enables software-defined enterprises. See ref. 2.

  1. https://thenewstack.io/serverless-architecture-five-design-patterns/
  2. http://bpm.com/bpm-today/blogs/1157-executable-architecture-of-software-defined-enterprises
John Morris
Powerful insight regarding the on-going "disaggregation" of "all-in-one" technology packages, BPM technology included. Such a process is also a "technology impact" – and even a potential guide to investment (or disinvestment). Sometimes market tipping points sneak up on us, and then we wake up one day and find that "everything is different".
AI/Machine Learning. Whether built into a BPM platform, or as a third-party product integrated with the BPMS, AI capabilities will become increasingly important in avoiding potential risk, determining next best actions, and streamlining end-to-end processes.

From there it's only a short hop to Skynet, but we can still enjoy it while it lasts.
Scott's opinions only. Logo provided for identification purposes only.
The global distribution of simplified software that supports all BPM needs without the need to code. As discussed previously, such capability is contained in a platform that business can understand and have confidence will support inevitable change. It will be very disruptive, as it will apply to large organisations at SMB pricing... looking good for the next 12 to 18 months.
OMG may be the problem holding up the move on from BPMN and other clearly outdated standards for workflow, etc. There is a much quicker and simpler way to build, well proven, as I have articulated...
(This was not a comment on the text of another user, so I replaced my reaction with one on Peter's question.)
We have standards maintained by the OMG for workflow and rules: BPMN and DMN. What we are missing is a standard for forms; this could be in JSON format. I am building a BPMS at the moment with a form builder that persists form definitions (including field- and form-level validations) in JSON. These definitions are rendered at runtime via websockets (the BPMS is fully web based). The websockets are used for soft real-time communication, using so-called "green threads" (https://joearms.github.io/2013/04/02/Red-and-Green-Callbacks.html); I'm not using node.js for this. Process info is sent in real time to clients, broadcast by the server, so no F5 is needed to see current info on a task overview, for example. The server push also notifies users of tasks that are near their due date. The environment I'm using is perfectly suited for a distributed BPMS and easy fault tolerance.
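As a rough sketch of such a JSON form definition with field-level validations enforced at runtime (the schema shape here is invented; as noted above, no standard for forms exists yet):

```python
import json

# Hypothetical JSON form definition persisted by a form builder, with
# field-level validations applied at runtime. Schema shape is illustrative.
form_def = json.loads("""
{
  "name": "leave_request",
  "fields": [
    {"id": "employee", "type": "text", "required": true},
    {"id": "days", "type": "number", "required": true, "min": 1, "max": 30}
  ]
}
""")

def validate(form_def, values):
    """Apply the persisted field-level validations to submitted values."""
    errors = []
    for f in form_def["fields"]:
        v = values.get(f["id"])
        if f.get("required") and v in (None, ""):
            errors.append(f"{f['id']}: required")
        elif f["type"] == "number" and v is not None:
            if v < f.get("min", float("-inf")) or v > f.get("max", float("inf")):
                errors.append(f"{f['id']}: out of range")
    return errors

print(validate(form_def, {"employee": "Ada", "days": 45}))  # ['days: out of range']
```

Because the definition is plain data, the same document can drive both server-side validation and client-side rendering over the websocket channel described above.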