In my mind, that naturally extends to the Case Management Model and Notation (CMMN), whether for BPMS vendors with an "ACM flavor" or for pure ACM players.
From a BPMS market point of view, I think we will continue to see a strong trend towards cloud-based SaaS offerings from all the important players as the go-to standard. Niche players will increasingly focus on vertical, process-based solutions (as opposed to selling BPMS platforms) – as we do at NSI with our human resource management platform (GAP).
I strongly believe that BPM solutions, frameworks and platforms will become progressively more "mainstream-accessible", which should provide fertile ground for further innovation (as we saw at the form level with the wizard-driven, low-code WYSIWYG form builders many platforms offer). Perhaps we will also see more detailed, fully realized process templates (not just the old "hire" and "P/O" process demos).
2. Combining Customer Journey maps with the "classic" process/decision/KPI/org/application perspective: This helps to focus on the right things in operations to deliver on the promise of better customer experience.
3. "Less technical" productivity enhancers, like cloud workflow and RPA: Sometimes less rigorous solutions, or even a simple band-aid, are a valid alternative to 12-24 months of implementing the perfectly engineered process.
(For this question, we should define "disruptive". Disruption can be defined as change likely to result in changed user or vendor business models and/or changed technology or business market structure - as opposed to incremental change that does not produce clear business model or market structure changes. Geoffrey Moore, among others, has written extensively on this topic, labeling these two types of change "sustaining" and "disruptive". Either kind of change is very important to business people; disruptive change simply comes with a much higher level of risk.)
So, to answer the original question, let's start from the set of current technologies with the following characteristics:
Disruptive technologies should first of all be (A) relevant to BPM, AND (B) in play now [i.e. "on the Hype Cycle"], AND (C) sufficiently mature to be adoptable by "early majority" users, not just "innovators" [i.e. the technology has moved along the Technology Adoption Curve].
And we apply the following three questions:
1. Does the candidate technology have a major impact as technology? (Use case)
2. Does the candidate technology have a major impact concerning business? (Business case)
3. Is the cost-to-implement low? (i.e. manufacturability)
Any technology that meets the first three criteria is a compelling technology for today, especially as a "sustaining innovation". However, if a technology is to be called truly "disruptive", it has to meet one more criterion:
4. Does the technology impact business models and/or market structure? (i.e. disruptive impact)
On this basis, here are one strong candidate and one possible candidate, at least as sustaining-innovation technology investments:
a) DMN technology is a strong candidate as a sustaining innovation: pulling decisioning out of proprietary BPMS rule kludges can make process models much easier to wrangle, the business impact of codifying decisioning can be significant -- and the technology is comparatively easy to implement.
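As a hedged illustration of what "pulling out decisioning" means in practice, here is a minimal Python sketch of a DMN-style decision table evaluated outside any process engine. The table contents, field names, and discount values are invented for this example; a real DMN model would be authored in a modeling tool and executed by a decision engine.

```python
# Minimal sketch of a DMN-style decision table kept outside the process
# logic. Rules and values below are illustrative, not from a real model.

DISCOUNT_TABLE = [
    # Each row is one rule: inputs on the left, output on the right.
    {"customer_type": "gold",   "min_total": 1000, "discount": 0.15},
    {"customer_type": "gold",   "min_total": 0,    "discount": 0.10},
    {"customer_type": "silver", "min_total": 1000, "discount": 0.05},
]

def decide_discount(customer_type: str, order_total: float) -> float:
    """First-hit policy: return the output of the first matching rule."""
    for rule in DISCOUNT_TABLE:
        if rule["customer_type"] == customer_type and order_total >= rule["min_total"]:
            return rule["discount"]
    return 0.0  # default output when no rule matches
```

Because the table is data rather than code, business analysts can review and change the decisioning without touching the process model itself.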
b) Event Stream Processing is also a possible candidate. IoT momentum with cheap sensors and devices is rapidly escalating the volume of events pumped into systems. And without proper handling the inevitable result is alarm fatigue, which shows up in many sectors. The technology is available (and related to DMN, ML and AI) to handle this use case. And the business case is pressing. Whether the market comes together with end-users to take advantage is an open question because event stream processing (a.k.a. codified business analysis for the information flood) is not quite as easy as DMN and decisioning.
Even if both DMN and ESP are attractive and timely technology investments now, are they only so as sustaining investments? Can we see them as truly disruptive? I would say "yes", but the argument as to why goes beyond this reply. And also, for sure any such disruption will take longer than 18 months.
That's it, just two technologies, in terms of making a significant impact even just as sustaining-innovation technology investments. Many other promising technologies are either not mature, not so easy to implement, or without much of a business case yet.
I would really like to see process mining having major short-term impact, but I fear the organizational challenges (the "cost-to-implement") make only incremental change possible with the technology. Also, process mining is likely earlier on the Hype Cycle, i.e. prior to mass adoption by "early majority" users on the Technology Adoption Curve.
There are several reasons "event stream processing (a.k.a. codified business analysis for the information flood) is not quite as easy as . . ."
One of these is the time needed to process the signals/data streams. Facial recognition (FR), for example, is a challenge when you have 100 people in a crowd all moving around: the bad actors are likely to be out of range by the time you figure out who they are. It's like starting to load your shotgun when you spook a bird.
Since the various technology feeds overlap, you certainly need to avoid alarm fatigue: at time "t+5" you can easily get an alarm you already received at "t+2" from a different technology feed and have already acted on.
So, we need fairly smart rules - i.e. don't act on the rule that says "call city hall to complain about siren noise" when you earlier called the fire department to report a fire.
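As a sketch of such a suppression rule, the following uses invented alarm names and an assumed time window; a real system would need far richer correlation logic:

```python
# Hypothetical alarm-suppression rules: a later alarm is redundant if a
# related alarm was already received and acted on within a time window.

SUPPRESSED_BY = {
    # alarm type -> earlier alarm types that make it redundant
    "siren_noise_complaint": {"fire_reported"},
}

def should_raise(alarm_type, acted_on, now, window=300):
    """acted_on: list of (alarm_type, timestamp) pairs already handled."""
    blockers = SUPPRESSED_BY.get(alarm_type, set())
    for earlier_type, ts in acted_on:
        if earlier_type in blockers and now - ts <= window:
            return False  # already handled via a different technology feed
    return True
```

The point is that the suppression relationship itself ("siren noise is explained by a reported fire") is codified business analysis, which is exactly the expensive part.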
Some of our apps have infrastructures where threats can come from water, land, underground and from the air. No one technology feed is adequate.
You may have seen the USAF experiment where they dropped 150 microdrones that have the smarts to swarm like bees, going off into clusters when under attack and then huddling when they are on the attack. That puts to rest the folks who went on record saying drones cannot deliver enough of a payload to cause major harm. If I can launch 150, why not 300, or 600?
Anyway, we are planning to make BPM decision trees the main processing capability for consolidating status in the command-center software we plan to bring to market.
It seems to me that if you draw a process map horizontally, where time is an implied variable, then drawing the same map vertically gives you a decision tree with nodes/arcs/attached forms/data/rules - except that most of the steps have a routing of "system" instead of humans.
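A rough sketch of that idea in Python - nodes carrying a routing attribute and a rule that picks the next step. Everything here (class shape, names, the triage example) is invented purely for illustration:

```python
# Decision-tree view of a process map: each node has a routing ("system"
# or "human") and a rule that selects which child fires next.

class Node:
    def __init__(self, name, routing="system", rule=None, children=None):
        self.name = name
        self.routing = routing
        self.rule = rule              # rule(context, children) -> chosen child
        self.children = children or []

def walk(node, context):
    """Descend from the root, letting each node's rule route the case."""
    path = [(node.name, node.routing)]
    while node.children:
        node = node.rule(context, node.children) if node.rule else node.children[0]
        path.append((node.name, node.routing))
    return path

# Tiny example: a triage step routes severe cases to a human.
tree = Node(
    "triage",
    rule=lambda ctx, kids: kids[0] if ctx["severity"] > 5 else kids[1],
    children=[Node("escalate", routing="human"), Node("auto_close")],
)
```

Note that only one node in this example routes to a human - consistent with the observation that most steps carry a "system" routing.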
We figure this project is going to hit up against a few of the traditional BPM boundaries.
An incoming BP (blood pressure) reading of 160/100 might require immediate action for a patient with a history of 110/70, but not if the patient has a history of 150/100 readings.
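That kind of patient-relative rule could be sketched as follows. The deltas are invented assumptions for illustration, not clinical guidance:

```python
# Alert on deviation from the patient's own baseline rather than on an
# absolute cutoff; thresholds here are illustrative only.

def bp_alert(systolic, diastolic, baseline_sys, baseline_dia,
             delta_sys=30, delta_dia=20):
    """True when the reading deviates markedly from the patient's history."""
    return (systolic - baseline_sys >= delta_sys or
            diastolic - baseline_dia >= delta_dia)
```

Against a 110/70 baseline, a 160/100 reading trips the rule; against a 150/100 baseline the same reading does not.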
On the admin side of any industry app, another advantage of pre-processing: where a less sophisticated rule set would declare all of the data elements on an address block "mandatory", a smarter rule set can respond to a missing zip code with a simple prompt: "zip code missing - would you like to input it now?" If the user responds NO, pre-processing will cause a hard stop only later, when processing reaches a step called "mail letter" - and many times the missing info will have been entered in the meantime, so the hard stop is averted.
The alternative - holding the user hostage at the form waiting for the missing zip code, or holding up all of the rest of the address data from being recorded - is impractical.
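The two-stage approach described above might look like this (the step and field names are invented for illustration):

```python
# Two-stage validation: a soft prompt at form entry, and a hard stop only
# at the step that actually needs the field.

REQUIRED_AT_STEP = {"mail_letter": ["zip_code"]}

def validate_on_entry(form_data):
    """Soft check: report missing fields but let the user continue."""
    prompts = []
    if not form_data.get("zip_code"):
        prompts.append("zip code missing - would you like to input it now?")
    return prompts  # informational only, no hard stop

def validate_at_step(step, form_data):
    """Hard check: block only when the step truly needs the data."""
    missing = [f for f in REQUIRED_AT_STEP.get(step, []) if not form_data.get(f)]
    return missing  # a non-empty list means a hard stop
```

Declaring requirements per step, rather than per form, is what lets the rest of the address data be recorded immediately.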
You may have seen my article on alarm fatigue from 2014: https://www.linkedin.com/pulse/20140730013817-1524359-what-manufacturing-iot-project-leaders-can-learn-from-healthcare -- interestingly it starts from the world of healthcare with which you are already familiar. My analysis is this: "data is cheap and business analysis is expensive" -- and the almost inevitable result is project failure through alarm fatigue. It very much seems that you are tackling this head on!
- John Morris
- 2 months ago
I will use "disruptive" here as Wikipedia defines disruptive innovation:
A disruptive innovation is an innovation that creates a new market and value network and eventually disrupts an existing market and value network, displacing established market leading firms, products and alliances.
In this context, I think that:
- Low-code (or no-code) BPM suites are creating a new market among small and medium businesses that couldn't afford traditional enterprise BPM solutions and now have an affordable option (in time and money).
- IFTTT technologies are a threat to established BPM vendors, and if their massive adoption continues, they may grow in features and functionality, displacing established market-leading firms.
- Decision Model and Notation (DMN): I agree with Kay Winkler and John Morris - it is creating a new market, and if it gains momentum it could become very disruptive.
Best for all!
Potential directions of disruptive technologies:
- Better application architecture via use of events, microservices, serverless computing, microflows, etc. See ref1, point #5.
- Platformisation of enterprise tools for data modelling, UX, visualization, etc. makes it unnecessary to have them in each BPM-suite tool.
- A boom in coordination techniques other than the "flow-chart", such as IFTTT, RPA, event streams, etc.
- Bridging the gap between illustrative and machine-executable models (DMN is an example) also removes some “non-core” load from BPM-suite tools and allows software-defined enterprises. See ref2.
- John Morris
- 2 months ago
From there it's only a short hop to Skynet, but we can still enjoy it while it lasts.
- David Chassels
- 2 months ago
We have standards maintained by the OMG for workflow and rules: BPMN and DMN. What we are missing is a standard for forms. This could be in JSON format. I am building a BPMS at the moment with a form builder that persists form definitions (including field- and form-level validations) in JSON; these definitions are rendered at runtime via websockets (the BPMS is fully web based). The websockets are used for soft real-time communication, using so-called "green threads" (https://joearms.github.io/2013/04/02/Red-and-Green-Callbacks.html) - I'm not using node.js for this. Process info is broadcast by the server to clients in real time, so no F5 is needed to see current info on a task overview, for example. The server push also notifies users of tasks that are near their due date. The environment I'm using is perfectly suited to a distributed BPMS and easy fault tolerance.
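For concreteness, here is a guess at what such a JSON form definition with field-level validations might look like, with a small validator over it. The schema shape is entirely invented for illustration - it is not the commenter's actual format or any proposed standard:

```python
import json

# Hypothetical JSON form definition; the structure is invented here.
FORM_DEF = json.loads("""
{
  "form": "patient_intake",
  "fields": [
    {"name": "name", "type": "text", "required": true},
    {"name": "age",  "type": "number", "min": 0, "max": 130}
  ]
}
""")

def validate(form_def, values):
    """Apply the field-level validations carried in the definition."""
    errors = []
    for f in form_def["fields"]:
        v = values.get(f["name"])
        if f.get("required") and v in (None, ""):
            errors.append(f"{f['name']} is required")
        if f["type"] == "number" and v is not None:
            if not (f.get("min", float("-inf")) <= v <= f.get("max", float("inf"))):
                errors.append(f"{f['name']} out of range")
    return errors
```

Because the definition is plain data, the same document can drive both server-side validation and runtime rendering in the browser.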