As it has been a while since we've discussed the IoT on the forum, how big of an impact do you see the Internet of Things having on BPM?
Big impact - we already see IoT at work, with external devices frequently feeding data into run-time Case environments that host BPM process template instances.
These instances can include in-line compliance control steps that consult the imported IoT-sourced data and block/unblock processing at the individual instance level.
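As a minimal sketch of what such an in-line compliance gate might look like (all names here are hypothetical, not from any particular BPM product):

```python
# Minimal sketch of a compliance control step that consults imported
# IoT-sourced readings and blocks/unblocks one process instance.
# The data shapes and field names are illustrative assumptions.

def compliance_gate(instance, readings, max_temp=8.0):
    """Block the instance if any imported sensor reading is out of range."""
    violations = [r for r in readings if r["temp_c"] > max_temp]
    if violations:
        instance["state"] = "blocked"
        instance["reason"] = f"{len(violations)} out-of-range reading(s)"
    else:
        instance["state"] = "active"
        instance["reason"] = None
    return instance

# Example: a cold-chain shipment instance with one bad reading
instance = {"id": "case-42", "state": "active", "reason": None}
readings = [{"temp_c": 4.5}, {"temp_c": 9.1}]
compliance_gate(instance, readings)
# instance["state"] is now "blocked"
```

The point is that the gate acts per instance: other instances of the same template keep running while this one waits.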
By way of a simple example I'd like to show an answer...
Imagine a vending machine. Currently these devices are checked and maintained by a vending machine technician, while refilling is done by yet another role. Products that sell out are refilled on a regular basis, but not proactively. The same goes for breakdowns: only when something is broken, and after someone calls a repair phone number, is a technician scheduled on site...
Enter IoT... What if the machine is able to:
measure trends (which products are popular, and at what times) and proactively order new stock;
call for repair proactively;
remember user favorites.
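The proactive-restock idea can be sketched in a few lines (purely hypothetical numbers and names, just to make the trend-to-order step concrete):

```python
# Hypothetical sketch: a vending machine that reorders stock proactively
# from observed sales rates instead of waiting until a product sells out.

def reorder_list(stock, sales_per_day, lead_time_days=3, capacity=20):
    """Return products whose stock will run out before a refill can arrive."""
    orders = {}
    for product, qty in stock.items():
        rate = sales_per_day.get(product, 0)
        days_left = qty / rate if rate else float("inf")
        if days_left <= lead_time_days:
            orders[product] = capacity - qty  # top up to full capacity
    return orders

stock = {"cola": 4, "water": 15, "chips": 2}
sales_per_day = {"cola": 3, "water": 1, "chips": 0.5}
orders = reorder_list(stock, sales_per_day)
# cola: ~1.3 days of stock left, inside the 3-day lead time, so reorder 16
```

The order itself would then kick off a BPM process instance (pick, deliver, refill) instead of waiting for a human to notice the empty slot.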
The drill: a small example, but it definitely impacts the process. Big time. So... IoT does impact process. Combine this with current blockchain, AI and IAM developments and we have a whole bunch of BPM-related work ahead of us.
How many times have I seen the phrase "impact on process"... More importantly, and in the context of process: I strongly believe we need to get rid of traditional company thinking and move to community and network thinking. Currently we face a growing gap between old thinking (and the regulation derived from it, to name just one thing) and these exciting new developments.
IoT programmes may
[u]drive more demand[/u] for BPM technology and BPM management methodologies.
[b]Why BPM? [/b]
Because BPM is the best solution to the most challenging IoT problems,
[u]specifically complexity and business semantics[/u].
There is no IoT business value chain if you
[u]drown your end-users in data[/u] (e.g. via "alarm fatigue") or if you fail to focus on the fundamental purpose of IoT ("driving decisions").
Both example anti-patterns are really about understanding your business and engineering domain, and encoding that understanding in process and rules. Thus BPM (and business rules and analytics) as
[u]the technology where business ideas are first-class citizens of that technology[/u] and can be directly deployed and evolved, without the expensive mediation of code.
[b]Why not BPM?[/b]
Because demand for a BPM solution is not assured. The challenges of IoT business semantics are revealed in IoT programme governance and economics -- paying for the intangibles of business analysis and engineering are hard sales problems. BPM can enable IoT programme success -- if you are willing to step up and budget for the hard work of analysis.
From our previous discussions:
[url="http://bpm.com/bpm-today/in-the-forum/will-more-than-half-of-new-major-business-processes-incorporate-some-element-of-the-internet-of-things-by-2020#reply-3327"]Data and devices are cheap. But business analysis is expensive[/url]
[url="http://bpm.com/bpm-today/in-the-forum/what-new-technology-will-have-the-biggest-impact-on-processes-in-the-near-future#reply-2689"]BPM is the technology of business analysts solving the problem of IoT semantics[/url]
The IoT is incorrectly viewed in terms of smart devices, such as thermostats, refrigerators, or wearables. However, the real power of the IoT is in the data captured by smart devices, and the services and processes that can be triggered from this information. Data has always been the blood in the veins of BPM solutions, used to trigger processes, recommend next-best actions and guide employees. So it would seem then that BPM applications are perfectly suited to orchestrate the flow of information and actions between the smart devices, the consumer and businesses.
However, many of the BPM applications we have today are not nimble enough to capitalize on this opportunity. I expect IoT opportunities and services to be delivered by low-code platforms and point process solutions. For the most part, IoT services will not be characterised by large-scale professional services contracts. Instead, IoT services will be provisioned rapidly and on demand by the end users.
What impact will the IoT have on BPM? It'll transform the process automation market completely, with many more new market entrants, and lead us to reconsider what we think are the core components of a BPMS.
I believe IoT amplifies the use cases for BPM solutions. Simply put, it can connect insight to action. Whether the device/agent requires an action to be fulfilled manually or automatically, there is no better way to delegate these needs than to a process. Alternatively, the devices can be less intelligent and send periodic signals with information (or a telling lack of signals), so that a central entity analyzes this stream of information and decides to kick off a process. These are things that, technically speaking, were complex to implement in the past.
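A small sketch of that central entity, including the "lack of signals" case (device names, thresholds and process names are made up for illustration, not a real BPM API):

```python
# Sketch of a central listener that turns device signals (or their
# absence) into process kick-offs. All names are illustrative.

HEARTBEAT_TIMEOUT = 60  # seconds of silence before we treat it as an event

def decide_actions(signals, now, last_seen):
    """signals: list of (device_id, timestamp, value). Returns processes to start."""
    actions = []
    for device_id, ts, value in signals:
        last_seen[device_id] = ts
        if value > 100:                      # threshold breach -> kick off a process
            actions.append(("start_inspection_process", device_id))
    for device_id, ts in last_seen.items():  # silence also carries information
        if now - ts > HEARTBEAT_TIMEOUT:
            actions.append(("start_repair_process", device_id))
    return actions

last_seen = {"pump-1": 0, "pump-2": 90}
actions = decide_actions([("pump-2", 95, 120)], now=100, last_seen=last_seen)
# pump-2 breached the threshold; pump-1 has been silent too long
```

Each returned action is where a BPM engine would instantiate a process template, which is exactly the insight-to-action hand-off described above.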
As mentioned before, it broadens the applicability of use cases to a new level!
Hate to poo-poo some of the Kool-Aid here (no, not really), but it's another source of data that will aid the onslaught of too much data, not enough insight for anything substantively actionable. Will have about as much overall oomph as blockchain.
I'm good sitting back, waiting and watching on this one.
In our experience the biggest challenge for BPM in IoT is how to go from a classical "request-driven" approach to an "event-driven" solution. Instead of creating "request for spares/leave/stock/loan . . .", BPM for IoT needs to be triggered by key events from real-time data that could come from sensors, machines and services. Most BPM applications do not have an event-processing front-end capability that connects these real-time data sources to the processes and workflows. These event-processing front-ends are often referred to as "Operational Intelligence Platforms", and they are the missing link between IoT and BPM.
Predicting which wells are declining in production on a large oilfield and taking action to rectify it is a typical example of such an event-driven approach. In this example the operational intelligence platform "listens" to near real-time data from the time-series or historian database. Some additional data is "listened" for directly from industrial control system sensors, while real-time web services provide additional data for the predictive model that is embedded in the visual, model-driven, operational intelligence data stream. Once a well is predicted to be declining, a work order is raised in the EAM system while a failure mode analysis is performed in a classical BPM process. This example can be viewed
[u][url="http://xmpro.com/xm/8mindemo/"]here[/url][/u] in a short video.
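The decline-detection step itself can be illustrated with a simple moving-average comparison (the numbers and tolerance here are invented, not from the actual predictive model in the example above):

```python
# Illustrative sketch of decline detection: compare a short recent
# average of well production against a longer baseline and flag the
# well when it drops below a tolerance. Hypothetical parameters.

def declining(rates, recent=3, baseline=10, tolerance=0.85):
    """rates: chronological daily production figures for one well."""
    if len(rates) < baseline:
        return False                      # not enough history to judge
    recent_avg = sum(rates[-recent:]) / recent
    baseline_avg = sum(rates[-baseline:]) / baseline
    return recent_avg < tolerance * baseline_avg

steady = [100] * 10
falling = [100] * 7 + [80, 75, 70]
declining(steady)   # False
declining(falling)  # True: recent avg 75 vs baseline 92.5
```

In the event-driven pattern, a True result is the "key event" that raises the EAM work order and starts the BPM failure-mode-analysis process.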
The linkage of preference seems to be "push" to a data exchanger at the edge of the application that is hosting BPM process template instances.
Devices clearly need to be able to export data they are authorized to publish in some reasonable format, and there will typically be limitations regarding the ability to format exported data.
Accordingly, the data exchanger might as well accommodate format specifications per trading partner (outbound formatting/inbound parsing).
Insisting on a particular set of data element names does little to facilitate data exchange, so part of the setup for formatters/parsers should include a cross-walk from individual publisher data element names to subscriber data element names.
A side benefit is that the absence of a cross-walk means a subscriber will not receive specific data elements so one push to the data exchanger is sufficient for data sharing.
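A sketch of the cross-walk idea (element and subscriber names are made up for illustration): each subscriber registers a mapping from the publisher's element names to its own, and elements with no mapping are simply never delivered.

```python
# Sketch of per-subscriber cross-walks at a generic data exchanger.
# One posting by the publisher serves every subscriber differently;
# absent cross-walk entries mean the element is withheld.

CROSSWALKS = {
    "clinic-07": {"pt_dob": "date_of_birth", "pt_sex": "gender"},
    "clinic-12": {"pt_dob": "birth_date"},  # clinic-12 never receives pt_sex
}

def deliver(posting, subscriber):
    """Rename and filter one posting according to the subscriber's cross-walk."""
    crosswalk = CROSSWALKS.get(subscriber, {})
    return {crosswalk[k]: v for k, v in posting.items() if k in crosswalk}

posting = {"pt_dob": "1970-01-01", "pt_sex": "F", "pt_notes": "..."}
deliver(posting, "clinic-12")
# -> {"birth_date": "1970-01-01"}  (no gender, no notes: no cross-walk entry)
```

This is the "one push is sufficient" property: the publisher posts once, and filtering plus renaming happen per subscriber at the exchanger.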
The timings for postings can be at the discretion of a publisher.
As for pickup (there may be multiple receivers per posting), the frequency of pickup / import can be at the discretion of the subscriber - all the data exchanger needs is a record pointer in the append-only data exchanger main table. If evidence of a read at a data exchanger is not sufficient, more elaborate pointer moving approaches require acknowledge and answerback.
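The record-pointer mechanics can be sketched as follows (illustrative data structures, not a real exchanger product):

```python
# Sketch of subscriber pickup against an append-only exchanger table:
# each subscriber keeps only a record pointer, and pickup frequency is
# entirely at the subscriber's discretion.

exchanger = []                 # append-only main table
pointers = {"sub-A": 0}        # next unread record, per subscriber

def post(record):
    """Publisher side: append, never update or delete."""
    exchanger.append(record)

def pickup(subscriber):
    """Return everything since the subscriber's pointer, then advance it."""
    start = pointers.get(subscriber, 0)
    batch = exchanger[start:]
    pointers[subscriber] = len(exchanger)  # the advanced pointer is the read evidence
    return batch

post({"seq": 1}); post({"seq": 2})
first = pickup("sub-A")    # both records
post({"seq": 3})
second = pickup("sub-A")   # only the new one
```

If pointer advancement alone is not acceptable evidence of receipt, this is the place where an acknowledge/answerback exchange would replace the simple assignment.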
BPM run-time environments where process steps have pre-conditions and post-conditions (i.e. Design by Contract) have few issues with "bad" inbound data. All first steps of any process should carry out pre-processing. Steps within processes also need to be able to accommodate pre/post-processing.
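A minimal sketch of such a contracted step, assuming hypothetical field names: bad inbound IoT data is rejected at the gate rather than propagating downstream.

```python
# Minimal Design-by-Contract sketch for one process step: the
# pre-condition screens inbound data, the post-condition guarantees
# what the next step may rely on. Field names are hypothetical.

def step_record_reading(payload):
    # Pre-condition: validate inbound data before the step's real work
    assert "device_id" in payload and "value" in payload, "missing fields"
    assert -50 <= payload["value"] <= 150, "reading out of plausible range"

    result = {"device": payload["device_id"], "stored": payload["value"]}

    # Post-condition: the contract the following step depends on
    assert result["stored"] == payload["value"]
    return result

step_record_reading({"device_id": "t-9", "value": 21.5})    # passes both gates
# step_record_reading({"device_id": "t-9", "value": 999}) would raise AssertionError
```

In a real engine the failed pre-condition would route the instance to an exception path rather than raise, but the contract idea is the same.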
The inverse is just as simple. Webhooks are very useful for rapid integration as well as event-driven flows.
In addition, we use standard WebSockets to keep a real-time connection open to millions of listening clients - just like chat clients do (like on Facebook, which handles millions of chats simultaneously) - akin to socket.io.
The problem comes when you have a manufacturer like Lockheed with thousands of suppliers/subs submitting data that needs to be referenced in a Lockheed BPMS.
Surely data pickup, transport and import are best handled by a generic data exchanger at the edge of the consolidating app in a closed community?
This would remove any need for alignment of BPMSs.
See my comment above re the practicality of generic data exchangers.
In healthcare, we built, several years ago, an e-hub for 100 clinics where each needed to download/import/permanently store transactions generated by some sub-set of the other 99 clinics.
We used a "standard" format for data export/transport/import and the result was not unexpected - i.e. some 11,000 data elements per consolidating transaction, where 90% of the data elements were blank for any single transaction. Huge overhead!
The pilot run resulted in 124,000,000 database records in the audit table over a very short period of time.
My idea of the way to effect data pickup is for each subscriber to go cap in hand to a publisher and indicate what they truly need, and what data element names they need.
This allows the publisher to post once to a generic data exchanger, the subscribers only get to see what they "ordered" and we leave data transport up to the subscriber (i.e. pick a standard format, or write a formatter so that the subscriber can easily effect imports to the target system).
All stakeholders (publishers and subscribers) have to write parsers for any invented formats so that their data can be posted to the data exchanger.
At present, BPM works primarily within companies and very stable B2B (supply-chain?) relationships, although some big companies use a few BPM-suite tools (sometimes a different tool in each business unit).
With the need for inter-company, explicit and machine-executable processes, BPM as a discipline must offer standard capabilities (think of http://acid3.acidtests.org/), so BPM-suite tools will compete on compliance and performance (like modern Web browsers).
- Dr Alexander Samarin
Big on two counts. First, the fact that machines as IoT create data which contributes to whatever the required business outcome is. The BPM discipline will therefore include this to join up the end-to-end process with all required attributes, such as audit trail, real-time feedback etc., from the supporting software.
The second is that when the "gadget" is created there must also be understanding, control and reporting on the build process, ensuring testing accuracy, "compliance" etc. An example of this going wrong was the Volkswagen emissions measurements. All this is going to be a bit of a culture shock for many, but get it right and all creators of information contribute to an empowered organisation.....needing fewer managers...!
Some thoughts that struck me in this area (at the end, with some pre-amble at the beginning):
The future of BPM is enabling anyone to build a process by themselves - without IT help, in 5 minutes - and also track it all the way through to completion, no matter how many people are involved in approvals or steps. This is in sharp contrast to current approaches, where implementation needs an IT army.
Despite sunk enterprise investments in documenting processes, there does not appear to be a universal and simple way to actually track them - on any device. Hence, the opportunity to leverage past investment lies in making the execution of any given process easy.
In our studies from other customers, only 20% of the most common processes are “worth documenting” or automating since doing that needs IT and process mapping skills - leaving the other 80% subject to tribal knowledge and/or hearsay. This is clearly a huge opportunity to build process IP within any size of business in this undocumented vacuum - reducing the impact of employee churn and training - and growing competitive advantage.
Unless a process is tracked, it cannot be measured - which also means it cannot be managed or improved. Enterprises have spent hundreds of millions of dollars on change management and process improvement, including initiatives like Lean Six Sigma. The future of BPM involves potentially eliminating the cost of both - since changes immediately go "live" to everyone, and you get actual (not extrapolated) metrics from people doing steps into your existing BI/analytics tools for executive dashboards.
In the future - if you want to integrate sensor/IoT data on top of people’s activity data, that’s not difficult to do. It will probably be part of large initiatives like digital transformation (cited)
[b]We are firmly of the opinion that IoT/sensor data cannot possibly be useful without the human activity information that has to be superimposed on top of it for root cause analysis and intervention.[/b]
I was actually looking for the number of steps in one of your "typical processes".
If the number is around 10-20 steps, and the users don't get into arguments re what the steps are and how they should be sequenced, then 5 min sounds like a good number.
No way, however, could they hope to build forms at each of the 10-20 steps in less than half an hour to an hour, when the average number of fields per form is 50 and there are broadcast links across forms for some of the fields (which explains why we initially use form images, even images of hand-drawn forms, if no legacy or current digital forms are available).
Some of our customers have 100-200 forms. Most have 50 controls, but others can run 2-3 pages and have 150 fields per form, with rules that carry out calculations on recorded data.
Rule sets at decision boxes rarely get to the complexity of, say, an ROI, but a complex rule set at a decision box can take a couple of hours, whereas a simple rule set is likely to take 10 minutes in our environment.
If you organize rule sets by functionality you can sometimes clone one, give it a new name and pretty much use it as is, and that would take about 5 minutes per rule set.
When the environment in use features drag/drop, things move forward smartly.
"Without IT help" means precisely that - Zapier users are not tech-savvy, and they integrate apps easily without help. We have a native integration to Zapier.
With form fields on a step, it's just seconds - pick a field, name it, save it.
Were those your questions?
And, would you qualify "without IT help" by saying this applies to very simple processes with no automated decision-box branching or auto-exec process control steps, both of which would require rule sets?
Most of the processes my clients try to build when we are facilitating them in GoToMeeting sessions have 10-20 steps (we try to complete these sessions in 1 hour so as not to start to lose the audience).
For this to work within the time limit, they need to park images of the forms they will be attaching to steps so that as steps are dragged/dropped on the canvas, form images can be quickly attached. Each form has an extra memo field to pick up comments.
Next, we compile the graph, roll it out and have the audience piano play a template instance, the purpose being to see if they agree with the logic, if they agree with the selection of forms at the steps. Any notes go to the extra memo field.
We strive to get to where an instance has been run at least once, notes recorded and important changes made to the original graph.
So, our metric is 10-20 steps in one hour.
The simple answer is yes, and in some domains the impact will be of a high order. The problem lies in handling the vast amounts of data that are generated and received. Real-time predictive analytics uses techniques for processing the data as it is received and chunking it down to useful information. From IBM: "...Internet of Things, sensors can collect and analyze volumes of weather data collected at the edge for faster forecasts", [url="http://phys.org/news/2016-08-ibm-scientists-imitate-functionality-neurons.html#jCp"]http://phys.org/news/2016-08-ibm-scientists-imitate-functionality-neurons.html#jCp[/url]. We have some way to go yet before we fully integrate the Internet of Things into BPM.
The only immediate impact of IoT on BPM is that systems will start playing out as more competent/present actors in workflows.
It's not like solutions that couple data events with actions never existed until IoT... it's just that they will happen faster and closer to the point where event is triggered due to sensors.
Also, I don't equate 100x more data to a technological revolution. As always, cognitive impairment is still the limiting factor.
I can see several viewpoints expressed:
Solution viewpoint, i.e. from a stand-alone Thing to an individually smart Thing empowered by SMAC.
Event capture viewpoint, i.e. the OODA pattern.
Thing-as-a-service viewpoint, i.e. Thing is an ordinary participant in corporate processes.
“Just 100x more of something” viewpoint – But, do the 100 trees form a forest? All depends, as usual.
If we want Things to cooperate dynamically then we need to add a business viewpoint as well. See ref1.