
The Three R's of Intelligent Automation



So this is our fifth year at this venue, and we began a tradition five years ago of kicking off with a forward-looking session. It's enjoyable to do, and a great pleasure, because when you look out five years you can make pronouncements that can't be proven wrong, and everybody forgets. So I've enjoyed this quite a bit. What I'd like to do this time, though, is combine that with a look back at the predictions we made, starting with our first session in 2015, and see how they hold up.

So the first notable one, in 2015, was that we had identified that we were really at an inflection point in the market: that BPM as we understood it was clearly undergoing change, and that there was an emerging set of capabilities and a need for new labels. Up to that point the notion of automation was largely anathema. We talked about that being something we needed to get away from, that we needed to focus on agility and adaptability, and that automation was seen as something fairly rigid and not very smart. So I think I coined this phrase at the time; if anybody can disprove that, I would actually like to know. We were looking for a label for this new generation of BPM, and in 2015 we described it as intelligent automation, which has now become the term of art and the contemporary way of describing what we're doing today.

And it's something that is inherently data-driven, goal-oriented, and adaptive. In that same presentation, and this is what has held up over that time, I coined what I called the three R's for BPM and for intelligent automation: that the challenge for BPM, and the new problem set we're looking at, is how we integrate and leverage the interrelationship between rules, relationships, and robots. Now, in 2015 this notion of robots being meaningful participants in our work was something we could all envision but seemed really far away. Yet in that time we've seen that the robots, frankly, are here. We actually showed robots being driven by BPMN in a presentation last year. And the discussion around robots is now really dominating.

 

There's a lot of fear and angst around robots not just from the standpoint of displacing work and workers but this notion that robots are fundamentally scary.

There's a lot of fear and angst around robots, not just from the standpoint of displacing work and workers, but from this notion that robots are fundamentally scary and that their nature is disruptive to our operations and our processes. But the reality is robots aren't really that scary. The fact is robots are inherently driven by rules; robots require rules to be able to work in the organization. And that's what we've seen now with the emergence of intelligent automation as a means for managing robots as part of the digital workforce.

 

We started to use robots and AI interchangeably, and the reality is that today when we say AI we're typically thinking in terms of learning systems, of machine learning and neural nets. When we're talking about robots, we're talking about something that is very prescriptive. We can call RPA deterministic AI, in that it's driven entirely by instruction sets. We don't have RPA making decisions on its own other than those driven by business rules, and we use AI to bridge the gap where we have ambiguity around rules. So from the standpoint of RPA and robots, the R isn't only for robots; the R is also for rules. Rules are inherently unambiguous, and there's no room for nuance or ambiguity with robotic automation.
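To make that distinction concrete, here is a minimal sketch, assuming a hypothetical invoice-handling task with made-up rule thresholds and a stand-in for a trained model (none of it from the talk): the deterministic, rule-driven path is the "robot," and the learned path only steps in where the rules cannot decide unambiguously.

```python
# Minimal sketch: deterministic rules drive the robot; a learned model only
# steps in where the rules leave ambiguity. Names and thresholds are illustrative.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Invoice:
    amount: float
    vendor_verified: Optional[bool]  # None = the rules cannot decide


def rule_based_decision(inv: Invoice) -> Optional[str]:
    """Deterministic 'robot' logic: a prescriptive, unambiguous instruction set."""
    if inv.vendor_verified is True and inv.amount <= 10_000:
        return "auto_approve"
    if inv.vendor_verified is False:
        return "reject"
    return None  # ambiguity -> fall through to the learning system


def learned_decision(inv: Invoice) -> str:
    """Stand-in for a trained classifier that handles the ambiguous cases."""
    # A real implementation would score the invoice with a model;
    # a trivial heuristic stands in so the sketch stays self-contained.
    return "route_to_review" if inv.amount > 5_000 else "auto_approve"


def process(inv: Invoice) -> str:
    decision = rule_based_decision(inv)              # robots require rules
    return decision if decision else learned_decision(inv)  # AI bridges the gap


if __name__ == "__main__":
    print(process(Invoice(amount=2_500, vendor_verified=True)))  # auto_approve
    print(process(Invoice(amount=7_500, vendor_verified=None)))  # route_to_review
```

The point of the shape, not the specifics: the robot never improvises, and the learning component is scoped to exactly the cases the rules declare ambiguous.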

So one of the predictions for this year, and it's not a very controversial prediction necessarily, but one I put out there to think about, and hopefully we come back next year and the year after and look at how it is changing, is that at least half of our work and our interactions, our commercial transactions, our business transactions, and actual production work will be performed by a digital workforce, by robots. So the third R is for relationships. When we started at this point in 2015, BPM and software architecture still had a pretty heavy integration component, but we're starting to see the shift toward providing the intelligence to know how to get data, as opposed to having to replicate and store that data locally.

And increasingly we see this as a driving element for intelligent automation, and likewise intelligent automation as a means for enabling it. When we talk about digital transformation, which is still a generally ambiguous term, this is one of its fundamental aspects: if you think about your customer, we no longer have a single customer data repository. We have the means to identify that customer, but the dimensions of that customer exist on a multitude of systems, many of which are outside of our control. At any given time that we need context about that customer, we don't go to our local store alone. We might go there for guidance and to know where to look, but ultimately we're pulling in data from a variety of sources. And when we're acting on that in the context of processes and automation, we're interacting with systems that are outside of our span of control.
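As a rough illustration of that pattern, here is a minimal sketch, with hypothetical system names, URLs, and fields (not from the talk): the local store acts only as a registry that knows where to look, and the customer's dimensions are pulled from external systems at the time of need.

```python
# Minimal sketch: the local store only says where to look; the customer context
# itself is federated from external systems at request time.
# All system names, URLs, and fields are illustrative assumptions.

from concurrent.futures import ThreadPoolExecutor

# Local "registry": which external systems hold which dimensions of the customer.
CUSTOMER_REGISTRY = {
    "crm":     {"url": "https://crm.example.com/api/customers/{id}"},
    "billing": {"url": "https://billing.example.com/api/accounts/{id}"},
    "support": {"url": "https://support.example.com/api/tickets?customer={id}"},
}


def fetch_dimension(source: str, customer_id: str) -> dict:
    """Stand-in for a call to an external system outside our span of control."""
    url = CUSTOMER_REGISTRY[source]["url"].format(id=customer_id)
    # A real implementation would make an HTTP call here; a canned payload
    # stands in so the sketch stays self-contained and runnable.
    return {"source": source, "url": url, "data": {"customer_id": customer_id}}


def customer_context(customer_id: str) -> dict:
    """Pull the customer's dimensions from every registered system in parallel."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda s: fetch_dimension(s, customer_id),
                           CUSTOMER_REGISTRY)
    return {r["source"]: r for r in results}


if __name__ == "__main__":
    for source, payload in customer_context("12345").items():
        print(source, "->", payload["url"])
```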

 

By 2022... the majority, approaching or exceeding two-thirds, of commercial transactions will occur on systems that are outside of our span of control.

So our next prediction, which again is not entirely controversial, as we're pretty close to this today, is that by 2022 the majority, approaching or exceeding two-thirds, of commercial transactions will occur on systems that are outside of our span of control. Our core means of conducting business will rely not just on a third-party cloud system that we engage. In this case we don't mean cloud as a service so much as the external systems, the external edge points, that are part of our business ecosystem. Our end-to-end processes are engaging with these systems that are outside of our span of control.

 

We did research, which we're about to publish, that we conducted last quarter. One of its focuses was to look at how many systems are required today to represent a customer and to perform core work. We found that the average number of systems required to conduct core operations is just over 13, and that the majority of those are external to operations. The vast majority, 80%, cite 10 or more systems. So we're already at the point where our core business relies on the coordination and orchestration of systems that are outside of our control. And that's a fundamental opportunity and problem statement for intelligent automation.

So the three R's that in 2015 we said would ultimately define the future of BPM are also what's critical to the definition of intelligent automation: the roles of robots and robotic software, of rules, and of relationships are what ultimately define intelligent automation at a high level. At a more detailed level, what does intelligent automation look like? We started putting together a framework that we've evolved a little bit year over year. The first version of it appeared in that initial discussion in 2015, and now it's at a point where I think it's fairly mature. Mature enough that, while we may change some of the labels and some of the orientation, fundamentally this is the framework that's going to define the architecture for intelligent automation going into the next three to five years.

So it means incorporating RPA not as the platform but as a component, a pillar, of that platform. Similarly AI and machine learning, as well as decision management and process management. Basically these four components exist as essentially equal, first-class citizens of that platform, with an event pipeline feeding into it.
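As a rough sketch of that shape, the component names below follow the talk's four pillars, while the interfaces, event types, and routing logic are illustrative assumptions rather than a product design: a single event pipeline feeds RPA, AI/ML, decision management, and process management as peer components of one platform.

```python
# Minimal sketch of the four-pillar platform: RPA, AI/ML, decision management,
# and process management as peer components, fed by one event pipeline.
# Interfaces and routing logic are illustrative assumptions.

from queue import Queue
from typing import Dict, Protocol


class Pillar(Protocol):
    def handle(self, event: dict) -> None: ...


class RPA:
    def handle(self, event: dict) -> None:
        print("RPA bot executes task:", event["payload"])


class MachineLearning:
    def handle(self, event: dict) -> None:
        print("ML model scores event:", event["payload"])


class DecisionManagement:
    def handle(self, event: dict) -> None:
        print("Decision service applies rules to:", event["payload"])


class ProcessManagement:
    def handle(self, event: dict) -> None:
        print("Process engine advances case:", event["payload"])


class EventPipeline:
    """Feeds each event to whichever pillar is registered for its type."""

    def __init__(self) -> None:
        self.queue: Queue = Queue()
        self.routes: Dict[str, Pillar] = {}

    def register(self, event_type: str, pillar: Pillar) -> None:
        self.routes[event_type] = pillar

    def publish(self, event_type: str, payload: str) -> None:
        self.queue.put({"type": event_type, "payload": payload})

    def run(self) -> None:
        while not self.queue.empty():
            event = self.queue.get()
            self.routes[event["type"]].handle(event)


if __name__ == "__main__":
    pipeline = EventPipeline()
    pipeline.register("task", RPA())
    pipeline.register("prediction", MachineLearning())
    pipeline.register("decision", DecisionManagement())
    pipeline.register("case", ProcessManagement())
    pipeline.publish("task", "copy invoice data")
    pipeline.publish("decision", "approve credit limit")
    pipeline.run()
```

The design point the sketch tries to capture is that no single pillar owns the platform: the pipeline treats all four as equal subscribers, which is what "first-class citizens" implies here.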

Nathaniel Palmer
VP and CTO
Website: http://bpm.com
Rated as the #1 Most Influential Thought Leader in Business Process Management (BPM) by independent research, Nathaniel Palmer is recognized as one of the early originators of BPM, and has led the design for some of the industry's largest-scale and most complex projects, involving investments of $200 Million or more. Today he is the Editor-in-Chief of BPM.com, the Executive Director of the Workflow Management Coalition, and VP and CTO of BPM, Inc. Previously he was the BPM Practice Director of SRA International, and prior to that Director, Business Consulting for Perot Systems Corp, and he spent over a decade with Delphi Group serving as VP and CTO. He frequently tops the lists of the most recognized names in his field, and was the first individual named as Laureate in Workflow. Nathaniel has authored or co-authored a dozen books on process innovation and business transformation, including "Intelligent BPM" (2013), "How Knowledge Workers Get Things Done" (2012), "Social BPM" (2011), "Mastering the Unpredictable" (2008), which reached #2 on the Amazon.com Best Sellers List, "Excellence in Practice" (2007), "Encyclopedia of Database Systems" (2007), and "The X-Economy" (2001). He has been featured in numerous media ranging from Fortune to The New York Times to National Public Radio. Nathaniel holds a DISCO Secret Clearance as well as a Position of Trust within the U.S. federal government.