So in your experience, how do most enterprises fail with BPM and analytics?
I typically see three types of mistake: starting the analysis too late, not focusing on the key metrics, and not choosing the right tool.
The biggest mistake that I see organisations and consultants make is not starting the reporting analysis early enough; it is always left till last and therefore gets squeezed. You need to start the analysis up front: who wants it, who needs it, and for what purpose will it be used, operational or strategic?
The next area is not focusing on the vital few metrics that matter, or not having a clear idea of the metrics at all. I always advise starting with a few key reports, learning from experience what you really need, and feeding those requirements into the next release. Rationalise the reporting and focus on the vital few reports, not the trivial many.
The final area I have witnessed, and struggled with myself, is distinguishing between strategic and operational reporting. Sandy mentions some key reporting measures such as work backlog breakdown, SLA status, and resource pool and capacity. These are what I term operational reports; typically they can be produced by the BPM solution, and many vendors provide them out of the box.
However, organisations often specify what I call strategic requirements: reports that typically require augmentation with other data or special manipulation, such as comparison over time. In my view these requirements should be met by an enterprise analytics tool or data warehouse. The BPM solution can be one source for these strategic views, but it is typically not the only source; enterprise tools are usually already on the estate, usually already have the other data feeds, and are engineered from the ground up to manipulate significant amounts of data and perform sophisticated analysis. Both operational and strategic views are important to process improvement; the question is who should produce them, and where. Hence the need to start your analysis early on!
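As a rough sketch of that operational/strategic split, assume case-level events exported from a BPM engine plus a separate finance feed: the operational SLA view needs only the BPM data, while the strategic comparison-over-time view needs the join with the other source. All field names, thresholds, and numbers here are hypothetical.

```python
from collections import defaultdict

# Hypothetical per-case records exported from a BPM engine.
bpm_events = [
    {"case_id": "C1", "quarter": "2024Q1", "cycle_days": 4},
    {"case_id": "C2", "quarter": "2024Q1", "cycle_days": 9},
    {"case_id": "C3", "quarter": "2024Q2", "cycle_days": 3},
    {"case_id": "C4", "quarter": "2024Q2", "cycle_days": 5},
]

# Operational view: SLA breaches, straight from BPM data (threshold assumed).
SLA_DAYS = 7
sla_breaches = [e["case_id"] for e in bpm_events if e["cycle_days"] > SLA_DAYS]

# Strategic view: comparison over time, augmented with data from another
# system (here, a hypothetical revenue-per-quarter finance feed).
finance = {"2024Q1": 120_000, "2024Q2": 150_000}

by_quarter = defaultdict(list)
for e in bpm_events:
    by_quarter[e["quarter"]].append(e["cycle_days"])

strategic = {
    q: {"avg_cycle_days": sum(v) / len(v), "revenue": finance[q]}
    for q, v in by_quarter.items()
}
```

The operational list can live in the BPM tool's out-of-the-box reports; the `strategic` view is the kind of cross-source join an analytics tool or warehouse is built for.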
Given the following requirements of process strength/quality:
- improving productivity and efficiency
- align to strategy and goals
Without analytics, or an intrinsic demand for metrics and reporting that measure process quality, we're left with yet another BPM "application" - meaning the IT department built an application with BPM tools (an automation suite), not process automation/management.
Though building an application out of BPM tools isn't such a bad thing... A good start? Yes and no.
BPM as an application? YES (painful, but a start nonetheless): the IT department is learning the tools and typically doesn't yet fully understand organizational change as a desired measure of BPM success.
And NO: BPM as an application takes an organization toward a less productive conclusion. BPM-as-an-application (without reporting) is both a non sequitur to the BPM argument and takes "process" in the wrong direction.
Good reporting/analytics supports linear analysis (prediction based on defined changes). Without a measure of progress... well, there isn't any. And refactoring a BPM app into a "managed process" only demonstrates the importance of starting any BPM project with clear, change-minded goals.
For those naysayers professing the impossibility of early reporting without process: where does a process start without a good story (agile) and/or narrative? The road goes nowhere without a beginning. A good map and charted course works as a metaphor: analytics/reporting is your starting "here" and your destination "there".
1. WORK OF ANALYSIS: Probably 3/4 of "data scientist" time is spent wrangling data -- i.e. just making sure that data sets are available and semantically matched and that there is enough memory available for analysis etc. etc. (I speak from personal experience, although the tools I had to work with into the 90's were a far cry from the marvellous tools available today.) The work of analysis requires specialized skills, thus the pay rates of "data scientists". If you're not prepared to pay for the work, you won't have good analysis. But it will be hard to tell that it's wrong.
2. DOMAIN UNDERSTANDING AND MODELLING: Usually captured under the headline of "KPIs" (a start but really insufficient), the whole value of data analysis is dependent on the domain knowledge you bring to the activity. This domain knowledge is usually instantiated as a "model". There are lots of examples of laughable correlations that show how data "doesn't analyze itself". But I don't see organizations generally stepping up their modeling and simulation activities, and capturing their business semantics, which is where the true value of big data and analysis is revealed.
3. PROCESS MINING: TU/e and Prof. Wil van der Aalst (yes, *the* Prof. van der Aalst, of process research fame) is starting a "process mining" course today (Nov 11, 2014 -- an interesting opportunity for any reader; there are already thousands of attendees signed up via Coursera!). And process mining gets at the exact space between process execution and process modeling. Process instances potentially throw off so much data that you need systematic ways of extracting value from the activities.
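That space between execution and modeling can be illustrated with the most basic process-mining step: building a directly-follows graph from an event log, the raw material that discovery algorithms work from. A minimal sketch, with an invented three-trace log:

```python
from collections import Counter

# Invented event log: each trace is the ordered list of activities
# executed for one process instance.
event_log = [
    ["register", "check", "approve", "archive"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "archive"],
]

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    df = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

dfg = directly_follows(event_log)
```

From these counts a miner can reconstruct the observed control flow and compare it against the designed model, which is exactly the reality check process mining offers.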
From the sales perspective that is my world, I ask the following question: how can an analysis function and/or analysis project be justified as part of a full-on BPM program? Because anything less than full commitment will likely result in spurious correlations and information overload. There are so many "anti-patterns" where the mashup of BPM and analysis is concerned.
And my answer is "there needs to be a business case" for the analysis. A "use case" isn't enough. If analysis is just considered a "feature of a BPM project", that's a use case. In such circumstances, analysis will be underfunded, and usually is. And analysis is an "infrastructure business case", i.e. one requiring the most senior level of commitment.
Why bother making an analysis commitment as part of a BPM program?
A good question, but still a ludicrous one. Without analysis, you will manage your BPM program "blind". You'll still have data and reports, of course, because someone will knock off a report or "pull some data". But it won't be systematic. It will be like managing your cash without a proper accounting system: it can be done, but not well. BPM is *the* technology of work. You need to manage your work, and that's done when you can see and examine your work. Put like this, it's amazing that any BPM program is deployed without full analytics support.
I think it must be process-centric analytics, not BPM and analytics. At present, the situation looks like:
- Analytics, loudly: “Wow, BPM you have a lot of big data, I can visualise them in many different ways!”
- BPM, internally: “Maybe we should ask the business process owner how to help him/her”?
So, what is the main concern of many process owners? Sandy mentioned it briefly “there’s also value in doing an in situ simulation to predict the impacts of these actions on the SLAs or costs.” This is a tool for proactive (predictive) management (like a GPS navigator) for many business process owners. Example - [url=http://improving-bpm-systems.blogspot.ch/2010/03/linkedin-how-do-we-measure-work-flow.html]http://improving-bpm-systems.blogspot.ch/2010/03/linkedin-how-do-we-measure-work-flow.html[/url]
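The "GPS navigator" idea can be sketched in a few lines: predict whether the current work backlog will clear, and how fast, at different staffing levels, so the process owner can act before the SLA is breached. All numbers and the linear-throughput assumption are hypothetical.

```python
# Minimal predictive sketch for a process owner: given a backlog and a
# staffing level, when does the backlog reach zero? (Numbers invented.)

def days_to_clear(backlog, arrivals_per_day, done_per_person_per_day, staff):
    """Days until the backlog reaches zero, or None if it keeps growing."""
    net_per_day = staff * done_per_person_per_day - arrivals_per_day
    if net_per_day <= 0:
        return None  # backlog grows: the SLA will be missed outright
    return backlog / net_per_day

# Current state: 120 open cases, 10 new arrivals/day, 4 cases/person/day.
etas = {staff: days_to_clear(120, 10, 4, staff) for staff in (2, 3, 4, 5)}
```

A real in situ simulation would of course use the live process state and stochastic arrival/service times, but even this toy version is forward-looking in a way that a rear-view dashboard is not.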
My only observation would be that next-generation digital applications will be driven by "BPM" principles and will create real-time "analytics". So the future should deliver good information, and failures will be management ones, not the disjointed software challenges that seem to exist at the moment.
So when the computer came along everything was designed to appeal to them. Rather selfish, really.
BPM, like Six Sigma and Lean before it, was about the management team, with the help of a consultant, designing a process to impose on the workforce.
Analytics created spreadsheets and graphs for the same management team to feel they were in control.
But everyone missed the real opportunity computers created. To generate live data and feed it back into the system so that the process learned, adapted and improved continually.
This is the principle behind machine learning, which is transforming many fields. We aren’t there yet with BPM, but we can create a half-way house which is much better than BPM plus Analytics.
In a Dynamic Social Process, transaction data is fed back to the people involved in the process. They can see…
How long each part of the process takes
How long in handover to the next stage
How many transactions follow an exception path
How much resource is used
Fluctuations and how these affect efficiency
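The first three measures in that list can be computed directly from transaction data. A minimal sketch, with invented records of the form (transaction, step, start, end, exception flag):

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Invented transaction records: (txn, step, start, end, exception flag).
rows = [
    ("T1", "intake", "2024-05-01 09:00", "2024-05-01 09:30", False),
    ("T1", "review", "2024-05-01 11:00", "2024-05-01 11:45", False),
    ("T2", "intake", "2024-05-01 09:10", "2024-05-01 09:40", False),
    ("T2", "review", "2024-05-01 09:50", "2024-05-01 10:50", True),
]

def minutes(a, b):
    return (datetime.strptime(b, FMT) - datetime.strptime(a, FMT)).total_seconds() / 60

# How long each part of the process takes (average per step).
step_minutes = {}
for _, step, start, end, _ in rows:
    step_minutes.setdefault(step, []).append(minutes(start, end))
avg_step = {s: sum(v) / len(v) for s, v in step_minutes.items()}

# How long each transaction sits in handover between stages.
handover = {}
for txn in {r[0] for r in rows}:
    steps = sorted((r for r in rows if r[0] == txn), key=lambda r: r[2])
    handover[txn] = sum(minutes(prev[3], nxt[2]) for prev, nxt in zip(steps, steps[1:]))

# Share of transactions following an exception path.
exception_rate = len({r[0] for r in rows if r[4]}) / len({r[0] for r in rows})
```

Fed back to the people doing the work rather than locked in a management report, numbers like these are what make the crowdsourced improvement loop possible.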
As a visual management tool it is effective. But it gets better.
If people are then encouraged to crowdsource ideas for improvement in a Lean Startup re-iteration approach, everyone becomes engaged in making the numbers better. They can take the biggest problem and reduce it each week, or they can focus on a particular area of concern – perhaps returns, or latency. That involves staff, making them a valuable part of the change process and it ensures a wider pool of customer feedback, change possibilities and it creates teamwork which breaks down silos and opens new possibilities.
It leads to a rapid increase in the efficiency AND effectiveness of the process and a dramatic rise in both user and customer satisfaction.
But the question was how do they fail?
By making managers a point of failure: they get the intelligence but don't have their hands on the levers to take action.
By failing to get management out of the way and engage everyone in interpreting the data to give a 3D view.
By defining analytics as pretty graphs that help managers cover their butts.
The main problem is that the analytics mostly deal only with process data delivered by the BPM system, and that is the least important detail. The important data are the business data and, moreover, the customer-satisfaction data, which in most cases live somewhere else and can't be connected. If the right data were collected and then properly processed, it would quickly become clear how bad BPM is for customer service and how little money it actually saves, if any. The long-term damage could only be seen in the company's financial statements, which no one wants to connect in this way. The problem is that the benefits of process management done right are also not short-term gains. Saving elapsed time in the process statistics could, for example, mean that the time to service a customer was severely cut, and with it the service quality.
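The missing connection is mechanically simple once the data can be joined: link per-case process metrics from the BPM system to satisfaction scores from wherever they live, and check the relationship. A minimal sketch with invented data and identifiers; here the correlation between cycle time and satisfaction comes out positive, i.e. the fastest cases score worst, which is exactly the elapsed-time-versus-quality trade-off described above.

```python
# Invented per-case data: cycle time from the BPM system, satisfaction
# score (1-5) from a separate survey tool, joined on the case ID.
cycle_time_days = {"C1": 2, "C2": 1, "C3": 6, "C4": 1}
satisfaction = {"C1": 4, "C2": 2, "C3": 5, "C4": 1}

joined = [(cycle_time_days[c], satisfaction[c]) for c in cycle_time_days]

# Pearson correlation, computed by hand to stay dependency-free.
n = len(joined)
mx = sum(x for x, _ in joined) / n
my = sum(y for _, y in joined) / n
cov = sum((x - mx) * (y - my) for x, y in joined)
sx = sum((x - mx) ** 2 for x, _ in joined) ** 0.5
sy = sum((y - my) ** 2 for _, y in joined) ** 0.5
r = cov / (sx * sy)
# In this made-up data r is strongly positive: slower cases are the
# happier ones, so "saving elapsed time" is not automatically a win.
```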
One can't manage a business with process data ...
In case I was not very clear by now :), you all know I judge business process from a very broad perspective: any logical (not necessarily chrono-logical!) sequence of actions undertaken to achieve a business goal is a business process. So to me prescriptive vs descriptive are just facets of the same concept, but I know most people still think BPM is about prescribing optimal process flows until that nice little End event kills that last process token and everything goes to nirvana (which is also when the general ledger balances even out, right?).
Even now as I am watching the Stanford CS class called "How to Start a Startup" all those smart, famous and rich people explain how they do not believe so much in process but then they put out there a lot of tips, tricks and advice that, to me, is just nuggets of business process. "Oh, I am not a process guy/girl, so instead of doing process, I do a weekly meeting and split the dev log into focused single tasks and do a 1-on-1 every 2 weeks and go out there and take customer calls and feed this back to engineering..." So, which of these is not actually part of a business process, really?
But enough of my prescriptive/descriptive dichotomy irrelevancy rant.
The big mistake companies make when mixing process and analytics (when they do it at all, which is maybe 0.1% of cases, probably a bit more in case management environments) is that they don't do it in an integrated manner. So it is really just a tasteless mix: no blending, no common reinforcing mechanism. Just process and data; maybe the stew tastes good, right? Wrong.
Look, it's not actually their fault. Let's not look further than our own beloved BPMN 2.0, which still treats data as a frigging artifact (that is, of equal importance to annotations and groups)! And, according to the spec, that data usually doesn't survive the process instance, because that's exactly what happens in real life (/sarcasm). So BPMS tools mostly ignore the data half of the equation, because the spec they comply with is shortsighted and incomplete. The same could be said of analytics tools, which gloriously ignore the underlying business processes, which makes data analysis a bitch. There's only so much you can understand from your data if it's not clear how it got generated by the business behaviors. Right now, as John already said, the data-scientist job is mainly about ensuring data integrity and perhaps a bit of basic statistical/econometric analysis, not real business analysis.
Process mining is important not only for process discovery, but also for reality checks. Real time process mining shows you what stupid decisions you took at design time in your BPMS. But I see no real barrier to linking process data and logs to actual business data to make the most of the decision-making opportunities. The first person to nail this will be a very rich person.