BPM.com
Peter Schooff
BPM Discussions
Tuesday, March 07 2017, 09:50 AM
With processes integrating with other departments and other technologies, would you say integration is as important as efficiency for today's processes?
Eyal Katz Accepted Answer
Efficiency is more of a goal. For example, your goal can be to better implement software integration and increase efficiency. However, the integration itself cannot be a goal as much as it is a tool used to achieve a particular goal (like improving efficiency).

Or maybe I misread the Q...
# 1
Both are important.
If you don't put a proper focus on efficiency, why take pains to have processes provide orchestration?
If you see your BPMs as an island, with no way to import/export data, you are working in "never-never" land.
The primary role of processes is to provide orchestration (and some governance) toward meeting Case objectives.
# 2
Emiel Kelly Accepted Answer
Weird question. Efficiency is a thing that you try to achieve. Integration is a thing you do.

Having said that, I think Integration has effectiveness as a goal. (getting the process done by useful connections between things and actors)

And you know, effectiveness comes before efficiency in the dictionary.
Sharing my adventures in Process World via Procesje.nl
# 3
Michal Rykiert Accepted Answer
In my opinion, integration is a way to increase the efficiency of a process. I see it in three major dimensions:
- getting data from external systems - e.g. for auto-population of fields and taking advantage of existing information
- inserting data into external systems - to eliminate data re-entry
- unified interface for all business processes - employees don't need to care where data comes from and where it lands eventually (e.g. in an ERP system), as long as the goal of the process is achieved. Plus, if there's just one interface, it's much easier to get new employees on board.
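Those three dimensions are easy to picture in code. Below is a minimal sketch; the names (ExternalSystem, FormService, the fetch/push methods) are hypothetical illustrations, not any particular product's API.

```python
# Illustrative only: a single "unified" form service that hides where data
# comes from and where it ends up. "source" and "target" stand in for whatever
# ERP/CRM connectors a real BPM suite would provide.
from dataclasses import dataclass, field
from typing import Protocol


class ExternalSystem(Protocol):
    def fetch(self, key: str) -> dict: ...
    def push(self, key: str, payload: dict) -> None: ...


@dataclass
class FormService:
    source: ExternalSystem          # e.g. ERP - used to auto-populate fields
    target: ExternalSystem          # e.g. CRM - receives the result, no re-entry
    form_data: dict = field(default_factory=dict)

    def open_form(self, customer_id: str) -> dict:
        # Dimension 1: pull existing data so the employee never re-types it
        self.form_data = self.source.fetch(customer_id)
        return self.form_data

    def submit_form(self, customer_id: str, edits: dict) -> None:
        # Dimension 2: push the merged result into the downstream system
        self.form_data.update(edits)
        self.target.push(customer_id, self.form_data)
        # Dimension 3: the employee only ever interacted with this one interface
```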
@Michal ... Good points re integration - as we start seeing more IoT devices, provided these can export their data in some reasonable format, data transport mechanisms (preferably push instead of pull) can bring data to generic data exchangers for pickup by subscribers.

The practical approach is to allow publishers/subscribers alike, each export and import data using their own native data element naming conventions. The impractical approach is to have all application systems align their exports to some "standard" set of data element names.

Few seem to be paying attention to the fact that in a data set comprising 10,000 data values, any one subscriber may only be interested in, say, 1,500 of these. In industry application areas (e.g. healthcare, where inadvertent disclosure attracts heavy fines) it seems prudent to share information on a strict need-to-know basis only.

A generic data exchanger can handle both data mapping and data access.
  1. Karl Walter Keirstead
  2. 4 weeks ago
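To make the push-rather-than-pull idea concrete, here is a tiny illustrative sketch; the exchanger URL and field names are invented for the example.

```python
# Purely illustrative: a device pushes readings, in its own native field names,
# to a hypothetical "data exchanger" endpoint; subscribers pick them up later.
import json
import time
import urllib.request

EXCHANGER_URL = "https://exchanger.example.com/publish"  # hypothetical endpoint


def push_reading(device_id: str, temp_c: float) -> None:
    # The device exports using its OWN element names ("tmp_celsius" here);
    # mapping to each subscriber's names is the exchanger's problem, not the device's.
    payload = {"device": device_id, "tmp_celsius": temp_c, "ts": time.time()}
    req = urllib.request.Request(
        EXCHANGER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # push, rather than waiting to be polled
```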
# 4
Garth Knudson Accepted Answer
Blog Writer
Good answers above.

Integration leads to greater efficiency and effectiveness. With BPM moving toward digital transformation, integration is a key component to creating a more streamlined and comprehensive customer experience. Integration creates greater awareness and engagement.
# 5
John Reynolds Accepted Answer
Integration-ability is the key metric of process-agility.
If the architecture of the company's business software provides fine-grained integrations to transactions and information, then crafting efficient processes (and evolving them when necessary) is relatively easy. If the integrations are coarse-grained, then you're hosed.

Think about every "rip-and-replace" project you've ever worked on. Your dreams of implementing a better process were held hostage to that tightly coupled monstrosity you were replacing. Had the previous generation been architected as a loosely coupled composition, you'd have been able to replace it in a logical, iterative manner - instead you ended up with a waterfall death march.
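A small sketch of the fine-grained point, assuming a process step depends only on one narrow capability (all names invented): swapping the implementation behind that capability then becomes an iterative change rather than a rip-and-replace.

```python
# Sketch: the process step depends on one narrow capability, not on the whole
# legacy suite, so implementations can be swapped one at a time.
from typing import Protocol


class CreditCheck(Protocol):
    def approve(self, customer_id: str, amount: float) -> bool: ...


class LegacyCreditCheck:
    def approve(self, customer_id: str, amount: float) -> bool:
        # imagine a screen-scrape or stored-procedure call into the old system
        return amount < 10_000


class NewCreditService:
    def approve(self, customer_id: str, amount: float) -> bool:
        # the replacement can go live for this one step without touching the rest
        return amount < 25_000


def order_approval_step(check: CreditCheck, customer_id: str, amount: float) -> str:
    return "approved" if check.approve(customer_id, amount) else "escalate"


# Iterative replacement: flip one dependency, not the whole monolith.
print(order_approval_step(LegacyCreditCheck(), "C-42", 5_000))
print(order_approval_step(NewCreditService(), "C-42", 20_000))
```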
Yyyyep.
  1. Patrick Lujan
  2. 1 month ago
Fully agree. I'd say that in most cases, however, that "tightly coupled monstrosity" was ultimately proven to be not a technical system but a human system (team, department, incentive system, wrong objectives, rituals, dark processes, etc.)
  1. Bogdan Nafornita
  2. 1 month ago
# 6
Boris Zinchenko Accepted Answer
Integration of processes is the primary mission of any BPM initiative. Disintegrated processes are inherently poorly coordinated and not possible to align. Therefore, efficiency of processes can be achieved only as a logical consequence of process integration. Integration is an essential precondition for achieving process efficiency. These two aspects of processes naturally coexist and cannot be considered in isolation.

The role of process integration in BPM systems has grown systematically over recent years. BPM is steadily turning into an integration technology for all other systems and processes in an organization. The growing integration and process-exchange capabilities of modern BPM systems are becoming a key competitive advantage. We can see how external process scripting, unified service standards, compliant APIs and interchangeability of process blocks are becoming imperative for a robust and efficient enterprise architecture.
The analysis presented here is implicitly statistical and research-based, it seems. Looking at a large population of enterprises we see each enterprise in turn comprised of sets of naturally occurring integrated or non-integrated processes. Alignment is required for process efficiency; without alignment, inter-process friction is very high and very expensive. Therefore, by definition, process efficiency depends on integration. (This is a formal way of stating the rule I mention below, such that for business processes "business value increases with increasing integration"). One might say this is stating the obvious, i.e. that a lack of coordination is expensive. In reality though the need and benefit for integration is not so obvious; a formalism makes it possible to highlight the need for integration as a first-class citizen and top priority of management efforts.
  1. John Morris
  2. 1 month ago
@John, thank you for the comment. These are merely my personal impressions. Integration is a strange topic. Hardly any other term is mentioned so often in EA projects, although everybody understands it in their own way. Even more strangely, integration is far less associated with BPM than it should be.
  1. Boris Zinchenko
  2. 1 month ago
# 7
By definition, any process is an assembly of many other artefacts, such as data, documents, events, rules, services, processes, audit trails and KPIs.

Thus integrability is essential.

Thanks,
AS
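Read literally, that assembly view can be modelled as a simple record that references each artefact type. A rough sketch follows, with purely illustrative field names.

```python
# One way to read "a process is an assembly of artefacts": model it as a record
# that references them explicitly. Field names here are only illustrative.
from dataclasses import dataclass, field


@dataclass
class Process:
    name: str
    data: dict = field(default_factory=dict)            # business data items
    documents: list[str] = field(default_factory=list)  # attached documents
    events: list[str] = field(default_factory=list)     # events it reacts to
    rules: list[str] = field(default_factory=list)      # decision/gate rules
    services: list[str] = field(default_factory=list)   # services it calls
    subprocesses: list["Process"] = field(default_factory=list)
    audit_trail: list[str] = field(default_factory=list)
    kpis: dict = field(default_factory=dict)


# If any one of these artefact types cannot be wired in, the process is incomplete,
# hence "integrability is essential".
onboarding = Process(name="Customer onboarding",
                     events=["application.received"],
                     services=["credit-check", "document-store"])
```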
# 8
So long as everyone uses terminology consistently, everything dovetails properly.

Mission -> strategy -> initiatives/ROIs | Cases -> objectives -> goals -> processes -> steps

(effectiveness applies toward the left of this chain, efficiency toward the right)


Any dissenters re this?
References
  1. http://www.kwkeirstead.wordpress.com
LOL @Karl ("with you, for sure") concerning a "consistent terminology" for programme, project and process management! Sure it's "splitting hairs", but it's possible to find in the literature references to goals and objectives which are completely the opposite of each other.
The majority holds, as you have listed above, that "objectives" are higher-level concepts than "goals". However there is a minority view (including, it seems, @Alexander) that takes the opposite stand. I also hold with the second group, and interestingly the newly fashionable "OKR" ("Objectives and Key Results") methodology that has been increasingly adopted in Silicon Valley (e.g. Google) can be said to treat Goals as higher level than Objectives.
OKR REFERENCES
http://okrbasics.com/the-difference-between-objectives-key-results-and-goal-setting/
https://www.infoq.com/articles/agile-goals-okr
https://www.slideshare.net/Atiim/best-practices-of-okr-goals-56282760
http://leanperformance.com/en/okr/what-is-okr/
I like this approach a lot (we're currently implementing it), because it emphasizes the difference between "work" (or "task") and "outcome". This is my justification for caring about this terminology; it's surprising and disappointing when project success is impaired because goals and objectives are confused.
My interpretation of our terms, using a sort of systems language, is this:
-----OKR-----------------------
GOAL -- Desired system state
OBJECTIVE -- The output of a project, intended to change the state of a target system.
KEY RESULT -- Measurement of success of objective
------PROJECTS-------------
PROJECT OR PROCESS -- Plan for and management of work leading to above Objective
  1. John Morris
  2. 1 month ago
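A small typed restatement of those distinctions, purely for illustration (the class names are mine, not part of any OKR standard):

```python
# Goal = desired system state; Objective = intended output/state change;
# Key Result = its measurement; Project = the managed work that delivers it.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class KeyResult:
    description: str                      # measurement of success of the objective
    target: float
    actual: float = 0.0


@dataclass
class Objective:
    description: str                      # output intended to change a target system's state
    key_results: list[KeyResult] = field(default_factory=list)


@dataclass
class Goal:
    desired_state: str                    # the desired state of the system
    objectives: list[Objective] = field(default_factory=list)


@dataclass
class Project:
    name: str                             # plan for and management of the work
    delivers: Optional[Objective] = None
```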
@Alexander I have worked with Dr Ackoff's (1970) "A Concept of Corporate Planning", page 23, since around 1970.

"States or outcomes of behavior that are desired are objectives"
"Goals are objectives whose attainment is desired by a specified time within the period covered by the plan"

Looks like he defines a goal as a particular type of objective, so now that you comment on the order, I'm not sure which one should go first.
  1. Karl Walter Keirstead
  2. 1 month ago
As far as I remember BMM, goals should be before objectives in your chain.
  1. Dr Alexander Samarin
  2. 1 month ago
LOL @Karl "Lewis Carroll" . . .
  1. John Morris
  2. 1 month ago
OK, I will quit on this one. I suppose we are all entitled to our "alternative facts" - I don't fancy having to re-publish 300 blog posts, so I will stay with Russell Ackoff.

I liked, by the way, his definition of planning: "Planning is the design of a desired future and of effective ways of bringing it about."

I do have other trusted sources of wisdom I go to from time to time, e.g. Lewis Carroll.
  1. Karl Walter Keirstead
  2. 1 month ago
@John, I call it "results chain" - see one from the development business http://improving-bpm-systems.blogspot.ch/2013/03/result-based-logical-framework.html
  1. Dr Alexander Samarin
  2. 1 month ago
# 9
E Scott Menter Accepted Answer
Blog Writer
Efficiency is like the tax credit you receive when you have a child: it's a nice little extra, but not, in and of itself, the point of embarking on the adventure in the first place. (That is, unless you define efficiency so broadly that it includes virtually any observable improvement.)

Rather, it would be more useful to view automation as a way to unite multiple systems of record within a system of engagement that brings together the people, data, rules, and processes comprising a business. If that's what you mean by “integration”, then I guess I'd say that integration is all that's important. All other benefits arise from doing that part right.
Scott
# 10
John Morris Accepted Answer
Integration is often a project killer, which means an "efficiency of zero". Why? Because of all the technical debt associated with the business semantics we need, which are encapsulated behind a given interface. Consider that badly done integration is sort of like the modern equivalent of the dreaded GOTO statements that were the bane of code before structured programming. The result is often incomprehensibility. (The difference between the two is that a GOTO is inexpensive compared to an integration point. Integration is expensive in part because of the difficulty of finding affordable people who know a given interface.)

A highly integratable environment, based on a well-planned inventory of e.g. microservices, would be great. But where has there ever been a business case for such a foundation, except in a tiny minority of outlier firms? An excellent example of integration is EDI, which is expensive, complex, brittle -- and the ante to play in many markets. One might ask "is there any other way?" Maybe integration hell is inescapable. Maybe this is another example of technology market failure.

Here's a paradox where business process integration is concerned: "Business value increases with increasing integration", but also "Software project risk increases with increasing integration". Someone should do a graph. Or a PhD thesis. :D
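Taking the "someone should do a graph" remark literally, here is a throwaway sketch; both curves are invented shapes chosen only to illustrate the stated paradox, not measurements.

```python
# Throwaway sketch of the paradox: both curves are notional, not data.
# "Business value increases with increasing integration" -> value grows but flattens.
# "Software project risk increases with increasing integration" -> risk compounds.
import numpy as np
import matplotlib.pyplot as plt

integration = np.linspace(0, 10, 100)          # notional "degree of integration"
value = 1 - np.exp(-0.4 * integration)         # diminishing returns (illustrative)
risk = (integration / 10) ** 2                 # compounding risk (illustrative)

plt.plot(integration, value, label="business value (notional)")
plt.plot(integration, risk, label="project risk (notional)")
plt.xlabel("degree of integration")
plt.legend()
plt.title("The integration paradox - purely illustrative curves")
plt.show()
```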
+1 @Boris for your reference to MDM, the "elephant in the room", which is the source of many challenges where integration is concerned.
  1. John Morris
  2. 1 month ago
+1 @John, very, very valid points. It explains why BPM projects are generally so difficult and so risky. However, the only alternative to integration is growing, uncontrolled complexity, which can well bury any company. Therefore, with all this in mind, we still move in the direction of gradual integration initiatives, although much slower than desired. Even if true run-time integration of systems is objectively impossible, there is still space for consistent MDM and process mapping to get an overall vision of processes and semi-manual coordination.
  1. Boris Zinchenko
  2. 1 month ago
Quick reply Karl -- I hope to return another time to explore -- but I see the risk as semantic risk, i.e. complexity. My experience with integration projects underscores this perception.
  1. John Morris
  2. 1 month ago

... Still at it after all of these days. . .

@John
On “Software project risk increases with increasing integration”.

The background for this probably is that more implies more, but does that really matter when you have lots of storage, bandwidth and processing horsepower?

My theory is application developers start off on the wrong foot when they fail to auto-export their data.

Exports in ANY format are better than no exports, even if you have no data sharing in mind, because the act of exporting gives you free backup.

Now, going one step beyond and parsing your exports so that you can post data to a generic data exchanger brings you to a world of relatively easy sharing (provided the architecture of your data exchanger is right).

How complex IS data sharing?

A good data sharing architecture, IMO, models a marketplace where you have publishers and subscribers. They meet, compare wares, agree on terms, and both sides go away happy.

A publisher decides what data it is willing to share and with whom. A prudent approach is "need-to-know", so in practice one cluster of published data ends up widely shared whereas another is very narrowly shared.

Next, a publisher likes to publish using ITS own native data element naming conventions.

Subscribers like to read using THEIR own native data element naming conventions (e.g. I publish "abc", you want to read that as "def", another subscriber wants to read my "abc" as "ghi").

All we have left to worry about, taking this approach, is pickup, transport and data import at the subscriber local or remote system or application.

I suppose we could itemize integration issues that result in "Software project risk increases with increasing integration" .

A good starting point, with less dramatization than in a recent blog post "What you don't know will hurt you" http://wp.me/pzzpB-Oz, could simply declare that it is a subscriber’s job to know what they want/need and all they need to do is go cap-in-hand and ask for it.

Once registered with the publisher for a particular data cluster, a subscriber can carry out surprise-free data pickup, and if a previously available data item is no longer available, the subscriber's import routine will surely find out about it.

With the registration/sharing process I have described, posting of new data items by the publisher does not result in a failure.

So, no increase in software project risk here.

Compare this with the traditional soup-to-nuts data streams that many publishers continue to post, where you are constantly exposed to increased software project risk as and when the data transport format changes.

The data exchanger model works differently.

Here, it is the subscriber who extracts the data and then either uses a standard data transport format or uses a subscriber data transport format designed by the subscriber.

Final point... nowhere in the above does it say you cannot have a very large number of publishers and subscribers, nor that a subscriber cannot also be a publisher. And we don't care that much about the amount of data when the sharing is done on a need-to-know basis.

Clearly, parsing to post to the data exchanger and formatting to extract data from it is work, but with a large enough marketplace and a growing inventory of parsers/formatters, developers soon tire of inventing new ones for no good reason.

As Alexander often reminds us, most computing troubles begin with bad architectures.
  1. Karl Walter Keirstead
  2. 1 month ago
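A minimal sketch of the publisher/subscriber marketplace described above, reusing the "abc"/"def"/"ghi" naming example; all class and field names are invented, and a real exchanger would also need transport, security and persistence.

```python
# Minimal sketch of a "generic data exchanger": publishers post under their own
# element names, subscribers register a need-to-know mapping into their own names.
from dataclasses import dataclass, field


@dataclass
class DataExchanger:
    # cluster name -> latest published values, in the PUBLISHER's element names
    clusters: dict[str, dict] = field(default_factory=dict)
    # (cluster, subscriber) -> {publisher_name: subscriber_name}, the need-to-know subset
    subscriptions: dict[tuple, dict] = field(default_factory=dict)

    def publish(self, cluster: str, values: dict) -> None:
        # publishers use their OWN naming conventions; new items are simply added
        self.clusters.setdefault(cluster, {}).update(values)

    def subscribe(self, cluster: str, subscriber: str, name_map: dict) -> None:
        # subscribers register only the items they need, under THEIR own names
        self.subscriptions[(cluster, subscriber)] = name_map

    def pickup(self, cluster: str, subscriber: str) -> dict:
        published = self.clusters.get(cluster, {})
        name_map = self.subscriptions[(cluster, subscriber)]
        # only the subscribed subset crosses over; anything newly published but
        # unmapped is ignored, so new publisher items do not cause a failure
        out = {their: published[mine] for mine, their in name_map.items() if mine in published}
        missing = [mine for mine in name_map if mine not in published]
        if missing:
            # a previously available item has disappeared - the import routine finds out
            out["_missing"] = missing
        return out


ex = DataExchanger()
ex.publish("patient_vitals", {"abc": 37.2, "pulse_bpm": 72, "extra_item": "ignored"})
ex.subscribe("patient_vitals", "ehr_system", {"abc": "def", "pulse_bpm": "heart_rate"})
print(ex.pickup("patient_vitals", "ehr_system"))
# -> {'def': 37.2, 'heart_rate': 72}
```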
# 11
John Robinson Accepted Answer
I like what Garth Knudson and Michal Rykiert have to say, and I also enjoy the candor of Emiel Kelly.

Efficiency is at least in part (sometimes in large part) tied to the ability to bring data to one place (into the process) to complete a business transaction. Many BPM customers have moved beyond the simple stages of adopting BPM and are now trying to understand how to leverage it for true digital transformation - meaning complexity is growing in the processes and integrations. There were some great points above regarding the reality of many customers' integration capabilities (e.g. EDI, batch, web services), and the reality is that the majority of companies don't have easily consumable integrations (the ideals that SOA or microservices promise) - there is a lot of integration technical debt out there, albeit the strategy and intention were there.

Many times we must integrate BPM with systems that are built to serve a business function (e.g. Finance, Order Mgmt, Customer Service, Shipping, etc.) via arcane integrations; it's a good thing BPM tools can handle this diverse data-supplier landscape (directly, via broker, or via bus). Each of the functional systems serves its own audience very well, but try to address an end-to-end business transaction and you begin to see why integrations are critical to efficiency.
References
  1. http://www.capbpm.com/
# 12
David Chassels Accepted Answer
Efficiency with adaptive capability in processes is a certainty and will be an ongoing exercise with direct input from users. Integration has, as indicated, two aspects.

One of the problems that legacy has created is the "silo" structures in organisations, yet end-to-end processes very often need to cross such silos, and this is where BPM can start to address these deficiencies. Indeed, where there are important value networks, such departmental integration is essential.

The other aspect, integration with "other technologies", is not the right way to think. The minute you open up a legacy system to try to alter it to change delivery, not only will cost rocket, but you will likely create a new inflexible legacy system. The outside-in BPM approach, with a good supporting software platform, should be the self-contained master of the operational processes. Legacy becomes the slave, used as required by the process. Think of a greenfield of flexible processes surrounding the brownfield of legacy. It should be communication with legacy, not integration. Yes, duplication of information may result in the short term, but this could allow planned retirement of old legacy systems.

Understanding such aspects of addressing business operational processes is all-important.
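One way to picture "communication with legacy, not integration" is a thin, read-only wrapper that the process platform calls when a step needs an answer, leaving the legacy system untouched. A rough sketch with hypothetical names:

```python
# Sketch: the process platform stays the master and only asks the legacy system
# questions when a step needs an answer; nothing inside legacy is modified.
class LegacyReader:
    """Thin, read-only wrapper around an untouched legacy system."""

    def __init__(self, legacy_lookup):
        self._lookup = legacy_lookup     # e.g. a report export, view, or query function

    def customer_balance(self, customer_id: str) -> float:
        # duplication of information is accepted short-term; the copy lives with
        # the process, which eases planned retirement of the legacy system later
        return float(self._lookup(customer_id))


def credit_hold_step(reader: LegacyReader, customer_id: str) -> str:
    balance = reader.customer_balance(customer_id)
    return "hold" if balance > 5_000 else "proceed"


# usage with a stand-in for the legacy query
print(credit_hold_step(LegacyReader(lambda cid: 1_200.0), "C-7"))
```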
# 13
Wow... lots of diversity re "integration".

Response
"integration is a way to increase efficiency of a process . . . employees don't need to care where data comes from and where it lands"

>>agree

Response
"integration is a key component to creating a more streamlined and comprehensive customer experience"

>> agree; companies like Lockheed need communication to/from their many suppliers, and the data they ask for and get is the data that the contracts spell out, at a contractually agreed-upon periodicity.

Response
"crafting efficient processes . . . is relatively easy. If the integrations are course-grained, then you're hosed"

>> agree . . . data, data, data

Then, from @Boris . .

Response
"Integration of processes is the primary mission of any BPM initiative. Disintegrated processes are inherently purely coordinated and not possible to align . . . efficiency of processes can be achieved only as a logical consequence of process integration"

>> why no reference to data? Answer . . . if you read carefully, @Boris is talking about "efficiency of processes" not "efficiency for processes".

>> the way I interpret the last response is yes for internal processes but no for Lockheed supplier processes. The suppliers don't want Lockheed breathing down their necks, and if Lockheed spends enough time on the contract drafting, they will get the subset of data they need for CRM/CEM (or supplier relationship/experience management).

I do get "Integration of processes is the primary mission of any BPM initiative", with the caveat "in the context of this Question/Response discussion".

If we isolate the statement then I would claim the primary mission of BPM is to achieve efficiency in the performance of work and then, one level up, achieve effectiveness via Case (where BPM and a few other methodologies are core to the run-time environment hosting the work).
References
  1. http://www.kwkeirstead.wordpress.com
@Karl, thank you for your comments on my reply. I have put the accent on processes because they are directly the subject of this question. Processes have a distinct difference from data. Data represent a static view of an IT system, while processes express system dynamics. Dynamics is substantially more complex than the data on which it relies. Quality data does not necessarily mean quality processes built upon them and efficient system integration. Integration of data is a necessary but not sufficient condition for integration of processes. Efficiency of processes is a necessary but not sufficient condition for effectiveness of a business.
  1. Boris Zinchenko
  2. 1 month ago
@Karl -- What do you think of the idea that "the primary mission of BPM is to achieve efficiency in the performance of work" could be added to the "Meaning Of BPM Technology For Organizations"? The "Meaning" item (the second box) is supplemental to the "Minimum Viable Definition of BPM Software Technology". The "Meaning" definition leaves BPM as a way to automate -- which is certainly correct.
But to go UP a level from technology to business implications, we go from "automation" to "efficiency". The paper takes automation-for-efficiency as a given (all things being equal). It would be worth adding a specific mention of "efficiency" to the "Meaning"; that would then speak more directly to business leaders.
http://bpm.com/bpm-today/blogs/1167-minimum-viable-definition-of-bpm-technology-part-2-the-bpm-core
  1. John Morris
  2. 1 month ago
@Boris... Seems the Civerex architecture/methods do not require any discussions re data vs procedures. My developers say they totally avoid stored procedures.

Our customers tend to have 100+ data display/data collection forms, typically with 50 "fields" per form. When customers design forms in our environment they never go near the underlying data structure (database ops are totally automated).

I doubt any have a clue what tables/fields are in the empty RDBMSs that we distribute with our Case Mgt systems.
  1. Karl Walter Keirstead
  2. 1 month ago

@John.. Glad you like it.

Aside from the goal/objective debate, I feel it important to make the point that the transition from efficiency to effectiveness (right to left) or from effectiveness to efficiency (left to right) takes place where we start assessing progress toward meeting Case objectives.


Mission -> strategy -> initiatives/ROIs | Cases -> objectives -> goals -> processes -> steps

(effectiveness applies toward the left of this chain, efficiency toward the right)


Without end-to-end processes, objectives can only be stored run-time side - there's just no other place to park them. (ref my FOMM rants)

Mapping of processes is what we all do plan-side; Case Management is what we all do at run-time.
  1. Karl Walter Keirstead
  2. 1 month ago
+1 @Karl: "primary mission of BPM is to achieve efficiency in the performance of work "
  1. John Morris
  2. 1 month ago

@Boris... Regarding "Data represent a static view of an IT system" -

Realizing that the destination for data coming into a Case is an RDBMS, how about we declare data in any Case to be static only for completed/committed process steps?

Then, for steps that are current or forward steps, we view the data (never mind where it is actually physically stored) as flowing along pathways - does that not make the data dynamic? (i.e. as any step becomes current and you click on any of the attribute forms, you see whatever data is in the Case at that/those forms)

We have a home healthcare services app where a caregiver does a 2-hour shift and is then replaced by a 2nd caregiver. Naturally, the incoming caregiver has to be able to view the clinical notes of his/her predecessor, which means the 1st person has to write up and upload progress notes, typically with a 5-minute turnaround; the replacement is then able to view the most recent progress note and augments the data set with a new progress note.

For the particular example cited, the data entry is such that it can be done on smartphones (no long-winded narrative commentary needed), so the data entry does not take much time.

Do you agree (excluding data mining) that the above approach renders both process and data to be "dynamic"?

I should add that semi-real-time data (flowing in via a data exchanger from various local and remote systems and applications) can also be viewed as close enough to "dynamic" (it's up to publishers and subscribers to agree on the needed frequency of data export/import).

All we need for this is to import/store the data, then have key process gatekeeper step rules on a cycle timer look for the presence/absence of data and either block or enable downstream processing (e.g. don't allow forward processing until the replacement part on order arrives at the plant).
  1. Karl Walter Keirstead
  2. 1 month ago
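A minimal sketch of such a gatekeeper step, assuming a simple cycle timer and invented names; it only illustrates the block/enable idea described above.

```python
# Sketch of a "gatekeeper" step: on a cycle timer it checks whether the expected
# data has arrived in the Case and only then enables downstream processing.
import time


def gatekeeper(case_data: dict, required_keys: list, fetch_updates, cycle_seconds: int = 60):
    """Block until every required data item is present in the Case, then return."""
    while True:
        case_data.update(fetch_updates())          # e.g. pull from the data exchanger
        missing = [k for k in required_keys if case_data.get(k) is None]
        if not missing:
            return case_data                       # enable downstream steps
        print(f"waiting on: {missing}")            # e.g. replacement part not yet arrived
        time.sleep(cycle_seconds)
```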
@Karl, given the present level of aggregation of data and procedures in DBMSs, it is not so easy to distinguish the two. I understand data as information and procedures (processes) as operations over this information. In this sense, e.g. SQL table content represents data, while a stored procedure residing in the same database is a process executed over these data and triggered by some condition. It doesn't matter whether the data are stored or not, or whether they are real-time or resident; the distinction between data and procedures in this sense is, as a rule, quite evident.
  1. Boris Zinchenko
  2. 1 month ago
# 14