BPM.com - BPM Discussions
Peter Schooff, Thursday, 24 August 2017
From David Chassels: How important do you think the cloud is to the future of processes?
Pretty important, for two reasons: one practical, one functional.

The first is, of course, infrastructure. Economy of scale and all that: a lot of orgs have been moving their infra to the cloud and more will continue to do so in the coming years as things mature (e.g., security). Less money spent on infra means more money to spend on the good stuff.

On the functional side, depending on the hosts (e.g., AWS, Azure), we'll start seeing more and more businesses interact across an entire chain over that shared infrastructure, moving and exchanging information at faster speeds. That, in turn, should open up opportunities for process improvement as well. Early adopters who get out in front of this will have an advantage.

Just my tuppence.
# 1
Cloud is crucial for the future of BPM - at the level of the BPMS as well as BPM as a discipline. A good example in that direction is Flokzu, which has its whole suite in the cloud, making not only the design (which many other providers offer in the cloud too, such as IBM with Blueworks Live) but also the implementation and the execution an absolute breeze.
As Patrick mentions, many important players in the field are aligned in offering their platforms to be hosted on premises, in private and public clouds, and in hybrid setups.
NSI Soluciones - ABPMP PTY
# 2
Brian Reale
Blog Writer
Of course cloud is here to stay. What we are seeing now, however, is a transition within the cloud from SaaS to microservices.

SaaS is undergoing serious change. Most BPM vendors, and other enterprise software vendors, were pushed into offering a SaaS commercial model by the wave of pressure set off by Salesforce. So most BPM vendors now offer the same tired model as Salesforce. I say "tired" because the Salesforce model will soon be two decades old, and you won't find many people happy about paying $150 per user per month to Salesforce. Salesforce can do it because of its incredible connector ecosystem; nobody else seems able to replicate it.

But this world is changing thanks to AWS and what I call "Micro Value Compute Economics." The long tail has extended much further than we could have imagined. Improvements in metering, billing, and services mean that we are now entering the true age of micro-billing and micro-consumption. I have written extensively about this on my blog: https://blog.processmaker.io/5-signs-that-the-bpm-software-model-is-broken/
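
To make the contrast concrete, here is a toy comparison of the two commercial models; every figure in it is hypothetical and chosen only for illustration:

```python
# Toy comparison of the two commercial models; all figures are hypothetical.
seats, price_per_seat = 50, 150.00        # classic per-seat SaaS pricing
runs, price_per_run = 20_000, 0.002       # metered, micro-billed executions

per_seat_bill = seats * price_per_seat    # $7,500/month regardless of usage
metered_bill = runs * price_per_run       # $40/month, scales with actual use
print(f"per seat: ${per_seat_bill:,.2f}  metered: ${metered_bill:,.2f}")
```

The specific numbers don't matter; what matters is that the metered bill tracks consumption, which is what makes the long tail of small workloads economically reachable.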

Along with long-tail micro-billing and micro-consumption come, of course, microservices. The microservices, API-based economy is exploding; just take a look at the graphs in the Nordic APIs link in the references below. It is inevitable that these trends will start having a major impact on every software category, including BPM.

At ProcessMaker we have been working on this for quite some time and recently released an alpha of what we consider to be the first "always on" workflow engine API offered as a metered microservice. The solution, ProcessMaker I/O, is a pure workflow API that can be used to orchestrate both human tasks and microservices. Because it is only an API, it can easily be embedded into any enterprise software project. You can import BPMN 2.0 files from your favorite modeler and we will run them, or you can use the API programmatically. ProcessMaker.io has SDKs in every major programming language, full support for BPMN 2.0, and a highly resilient API in the cloud that scales to handle extremely large workloads. The API manages distribution of asynchronous tasks, queues, worker clusters, and more.
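
To give a feel for the model, here is a minimal sketch of embedding such an engine; the endpoint paths, payloads and token handling below are hypothetical placeholders for illustration, not the actual ProcessMaker I/O API:

```python
# Illustrative only: hypothetical REST endpoints, not the ProcessMaker I/O API.
import requests

BASE = "https://workflow.example.com/api/v1"     # hypothetical base URL
HEADERS = {"Authorization": "Bearer <api-key>"}  # metered, key-based access

# 1. Deploy a BPMN 2.0 file exported from your favorite modeler.
with open("order_fulfilment.bpmn", "rb") as f:
    proc = requests.post(f"{BASE}/processes", headers=HEADERS,
                         files={"bpmn": f}).json()

# 2. Start an instance of the deployed process.
inst = requests.post(f"{BASE}/processes/{proc['id']}/instances",
                     headers=HEADERS,
                     json={"variables": {"orderId": "A-1001"}}).json()

# 3. A worker (human task UI or another microservice) claims and completes a task.
task = requests.get(f"{BASE}/instances/{inst['id']}/tasks",
                    headers=HEADERS, params={"state": "ready"}).json()[0]
requests.post(f"{BASE}/tasks/{task['id']}/complete",
              headers=HEADERS, json={"result": {"approved": True}})
```

The same pattern works whether the task on the other end is a human filling in a form or another microservice.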

This product is not for everyone. It is for developers and ISVs building enterprise software. This is the type of product that is quickly becoming the backbone of the next generation of cloud products. We believe this is the future of cloud processes - always on, totally frictionless, and micro-billed.
References
  1. https://blog.processmaker.io/5-signs-that-the-bpm-software-model-is-broken/
  2. https://www.processmaker.io/
  3. https://nordicapis.com/tracking-the-growth-of-the-api-economy/
But wait--there's more! Order now and get a second microservice ABSOLUTELY FREE!*

*) Just pay separate shipping and handling.
Well, I agree that microservices will be big.
I'm not so sure the API economy is enterprise-ready. It's quicksand that requires VERY extensive maintenance - and no customer is willing to pay for that.
I don't have hype graphs from analysts to show, just actual business cases from real customers (call it myopia). A customer asked me to integrate data from three SaaS monitoring solutions into a graph database and then expose that in a custom dashboard. After a short analysis, I went back to them: one of the SaaS tools had an API in beta, highly volatile; the second didn't have an API at all; the third had been acqui-hired by Amazon and had stopped exposing its API. I told them I wasn't even going to bother making an offer :-)
And the overall landscape is even more worrying - the big internet giants play with their API access on a daily basis and randomly ban companies that use their data. Not to mention frequent changes in regulation (the fall of the Safe Harbor privacy framework, the rise of GDPR in the EU, the data death grip of increasingly authoritarian regimes). Add to this mess the current tech fad of moving away from REST to GraphQL. Good luck maintaining that (and making money off it).
Of course, lots of APIs are being created - but how much of this trend is useful, persistent and reliable enough to build an enterprise app on?
# 3
"Cloud" - just another fad that is reaching the peak of its hype cycle.

Larry Ellison, 2008: "I think it's ludicrous that cloud computing is going to take over the world. But they get very excited about it, and it's this big echo chamber." According to Ellison at the time, Oracle had no interest in building "the cloud" because "it's the Webvan of computing." Ouch.
My personal rule of thumb: if a tech billionaire says "X is going to be big", call your broker and short X immediately.
@E Scott: oh you mean like RPA and AI??
John Morris:
Ken Olsen, founder and CEO of Digital Equipment Corporation (DEC), said in 1977, "Who needs a computer in their home?" Ironically, the company was absorbed by a PC company, Compaq, which was in turn absorbed by HP. (HP's systems businesses can be traced back to DEC.)
# 4
Unimportant in our case - what real difference is there, for a large organization running mission-critical apps with a need for high security, between logging into its own server and logging into an outsourced private server hosted by, say, Rackspace?

My customers like to use Citrix Receiver to access Citrix StoreFront, where they only get to see a desktop with the icons we want them to see/use (typically a mapping icon and a run-time case management environment icon on an otherwise blank desktop). Select users see a few utilities that they can run.

Casual users (i.e. contractors, not part of ordinary staff) are declared to be "no-log-in" users.

These users do not get a desktop - the only access they have is portal access, where they see a menu of "services" they can request, plus to-do tasks that the central server apps push out to them. The tasks come from a back-end task manager, and any responses/data input at these tasks go back to that task manager. Portal users can only record data on the forms that post with the tasks and then click Submit. They have no means of directing data anywhere other than to the engine that pushed out the to-do task they are working on.

The portal connects to an IIS server that is at arm's length from the back-end server - an engine handles all requests going to and coming from the back-end server.

All of this gives pretty good security - two physically separate servers, with two protected engines that handle in/out traffic.

The advantage of this setup is that no casual user ever gets to establish a cursor position at the back-end server.

We have rules at the IIS/back-end server engine that sniff out who is logging in, whether they are where they should be, whether the request they are making is among the types of requests they are supposed to be making, and so on.
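
As a minimal sketch of that kind of rule (the role names, network prefixes and request types are hypothetical, not the actual implementation):

```python
# Sketch only: hypothetical policy tables for the in/out traffic engine.
ALLOWED_REQUESTS = {
    "portal_user": {"list_services", "submit_task_form"},
    "staff":       {"list_services", "submit_task_form", "run_utility"},
}
TRUSTED_PREFIXES = {"portal_user": "203.0.113.", "staff": "10.20."}

def allow(role: str, source_ip: str, request_type: str) -> bool:
    """Who is logging in, are they where they should be, and is the request
    among the types they are supposed to be making?"""
    return (source_ip.startswith(TRUSTED_PREFIXES.get(role, "!")) and
            request_type in ALLOWED_REQUESTS.get(role, set()))
```

Only requests that pass checks like these get relayed to the back-end server.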
# 5
A couple of points.

Sure, it is important, because we (the people who care about processes) want to move away from the existing monolith or programmable-monolith architecture of BPM-suite tools. The ideal scenario is the following:
- “hire” a perfect BPM-suite tool (which is installed somewhere in accordance with my security needs)
- define my process
- add my automation scripts as microservices
- execute

Thus you can concentrate on your process and its added value (see the sketch below).
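
A minimal sketch of that scenario, with hypothetical step names, endpoints and engine calls; the point is that the automation lives in microservices while the rented BPM-suite tool only orchestrates:

```python
# Hypothetical process definition: human steps plus my automation scripts,
# each deployed as a microservice; the rented BPM engine only orchestrates.
process = {
    "name": "customer-onboarding",
    "steps": [
        {"task": "collect-documents", "type": "human"},
        {"task": "verify-identity", "type": "service",
         "endpoint": "https://kyc.example.com/verify"},
        {"task": "open-account", "type": "service",
         "endpoint": "https://core.example.com/accounts"},
        {"task": "notify-customer", "type": "service",
         "endpoint": "https://notify.example.com/email"},
    ],
}
# engine.deploy(process); engine.start("customer-onboarding", {"customerId": 42})
```

Swapping one automation script for another then means changing an endpoint, not reprogramming the suite.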

Another point is slightly different. You, in your customer journey, have to execute a small task; of course, such a task comes from a cloud-based BPM-suite tool. Thus B2C and B2B processes will be better served by BPM in the cloud. Of course, a proper, super-reliable archive must be used (blockchain may help with this).

Thanks,
AS
# 6
In our vision the cloud is an enabler that democratizes processes and BPM, letting small and medium businesses access a discipline that used to be exclusive to big companies and government. Access to BPM tools and services lets them get the most out of BPM, compete better, survive and grow. The cloud is a factor of equity when we talk about workflow and BPM technologies.

In a more tangible sense, cloud-based BPM suites are:

  • Simpler to install and use. SMBs have neither the time nor the resources to install, configure and administer corporate BPM suites.
  • Cheaper. SMBs don't have the budget to invest in an expensive solution.
  • Faster to start. Spending months modeling and automating a process is not an option, because of the cost and the managerial time. They need the process running in hours, days at most.
  • Easier to exit. If the BPM initiative doesn't work, SMBs need to know that they can walk away. If a huge investment of time and money has been made in a corporate BPM suite, exiting is not an option.


As an illustrative example, I come from the corporate BPM world, with a leading product in Latam (INTEGRADOC) for government and big companies. A few years ago we expanded our offering with a cloud BPM suite (SaaS) called FLOKZU, which rapidly gained traction and interest from companies all around the globe (+4,000 subscribed organizations from +120 countries to date). This simply means that the market needed process management technologies, and the cloud made them reachable.
CEO at Flokzu Cloud BPM Suite
# 7
There are still a lot of issues to tidy up before certain critical enterprise apps can find their way into the cloud. First, let's face it: your bandwidth to the cloud is less than your bandwidth within your datacenter. If you're transferring TBs of data between systems of record and systems of engagement, you probably want a pretty fat pipe running between them.
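
A quick back-of-envelope illustration, with arbitrary volumes and link speeds:

```python
# How long does moving 10 TB take, ignoring protocol overhead?
data_bits = 10 * 10**12 * 8                      # 10 TB expressed in bits
for gbps in (1, 10, 100):
    hours = data_bits / (gbps * 10**9) / 3600
    print(f"{gbps:>3} Gbps link: ~{hours:5.1f} hours")  # ~22.2 / ~2.2 / ~0.2
```

And that is the ideal case, before protocol overhead and before the link is shared with everything else.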

Second, despite significant advances, it's not clear to me that federated identity and authentication schemes work well at the enterprise level for use cases demanding diverse and distinct cloud-based services.

And of course there are other considerations. Data stored in the cloud needs to be secured. It may be a struggle to reliably and consistently gather and update data that's scattered across disparate cloud services using a variety of encryption mechanisms and (as noted) authN/authZ schemes. Companies accustomed to relying on perimeter defenses will find themselves in a whole new world of permeable and semi-permeable service layers.

OK so none of this is new. But I enjoyed thinking about it, so... thanks Pete!
-Scott
Oh wait, sorry—was I supposed to include a commercial message? :P
Scott, you have just articulated my concerns as I plan new systems for a global project. I see that processes and data need to be owned by the organisation, yet I do not want to manage and own the infrastructure and all the technical issues. I see a "private" cloud as the future: contracting with a global supplier to provide such infrastructure focused only on our needs, but with an open door to switch if necessary over the longer term. That avoids lock-in and keeps the supplier on their toes...?
# 8
In terms of cloud process adoption, the market is split exactly in two: those who have done it and those who will do it.

The critical mass for adoption will be provided by heavyweight ERPs defaulting to cloud infrastructure and also migrating their existing on-premise deployments to the cloud (SAP, anyone?).

Yes, as per E Scott's input, there are unresolved issues in hybrid deployments of digital business platforms, which will evolve far beyond the classic vendor-led scenario of "System-of-Record on-premise communicates with System-of-Engagement in cloud". But these will get ironed out as critical mass is reached.

The biggest problem for SaaS companies is that the big PaaS players (AWS, Azure, GCE) they are hosted on right now may increasingly wall off their infrastructure or outright compete with their SaaS customers, shaking them off like dead flies.
CEO, Co-founder, Profluo
All these issues are about a distributed-ownership application architecture (or the application as a system of systems). Do you mean that such an architecture is the future of processes (and BPM suites)?
Alex, I think it's pretty straightforward that the big PaaS players will start complementing their offerings with SaaS and business apps. I think that's the next battleground for the dominant cloud players (at least Amazon and Microsoft, because Google seems to lag significantly in the enterprise).
# 9
Digital, led by BPM thinking, may be the final step towards the commoditization of enterprise software. This will be enabled by no-code breakthroughs, which will dramatically cut costs and time to deliver. It will also make the investment future-proof, with change readily supported. I think this could significantly affect the future of the traditional "cloud" delivery model, where organisations want ownership of their processes, which will be unique to them. I can see merit in organisations having their own private cloud and subcontracting their infrastructure requirements. If this is right, we could see the demise of SaaS as lease-purchase of software becomes a realistic basis for investment in processes as assets, owning both the data and the software driving the processes.

Note that for operational processes to remain assets they must be capable of ready change; otherwise they may become a liability, hence the need for businesses to have complete control over their operational software. This applies to what could be termed salient processes, i.e. those which are important to the functioning of the business. Non-salient processes, which could include bulk processing such as employee pay, could be safely "outsourced" to an external cloud....something that was already being done in the mid-90s...! In summary, common sense suggests: do not be led by cloud thinking; work out the best way to have control over your processes.
# 10
It's not hard to find analyses of so-called "cloud" that define cloud as "just another way of deploying software". All the issues of BPM and microservices and APIs etc. are still issues regardless of deployment. To-cloud-or-not-to-cloud is thus a use case and a business case concerning access to the evolving software functionality one needs to succeed in business. And part of the consideration is very much trust, uncertainty and risk. So the question about processes and cloud is especially a question about infrastructure risk. And infrastructure as such is very much about scalability, manageability, latency etc. -- and again trust, uncertainty and risk.

Is there anything inherently "process" about cloud infrastructure? In other words, is there anything inherently process-business-semantics-related about cloud deployment versus on-prem deployment? I don't think so -- they are orthogonal questions. Do we care about this distinction? Maybe -- because cloud-mania can lead to cloud-for-the-sake-of-cloud, which inevitably means that in the planning process, process semantics (or any software business semantics) get short-changed. Poor business semantics is a different kind of risk.
The big risk with "cloud" is you let your IT staff go, let your hardware go, let your network go, on the basis of an ROI that promises "one low monthly payment".

You're now on the service supplier's "slippery slope" - who can predict what the terms and conditions will be for a new agreement once your "cloud" agreement runs out?

Watching the buildup of "cloud-mania" is highly entertaining - we had "cloud" back in the 1960s; it was called "timesharing".

There were charges for connect time, CPU, characters in/out, and storage.
Karl, you make a good point about the "slippery slope", but if you have true platform software which handles the digital front- and back-office orchestration, I see that risk as eliminated, allowing easy transfer of the process-oriented software and data to a new delivery/infrastructure supplier. Once suppliers realise this is a reality, the game changes, making subcontracting infrastructure a practical pay-as-you-go option. Legacy mess, as ever, is an issue, but over time this new approach will allow it to be significantly reduced.
John Morris:
"De-skilling" and/or "supplier risk" could truly be a risk of cloud (per @Walter). However one can make an argument that the kinds of skills and capital that cloud replaces are increasinlgy useless drags on profitability anyway. Insofar as "cloud services" are a dynamic marketplace, competition will minimize supplier hostage taking. The trend is therefore to IT expenses and management brainpower applied only to organizing business semantic technology. That's a risk worth taking.
OK, but look at what happens (cellular phone plans) with market consolidation, where the survivors "circle the wagons" and try to out-innovate each other in the area of customer gouging strategies.


@David.

No worries re transfer of the process oriented software (the corporation doing the outsourcing typically owns that software and has and maintains the source).

However, there have been instances where, given a dispute, the supplier puts a "lien" on the corporation's data.

The only way I would use any "cloud" service is with a trickle data extraction to a totally independent cloud hosting service (costly).
# 11