Resolved
1 vote
Too big, and a business process has too many process owners; too small, and it lacks impact. So what is the optimal size for a business process?
Thursday, November 07 2013, 09:48 AM
Responses (12)
  • Accepted Answer

    Thursday, November 07 2013, 10:03 AM - #Permalink
    1 vote
    What determines process size? The number of activities? The number of employees? The number of departments involved? The megabytes of data it produces?

    A process is a means to deliver a result, so searching for the optimal size of a process is, in my opinion, pointless. If you can define a result that is worth delivering (you get paid for it, you want it yourself, you have to deliver it by law), then it is worth a process that creates that result. Whether that takes one step or a million, who cares? Of course, larger processes might be more difficult to manage, but as long as they are effective, size does not matter.

    So the gurus who say that a process should fit on a sheet of A4 paper (I think they mean a process model), or that a process should have only 5 +/- 2 subprocesses? I don't understand them. 'Sorry, we delivered you only half a useful product; we ran out of subprocesses.'

    The optimal size? Whatever is needed to be effective.
  • Accepted Answer

    Thursday, November 07 2013, 10:03 AM - #Permalink
    -1 vote
    What a strange question. The right size is whatever makes sense: a process can be 1 activity or 100 activities, depending on who is involved and what is to be accomplished. The better question is: what steps do you need to take to eliminate waste in an as-is workflow in order to create a best-practice to-be solution?

    In a nutshell, once you have mapped all the activities required for the ideal path, identify which of those activities add value to the process from the perspective of the primary customer. Every activity either (1) adds value, (2) enables value, or (3) wastes resources. An activity is value-add if it improves informational quality, moves the file toward completion, and is done right the first time. If you are reaching backward into the process for something, it is likely not a value-add activity.

    Next, analyze the root cause of the non-value-add activities (extra processing, transportation, motion, inventory, wait time, mistakes, underutilization). With this understanding, start eliminating waste. So the right size for a process is one with no extra manual steps, no room for incorrect or incomplete entries, and no room for variation.
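    The classification above can be sketched as a simple pass over mapped activities. This is a minimal illustration only: the activity names, the boolean attributes, and the `classify` helper are all hypothetical, chosen to mirror the three value-add criteria the reply describes.

    ```python
    # Illustrative sketch of value-add classification for mapped activities.
    # Activity data and attribute names are hypothetical.

    ACTIVITIES = [
        # (name, improves_quality, moves_toward_completion, right_first_time, enables_value)
        ("validate order data",   True,  True,  True,  False),
        ("approve credit",        True,  True,  True,  False),
        ("re-key order into ERP", False, False, False, False),  # rework: extra processing
        ("archive for audit",     False, False, True,  True),   # required, enables value
    ]

    def classify(name, quality, completion, first_time, enabling):
        """Value-add if it improves quality, moves the file forward,
        and is done right the first time; otherwise enabling or waste."""
        if quality and completion and first_time:
            return "value-add"
        if enabling:
            return "enable-value"
        return "waste"

    # Waste activities become the candidates for root-cause analysis.
    waste = [a[0] for a in ACTIVITIES if classify(*a) == "waste"]
    for a in ACTIVITIES:
        print(f"{a[0]:25s} -> {classify(*a)}")
    print("candidates for root-cause analysis:", waste)
    ```

    In practice the attribute values would come from workshop findings or process-mining data rather than a hard-coded table; the point is only that the three-way classification is mechanical once the criteria are agreed.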
  • Accepted Answer

    Thursday, November 07 2013, 10:04 AM - #Permalink
    1 vote
    A better question is how do you decide how to segment business problems so that you end up with appropriate processes. It's not about the size of a single process ... it's about the set of procedural fragments that combine to deliver the goal.
  • Accepted Answer

    Thursday, November 07 2013, 10:37 AM - #Permalink
    0 votes
    Small is beautiful! Many business leaders get caught up in the desire to promote large, prestigious business-improvement projects. Such projects may deliver enormous benefits, but they also carry huge risk. On the other hand, we are all learning from lean and agile methodologies, while upstart tech startups show time and again that doing "just enough" is the most effective way to make a business successful.

    BPM can be made overcomplicated, and projects can be large, but in my experience that is often due to attempting to model existing bloated business operations rather than having BPM actually help change the existing mess. Of course enterprise-wide, end-to-end business processes are large, but it is rare for any company to undertake that kind of change. So, back in the real world, BPM is typically departmental. As such, it should focus on keeping requirements and development small, reducing waste in the process overseen by a limited number of business owners, and cleaning up the handovers with other departments and the interactions with customers.

    Short story: business process improvement should be as large as it needs to be to solve some immediate problems, and as small as it can possibly be to avoid creating new ones.
  • Accepted Answer

    Thursday, November 07 2013, 10:50 AM - #Permalink
    0 votes
    ... why do you need to measure the size of a process? How would you use this metric/indicator/information further?
  • Accepted Answer

    Thursday, November 07 2013, 11:02 AM - #Permalink
    0 votes
    I am not sure size alone necessarily influences the scale of benefits. Typical web-based HR modules (expenses, holiday applications, a compassionate-leave system) will each have fewer than 20 tasks and associated links: all small, but each delivering low overhead and the transparency to improve the efficiency of both HR and the employees. At the other end, a purchase-order system, a mission-critical healthcare module, or a means-testing application will have around 50 task types, which I suggest should be the maximum in any single process. Beyond that, it is best to split out any supporting processes, such as enrolments and payment scheduling. A CRM will include all such processes; in total there may well be 2,500 tasks split into, say, 75 separate processes. Collectively the effectiveness will no doubt be significant, but achieving it requires that clear, small, modular focus of around 50 tasks, all connected to deliver on a bigger project as required. The size may be determined by the number of users working to achieve particular outcomes. All of course built with users' input, and no coders!
  • Accepted Answer

    Thursday, November 07 2013, 02:08 PM - #Permalink
    0 votes
    The underlying question here is what to tackle first: the huge, high risk/high reward project; or the smaller, average risk/reward project? As always, the answer is: get an easy win if one is available. Establish an account rich with credibility and experience against which you will draw when your large project slows or stumbles, as they all do.
  • Accepted Answer

    Friday, November 08 2013, 05:10 AM - #Permalink
    0 votes
    A process is as long as you can/want/should/must manage it from the point of view of the BPM discipline: model, automate, execute, control, measure and optimise. As Derek said, the problem is that a process will be segmented as it spans various boundaries. Process patterns, functional silos, organisations, enterprises, IT systems, etc. are those boundaries. How do we assemble all the segments into one "virtual" process? I think this issue should be addressed by #entarch. Thanks, AS
  • Accepted Answer

    Sunday, November 10 2013, 09:42 AM - #Permalink
    0 votes
    I think the variation in the responses is indicative of the lack of a common framework for describing a process, which IMO is itself a symptom of the regrettable lack of formalism and discipline in the practice of process modeling. At the risk of adding more chaos to the mix, here is my take.

    The "right-sizing" of something should have some kind of external, objectifying framework by which the quality of "right-size" can be measured. If a process is long and complex but yields a reasonably consistent, valued set of outcomes, then how is it not "right-sized" for the process owner or the process customer? Here the choice of framework determines the spectrum of results, from good to bad, based almost entirely on how the framework defines them. In other words, "right-sized" is whatever my methodology says it is, and we can argue forever amongst ourselves as to whose framework for right-sizing is more right! For what it is worth, I like a framework that includes attention to the fulfillment of associated goals, to conformance with governing architectures, and to the measurement of efficiency. Since positioning one framework's value over another is what drives the business model for practitioners and vendors alike, I think there is little to no hope of consensus here.

    The "right-sizing" of something should also have some kind of internal, objectifying framework by which the quality of "right-size" can be measured. And here there is some hope for standardization. In software design, we have principles of coupling and cohesion that are agnostic with respect to design methodologies and techniques, possibly because they deal with underlying "meta" concepts in software design that can manifest in all methodologies and techniques. Together, these concepts define the "granularity" of the design construct, and there are observable implications among different levels of granularity, in software design and in process design.

    While the explicit incorporation of these two concepts may sometimes be lacking, the implicit incorporation is often there, because they are simply too fundamental to the art and science of software design; they sit in the background of the designer's thinking. Process design could learn a lot from this, and we should be able to accumulate data about designs that provides an empirical basis for design parameters, much the same way that software design metrics (think function points) are used to estimate work on software systems. There HAS been some work in this area (including my own), but the process modeling community needs to step up and accept that what it does is something serious and meaningful, worthy of and in need of formalism, and not just drawing pictures with boxes and arrows. My young daughter can do that much; what we do should be more than that.
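    The coupling/cohesion idea carries over quite literally if a process is modeled as a directed graph of hand-offs between activities. The sketch below is hypothetical: the toy process, its partition into fragments, and the two ratio-style metrics are illustrative stand-ins for the classic software-design measures, not a standard BPM metric.

    ```python
    # Illustrative sketch: coupling and cohesion for process fragments,
    # modeled as a directed graph of activity hand-offs. The process and
    # its partition into fragments are hypothetical.

    EDGES = [  # (from_activity, to_activity) hand-offs in a toy process
        ("receive", "validate"), ("validate", "approve"),
        ("approve", "pick"), ("pick", "pack"), ("pack", "ship"),
    ]
    FRAGMENTS = {            # one candidate "sizing" of the process
        "intake":     {"receive", "validate", "approve"},
        "fulfilment": {"pick", "pack", "ship"},
    }

    def cohesion(fragment):
        """Share of possible internal hand-offs that actually exist."""
        internal = sum(1 for a, b in EDGES if a in fragment and b in fragment)
        possible = len(fragment) * (len(fragment) - 1)  # ordered pairs
        return internal / possible if possible else 1.0

    def coupling(frag_a, frag_b):
        """Count of hand-offs crossing the boundary between two fragments."""
        return sum(1 for a, b in EDGES
                   if (a in frag_a and b in frag_b) or (a in frag_b and b in frag_a))

    for name, frag in FRAGMENTS.items():
        print(name, round(cohesion(frag), 2))
    print("coupling:", coupling(FRAGMENTS["intake"], FRAGMENTS["fulfilment"]))
    ```

    Comparing alternative partitions of the same graph (high cohesion within fragments, low coupling between them) gives exactly the kind of methodology-agnostic, empirical handle on "granularity" the reply argues for.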
  • Accepted Answer

    Monday, November 11 2013, 02:13 AM - #Permalink
    0 votes
    The size of a process... hmm. This gets into fractal theory. What is the right size of chocolate, or a pile of leaves, or a Christmas tree? My definition of process is simple: a process is what works. If it doesn't work, it is not a process. Additionally, it may be working on a macro scale, a micro scale, or a medium scale. As in fractal theory, all the scales can be operating at once. This is more of a parallel, analog question than a yes-or-no one. So first: is it working? Second: can we peel back the layers and figure out which layers are working better than others? If so, we can start to optimize the layers.
  • Accepted Answer

    Monday, November 11 2013, 05:12 PM - #Permalink
    0 votes
    An interesting angle on fractals. But I'm not quite prepared to accept that the granularity of a process does not vary at or across levels, which for me means that granularity is not fractal in nature. However, I am prepared to accept that the evaluation of a process at one level is essentially similar at any level, which would be fractal in nature. More to the point, though, labels such as "fractal" are still too narrow, IMO.

    Consider a house. One could have a large house but a small family. Outsiders can argue whether they have more house than they need, but if it is valued by those living in it, why is this not "right-sized" (at least) for them? Same with the exterior or type of house. If it is a split-level in a neighborhood of ranch-style houses, it will look odd to those accustomed to more homogeneity in neighborhoods. Or the use of siding vs. brick. In all of these cases, the external view (evaluation) of the house really depends upon the user-defined context. Same with a process.

    Consider also the effect of externalities. If the houses in a neighborhood have a collective impact on the appraisal value of each house in that neighborhood, then a McMansion (a house oversized relative to the others) will drive up valuations that over the long term may lead to gentrification of the neighborhood. What about a segment of a process that is oversized relative to its peer segments: might surrounding process segments be similarly affected? Again, the answer turns on the user-defined context.

    However, whether a house is of sound design, with adequate support structures and superstructure, meets the relevant building codes, etc., are questions of the internal view (evaluation) of the house that do not have to turn on user-defined context. These are measurable, and certain design patterns prevail over time because they have demonstrable value in these regards. Same with a process.

    It is here that I wish we would spend more time getting our craft right-sized, rather than arguing the merits of one context over another. In all honesty, this is why much of the academic treatment of process in Europe and elsewhere in the world is simply more insightful than the marketing-driven obsessions we have here.
  • Accepted Answer

    Tuesday, November 12 2013, 03:41 AM - #Permalink
    0 votes
    I must say I am puzzled by the "fractal" debate. There is only one layer that matters, and that is the one reflecting all the business logic that supports people in achieving the required outcome. It is proven that there are fewer than 13 such objects, and with the required linking they will build any business process, simple or complex. The model is now the application, as you can see from this research: http://www.igi-global.com/chapter/object-model-development-engineering/78620 This is the real custom process, ready to be deployed using a declarative architecture. Doesn't that make "modelling" on its own a rather academic exercise?