Thursday, August 30, 2007

Package Vendors: Free your functionality, free your processes

ebizq published a good whitepaper, written by Tom Dwyer.

Its key points conform to my point of view, which, in a nutshell, is "free your functionality, free your processes".
Sadly, Tom at this point only advises package vendors to select a BPM vendor and implement its technology, rather than to look further for an open-standard solution for process technology.
Back to basics.
In many current software packages (COTS) that you buy, you get great functionality and in some cases even automated process support (in the form of workflow and possibly straight-through processing capability).
We have come a long way from packages which stored their data in closed, proprietary persistence layers. Now, many vendors agree with the adage "Set your data free!". Many vendors have taken the step to become database neutral - i.e. you can put any major database brand underneath these packages - Oracle, SQL Server, DB2, MySQL, etc. Great news for technology consolidation - vendors finally started to understand that customers do not like to keep thousands of databases running on many different database technologies (why? think high license costs, required staff knowledge, lack of centralized monitoring, difficulties interfacing between databases, etc).
The next step they have started is publishing their data store structure or (better) implementing decoupled logical views, so that we can access data in the application directly through database calls or predefined APIs. A good step, but not enough...

The next step should be that vendors set their functionality and processes free.
:-) I can also see those nice pieces of functionality, and processes, sitting there, in their jails of closed code. Very sad indeed.
So free them? How? Like this...
Functionality should not be locked up, usable only from user interfaces or very limited APIs, but free, in the form of services. A number of vendors have been doing this, or are working hard on it now. It basically means opening up the business logic (and data) in the form of services - the SOA-fication of your functionality stack.
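As a sketch of what such SOA-fication could look like (all names here - PolicyService, create_policy - are hypothetical, not any vendor's real API): the package's business logic becomes callable by any consumer - a portal, a BPM engine, a batch job - instead of only its own screens.

```python
# Hypothetical sketch: package business logic exposed as a service
# instead of being reachable only through the vendor's own screens.

class PolicyService:
    """Service facade over the package's internal policy logic."""

    def __init__(self):
        self._policies = {}   # stands in for the package's data store
        self._next_id = 1

    def create_policy(self, customer_id: str, product: str) -> dict:
        """Create a policy; callable by any consumer, not just the package UI."""
        policy = {"id": self._next_id, "customer": customer_id, "product": product}
        self._policies[policy["id"]] = policy
        self._next_id += 1
        return policy

    def get_policy(self, policy_id: int) -> dict:
        return self._policies[policy_id]


service = PolicyService()
p = service.create_policy("C-42", "car-insurance")
print(service.get_policy(p["id"])["product"])   # -> car-insurance
```

In a real stack this facade would be published over a standard protocol (web services, in 2007 terms); the point is only that the logic is addressable from outside.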
And the same for processes. First from an implementation perspective: what I mean is being able to plug in ANY (standards-based) BPM engine. Away from proprietary process technology, towards standard BPEL engines from, for instance, IBM, BEA, Appian, etc.
Again, the business case is clear here: one BPM technology. In addition, the process technology many package vendors offer is very limited. No process modelling, but XML script hacking. No freely defined BAM functionality, only what they provide.
Second, from a logical perspective: either being able to start a process through a standard interface, or being able to simply reuse the process definition from the package, but place it as part of the total process. I still have thinking to do here, because how much influence do I want to give a package in determining part of my process view?

Where we want to go is this: if packages only support parts of the end-to-end processes in our companies, we do not want to interface. We want to create the overall process, and link it to services in all the apps we need for that process.

In the following diagram I show a typical example that you will often encounter:


From an IT perspective, this leads to extra technology and an extra maintenance burden.

From a business perspective, if we look at this process from a LEAN point of view, we see there is waste:
- Extra tasks that do not add value (starting B, the nightly batch, the rekey activity)
- Wasted time, waiting a night before we can continue to serve this customer

What we want is shown in the following diagram:

A key question that is hidden in Tom's article is this:
Can you, as a package vendor, still offer packages that:
- Are only task based (i.e. have no process awareness)?
- Offer limited proprietary process technology?
- And therefore result in difficult integration and suboptimal business and IT solutions?

I think not...

Sure, one last statement - we are not there yet. A recent IDC study also showed that BPM engines are mainly implemented on a project basis, not on an infrastructural basis. This worries me, because we as customers will end up with many, many pockets of BPM technology, each requiring licenses, support knowledge, etc. A great role for Enterprise Architecture to help prevent this... with some help from package vendors, that is...

Tuesday, August 28, 2007

Additional thoughts on BPM architecture

James Taylor of the EDM blog and Phil Ayres of Improving New Account Opening posted reactions to my earlier post on an architecture around BPM.

A number of thoughts:

First, James wrote: "liked his architecture except that I think the Decision Platform he identifies also needs to be able to support both the CEP and BAM components. After all, determining which process to trigger, which action to take or when to inform someone may be a non-trivial business decision and so require decision management. I also think that a decision platform needs both rules and predictive analytics, as you might expect, and that it needs to be available to all the various components that might need decisions (typically implemented through decision services perhaps as a decision service hub as Neil and I discussed in Smart (Enough) Systems)."

Yes - I agree on the coupling between CEP and the decision platform. Good point - knowing what to correlate when, and what process needs to be (re)started, could be a complex task. I can see scenarios where, for instance, correlated events might (or might not) start up a fraud research process.
It is actually interesting to see that many BPM platforms currently have no, or only limited, inter-process synchronisation. This could also be a great area where the CEP and decision services could help out. For instance: deciding that a certain running process should wait until a new event is handled by a new, separate process. Think of, for instance, updating a customer's address while at the same time running a claim process. You don't want to end up sending stuff to the old address (which just burned down!).
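A minimal sketch of that synchronisation idea, with all names assumed: a CEP-style coordinator suspends a running claim process while an address-change event is being handled, so nothing is sent to the stale address.

```python
# Sketch (assumed names): inter-process synchronisation driven by events.
# An address-change event pauses any claim process for the same customer
# until the update is done.

class Process:
    def __init__(self, name):
        self.name = name
        self.state = "running"

class Coordinator:
    def __init__(self):
        self.processes = {}

    def register(self, proc):
        self.processes[proc.name] = proc

    def on_event(self, event_type, customer):
        # correlation rule: an address change suspends the claim process
        # of the same customer; its completion resumes it
        claim = self.processes.get(f"claim:{customer}")
        if claim is None:
            return
        if event_type == "address_change_started":
            claim.state = "waiting"
        elif event_type == "address_change_done" and claim.state == "waiting":
            claim.state = "running"

coord = Coordinator()
claim = Process("claim:C-42")
coord.register(claim)
coord.on_event("address_change_started", "C-42")
print(claim.state)  # -> waiting
coord.on_event("address_change_done", "C-42")
print(claim.state)  # -> running
```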

On the link between CEP and BAM I am still a bit puzzled. Possibly one could see certain BAM data that points to a certain event, which needs a decision? Not sure. And maybe only useful in quite advanced BPM maturity environments (which, unfortunately, is also true for the notion of CEP and decision services - I am not seeing them yet in client implementations - which will in the future lead to additional maintenance...)

For me, a decision platform definitely contains BRM technology. What I am still thinking about is: what if a certain decision "service" actually needs some human knowledge? Does it mean that from the presentation layer we will get some kind of "decisions to be taken" inbox? Or will decision services actually lead to a normal task ("decide on XYZ and let the process know")?

Another thing I am still wondering about is the balance between putting decision logic in components and centralizing it all in a decision services hub. Sure, centralizing it sounds like the aesthetic architectural thing to do. But the amount of overhead is large. For every IF statement, a component would have to make a central call. And why? So that we can maintain it more easily? A trade-off is needed.
A presentation I saw from the Business Rules Platform in the Netherlands talked about this as well. They proposed a mixed model where a decision service hub/business rules hub could:
- Work as a central service
- Distribute the logic towards the components.
- Do both, but synchronize the logic
Interesting to see which model will emerge.
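The first two variants of that mixed model could be sketched like this (RuleHub and its methods are my own assumptions, not the presented platform): one definition of the logic, either answered centrally or handed out for local evaluation inside a component.

```python
# Sketch (assumed names): a rules hub supporting both the central-service
# model and the distributed model, from a single rule definition.

class RuleHub:
    def __init__(self):
        self._rules = {}

    def define(self, name, fn):
        self._rules[name] = fn

    def decide(self, name, **facts):
        """Central service: the component calls the hub for every decision."""
        return self._rules[name](**facts)

    def export(self, name):
        """Distribution: hand the rule to a component for local evaluation."""
        return self._rules[name]

hub = RuleHub()
hub.define("discount", lambda order_total: order_total > 1000)

# central call - one round-trip per decision
print(hub.decide("discount", order_total=1500))  # -> True

# distributed copy - evaluated locally, no round-trip
local_rule = hub.export("discount")
print(local_rule(order_total=500))               # -> False
```

The third variant (do both, but synchronize) would add versioning and push of changed rules to the holders of exported copies.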

Phil proposed "James suggests some enhancements to the architecture, largely to ensure that the Decision Platform can interact with the Complex Event Processing (CEP) and Business Activity Monitoring (BAM) components. I agree with his rationale, and would even take it a step further - every component in an architecture (that isn't pure technology) should be able to interact with every other, ensuring that really advanced business requirements can be considered, offering more and more business value."

Hm. Not sure that I agree. Sure, interoperability is a great thing. And yes, in my picture, although not shown, most components will be able to contact other services.
However, what we should not forget is, again, the notion of coupling. Providing an integration platform does allow for flexible communication between components (or call them services). However, it does not take away the notion of coupling. When I, as service A, am using service B, I do know about it! I have knowledge of the provided functionality and data, so that I can use it and add value to it for a certain requirement. Knowing about it is nice, because then I can make use of it. But it also leads to dependencies - changing B will need careful consideration.
The more connections, the more "services governance" you need.
One cannot predict the business requirements in advance, so true: you would want an architecture that supports easy connectivity. But the objective of my architecture drawing was also maintainability and manageability. Based on an (okay, not disclosed :-) set of requirements, you would want to find the architectural coupling that on the one hand supports the requirements and provides flexibility towards change cases, but on the other hand limits the coupling/knowledge. I wanted to create an architecture where most components know each other on a lean, need-to-know basis.

Gentlemen, thanks for your feedback and reactions welcome!

Sunday, August 26, 2007

BPM Suite as a component in a logical architecture

In a number of assignments in the past, I was involved in defining a high level logical component architecture for organizations handling administrative processes. My thinking around this continuously develops (learning from mistakes ;-), so I decided to take a snapshot.
I am always interested in additional views... and am still puzzling over the roles of each component (for instance: who will have the knowledge to show a certain task screen, interfacing with the core admin platform, based on a certain task from a certain process? And what will trigger the output management platform - an event or a service call from the BPM engine?)

Here is a diagram of the logical architecture:
Three notes:
- Components are rectangles
- Arrows show dependence (i.e. knowledge of), not information flow.
- All arrows would use the integration component (call it ESB or SOA grid or message broker, or someone picking up the phone...)

A breakdown, by component:

Input management - component (consisting of people, technology, processes) responsible for receiving messages from outside the scope of this system: external parties, such as clients, other service providers, etc. This would be multi-channel, i.e. able to handle incoming messages via regular mail, phone, email, internet, etc.

CEP - Complex Event Processing platform - component that is able to detect and interpret an incoming message (or other internal event) and, using business rules, translate it into an action. Most times this means either triggering the BPM engine to start a new process or, in the event of a correlation, restarting/continuing an existing process. Of course, the CEP platform has its own datastore for events (logging) and provides services to ask what happened when, and why it resulted in a certain action.
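As an illustration of that correlation rule (names and structure are my own assumptions): an incoming message either starts a new process or continues the running one it correlates with, and every event is logged in the CEP's own store.

```python
# Sketch (assumed names): the CEP component's core decision - start a new
# process, or continue an existing one when the message correlates with it.

event_log = []   # CEP keeps its own event store for "what happened when"
running = {}     # correlation key -> process state

def handle_message(kind, correlation_key):
    event_log.append((kind, correlation_key))
    if correlation_key in running:
        # correlation hit: continue the already-running process
        return f"continue:{correlation_key}"
    running[correlation_key] = "started"
    return f"start:{correlation_key}"

print(handle_message("claim_form", "claim-7"))   # -> start:claim-7
print(handle_message("photo", "claim-7"))        # -> continue:claim-7
print(len(event_log))                            # -> 2
```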
ECM Solution - used for storing all non-structured data, such as emails, documents, faxes, etc. It might also be used to store structured data (incoming XML), to have a complete overview of all incoming messages (incoming records management). It of course contains all index/metadata, and provides services to look up, view and alter content.
BPM Engine - used for process coordination. Based on process templates, it can execute steps in a process, including logistic decisions (ifs, supported by the decision or BRE platform). It can handle automated tasks (e.g. calling services) and manual tasks (assigning a task to a certain user/user group), using information from the workforce management system. It records various process execution statistics for business activity monitoring.

BAM Solution - used for process monitoring. Is able to analyse data in the BPM engine, and present it. Could have all types of additional services, around KPI calculation, SLA monitoring and alerts.
CRM system - used for customer information. Gives a complete oversight of the customer, in terms of profile, current products, processes/services running and done. Decoupled of core administration system (since you could have multiple core admin systems, but only one customer!).
Workforce system - used to supply the BPM engine with the appropriate information to send tasks to the right user or user group (via pull or push). This could range from simple static user/user group assignment to complex rule-driven skill/availability/workload-driven assignment.
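The two ends of that range could be sketched as follows (user data and the rule are invented for illustration): static group routing versus a skill- and workload-driven rule.

```python
# Sketch (invented data and rule): static vs. rule-driven task assignment.

users = [
    {"name": "anna", "skills": {"claims"}, "workload": 3},
    {"name": "bob",  "skills": {"claims", "fraud"}, "workload": 1},
]

def assign_static(task_type):
    # static assignment: every task of a type goes to a fixed group
    return {"claims": "claims-team"}.get(task_type, "general-team")

def assign_by_rules(task_type):
    # rule-driven assignment: least-loaded user holding the required skill
    capable = [u for u in users if task_type in u["skills"]]
    return min(capable, key=lambda u: u["workload"])["name"]

print(assign_static("claims"))    # -> claims-team
print(assign_by_rules("fraud"))   # -> bob
```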
Core administration platform - this would be the core system for supporting the business of this organization. For instance: account administration for banking, insurance handling, claim handling, stock market transaction system, etc. The system allows functionality and data to be accessed through services, or through context driven screens (triggered by a certain task in a process).
PLM Platform (product life cycle management, or product configurator). A platform which allows for quick adaption of products, supported by the core platform. It supports definition of products, product components, data definitions, rules and calculations.
The Decision Platform has the responsibility to make decisions (or advise on them), based on data and rules. This could be fully automated, or a more manually supporting service (think traffic-light signalling, for instance, frequently done in credit scoring, where an employee can still decide to accept a loan request even though the signal is orange or red).
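A hedged sketch of that traffic-light pattern (thresholds and field names are pure assumptions): the platform advises green/orange/red, and on orange or red an employee may still overrule.

```python
# Sketch (assumed thresholds): a decision service that advises rather
# than decides, in the traffic-light style used in credit scoring.

def credit_signal(income: int, requested_loan: int) -> str:
    ratio = requested_loan / max(income, 1)
    if ratio <= 2:
        return "green"    # auto-accept possible
    if ratio <= 4:
        return "orange"   # advise caution; employee decides
    return "red"          # advise reject; employee may still overrule

def decide(income, requested_loan, employee_override=None):
    signal = credit_signal(income, requested_loan)
    if signal == "green":
        return "accepted"
    # orange/red: the decision service only advises; a human concludes
    return employee_override or "referred"

print(credit_signal(30000, 150000))                          # -> red
print(decide(30000, 150000, employee_override="accepted"))   # -> accepted
```

This also shows one answer to the "human knowledge" question above: the non-green path becomes a normal task for an employee.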
Security platform - users, roles, etc. Should be implemented centrally; otherwise you end up with various security platforms (usually as part of the other components), with less strength and much more synchronisation effort.
Presentation technology - this component is able to enable various users to interact with the components, and view data. Think of the regular inbox and task windows, document viewing, but also event monitoring, BAM and system admin.
Output management - this component is responsible for communication outward. Based on events (or a called service from the BPM engine, not sure yet!), it knows, using output rules, what to communicate, to whom, and through what channel (paper, email, website, etc). It is able to format a message (email, PDF, Word file, etc) and send it directly or to a printing facility. Of course, output is also sent to the ECM solution for storage.
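A small sketch of the output-rules idea (channels, formats and the rule itself are assumed): pick channel and format per message, and always archive a copy in the ECM store.

```python
# Sketch (assumed rule): output management picks channel and format,
# and every outgoing message is archived in the ECM solution.

archive = []  # stands in for the ECM store (outgoing records management)

def send(message, customer_pref):
    # output rule: honour the customer's channel preference, default to paper
    channel = customer_pref if customer_pref in {"email", "paper", "website"} else "paper"
    fmt = "pdf" if channel in {"email", "website"} else "letter"
    archive.append((message, channel, fmt))
    return channel, fmt

print(send("policy renewal", "email"))  # -> ('email', 'pdf')
print(send("policy renewal", "fax"))    # -> ('paper', 'letter')
print(len(archive))                     # -> 2
```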

Sunday, August 05, 2007

A process model is not enough...

A number of years ago, I worked for a system integrator, with a manager who was, well, a bit limited in his overview of IT best practices.
One day he came up with a great idea. Having recently visited a seminar, he had learned that UML was the answer to everything. And he had a solution to the work floor's problem: unclear requirements and an expensive process to define them. His solution: use case diagrams. His expectation: we could draw these in an hour and voila - we had our requirements - quick, clear and cheap!
It took a while to make him understand that a picture with some actors, circles and words was a great way to structure requirements, but that a use case diagram would by no means cover the requirements in enough detail. So, reluctantly, he allowed us to continue to analyse and write good software requirement specs... :-)

We seem to be making the same mistake now with process models. In a number of situations, a business person delivered a set of process models and said: "well, here are our processes, have fun analyzing them and supporting them with good IT solutions".


So for once and for all:
When people need to communicate about process - a lot more information is needed.
My checklist:
- The goal of the process, how does the goal relate to the strategic intent of the company
- The criticality of this process
- Compliance, legal and other regulation requirements for this process
- The involved stakeholders (internal, external)
- The power structure/organization structure of these stakeholders
- The trigger that starts the process + through what channels
- The process owner
- The place of this process in the process architecture
- Input to the process
- Output of the process (including exceptions)
- The central object of the process (document, case, transaction, claim, client request, etc) that flows through the steps
- Events during the process, and typical + exceptional sequences and timings
- Data involved through the process
> Logistics (who does what when)
> Execution (data needed to perform task, data produced by task)
> Management (measures)
- Business rules for process:
> Decisions on flow (what task now.... if XXX then do YYY)
> Decisions on assignment (task XXX will be done by YYYY, except if AAAA, then performed by BBBB)
- The critical decisions, rules or patterns associated with it, and required authority to take them
- Success - when is execution of this process considered a success? What performance criteria?
- CSFs - what is needed to reach this performance? And KPIs
- History of the process - when was it created, how has it performed, when and why has it broken down, what are typical issues, when and why performed it exceptionally well

Well, that's a lot more than a process model. It's the process context that BPM specialists need.

Starting with BPM-Suites - but what to do with presentation layer

It was nice to see some of my remarks in my previous post (BPM and packages) confirmed in a good state-of-the-BPMS-market item:
"The true upside opportunity for BPM is to evolve into a platform that supports rapid application development, change, and integration with a visual model-centric paradigm that represents a clear advantage over previous application development approaches."

I.e. a BPM-Suite currently is nothing more than a new custom development tool...

I want to address another limitation, that causes confusion.
Many BPM-Suite vendors proudly say: well, use our BPM-Suite, model your processes, link to your services, and voila, you have your SOA-enabled, process-aware WFM portal... And, even better: you reuse your legacy.

I doubt it.
On a recent project, a client wanted to use a BPM solution integrated with their current core application. The BPM vendor comes in and says: shield your application by wrapping services around it.

What this means: about 65 screens that users are used to, and that work fine, should be replaced by 1. services, and 2. well, screens again, but now in a portal component of the BPM-Suite.

Added value? Good question...
- Rebuilding screens
- Rebuilding screen-based business rules/validations

I am a bit puzzled. How can we deliver the power of BPM technology for human-centric workflow in the reality of existing applications?

On my wish list: a BPM vendor that comes in, and says...
Oh sure, we have a tool that analyzes all screens in your application, analyzes all user interactions over the last X years, and then generates a portal with optimized screens. With some minimal work, you can further optimize these screens where needed. Oh, and the generated screens use common technology X, conforming to standards Y.

Then... I would see BPM as a new tool that can take over various packages and custom-built apps, and integrate them from a process perspective.

Wednesday, August 01, 2007

Key challenge for BPM-technology: integration with packages

I have now seen two major projects that decided not to go for a separate BPM technology layer.
Both had done a large package selection (both in the insurance market). And it turned out (surprise!) that these packages had their own simple but workable workflow/orchestration technology component.

I participated in both projects during the feasibility studies, aiming to analyse the possibility of integrating the package with the company's target BPM platform.

And in both situations, we had to conclude: yes it is possible, but it's complex, risky and leads to a lot of extra effort.

Hm. This is not the way that BPM Technology is going to be the next killer app....

Some facts
1. These package vendors build their own "BPM technology" type components, with deep integration into their package. Makes sense from a vendor perspective (for the short run), but it is a nightmare for the integration and enterprise architecture goals of typical companies (which would like to limit the different types of technology used for process coordination).
However, their defense is also: there is no consensus on what BPM technology should provide and what standards it should follow. The result, of course, is a proprietary process language and service integration platform.
As a result, it is very difficult to buy the package but replace the BPM component with your own central platform.
So we end up with yet another process component, and extensive integration (back to EAI) with the other software that needs to play a role during the end-to-end processes (so we lose overall process views and coordination).

2. User interface issues around packages and the human workflow components of your BPM platform are another pain. Typical requirements are:
- An inbox for tasks
- The ability to open, claim, pause, intermediately save and finish a task
- Context-aware task windows (e.g. after claiming a task, open the right dialogs and windows in the package)
- Integration with document management, showing the related documents for a task.
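A minimal sketch of the task lifecycle behind the second requirement (the states and transitions here are my own assumptions, not a standard):

```python
# Sketch (assumed states/transitions): the open/claim/pause/save/finish
# task lifecycle as a small state machine.

ALLOWED = {
    "open":    {"claim"},
    "claimed": {"pause", "save", "finish"},
    "paused":  {"claim"},
    "saved":   {"claim", "finish"},
    "done":    set(),
}

TARGET = {"claim": "claimed", "pause": "paused", "save": "saved", "finish": "done"}

class Task:
    def __init__(self):
        self.state = "open"

    def apply(self, action):
        if action not in ALLOWED[self.state]:
            raise ValueError(f"cannot {action} from {self.state}")
        self.state = TARGET[action]
        return self.state

t = Task()
t.apply("claim")
t.apply("pause")
t.apply("claim")
print(t.apply("finish"))  # -> done
```

Exactly this kind of lifecycle is what every vendor reinvents in a slightly different way, which is the standardization gap discussed below.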

As there are no clear standards and answers in these areas, clients have the choice of letting the package do this (usually speedy, but limited and proprietary) or trying to find a solution with BPM technology and some presentation layer, integrating BPM, the package and the ECM solution.
Unfortunately, the latter is for the most part hard coding, deeply coupling the web client, server and various other technology layers.
So we end up with, again, a package solution and a proprietary presentation layer, including the WFM user interface.

Question - Can we solve this?

Well - there will always be package vendors that provide their package with a proprietary BPM component, just to be able to support client requirements and one-stop shopping, and to avoid being dependent on third-party technology.

But I think that from a customer perspective (mainly technology), customers and their IT architects can require packages to be more open to integration with third-party BPM tools. Definitely good advice.

And most vendors might not have an issue either - it is sometimes better to focus on the package functionality and leave generic functionality to others (see what happened to identity management/LDAP, databases, app servers, messaging). BPM technology is the next candidate for commodity infrastructural software.

However, the BPM vendors need to do something too: standardize even more. Sure, the WfMC model is a start, and some standards are there. But solve the above two issues (with minimal development required) and customers can force package vendors to open up to process (as a next step, after opening up for data and services).

And then BPM technology has a chance to go more mainstream. Otherwise we will always view BPM technology as yet another 4GL, in which we can develop visually (but we still call it development...)