- phase I -- simple event processing: determine which events are important, how they can be obtained, who should receive them, how they should be transmitted, etc. In some cases this is a major effort: teaching the customer to "think in events" and building the means to emit and consume events can by itself be a big investment, since it can affect running systems and require development of instrumentation, adapters, or periodic pull mechanisms. The type of "event processing" provided by basic EDA is typically routing (including subscriptions) and filtering. This phase should have a business benefit in its own right to justify the investment.
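The routing-plus-filtering functionality of phase I can be sketched very compactly. This is a minimal illustration, not any product's API -- the broker class, event shapes, and the sample subscription are all made up for the example:

```python
# Minimal sketch of phase I: a broker that routes events to subscribers,
# applying each subscriber's filter predicate. Hypothetical example only.

class EventBroker:
    def __init__(self):
        self.subscriptions = []  # (predicate, handler) pairs

    def subscribe(self, predicate, handler):
        """Register a handler for events matching a filter predicate."""
        self.subscriptions.append((predicate, handler))

    def publish(self, event):
        """Route the event to every subscriber whose filter accepts it."""
        for predicate, handler in self.subscriptions:
            if predicate(event):
                handler(event)

received = []
broker = EventBroker()
# A subscriber interested only in high-value orders
broker.subscribe(lambda e: e["type"] == "order" and e["amount"] > 100,
                 received.append)

broker.publish({"type": "order", "amount": 250})   # routed
broker.publish({"type": "order", "amount": 40})    # filtered out
broker.publish({"type": "heartbeat"})              # filtered out
```

Note that even this toy version captures the two operations mentioned above: subscription-based routing (who gets the event) and filtering (which events they get).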
- phase II -- mediated event processing: to bridge semantic and syntactic gaps between event producers and consumers, additional mediators provide ESB-type functionality: transformation (including split and aggregation), enrichment (from external sources), and validation. This is often needed, and since many of these features are fairly common, it may make sense to use COTS for it, although in simple cases hard-coding may be cost-effective enough. This phase is optional, and it may make sense to unify it with phase III.
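The three mediation functions named above -- validation, enrichment, transformation -- can be chained as a small pipeline. A hedged sketch, assuming invented field names and an in-memory dictionary standing in for the external enrichment source:

```python
# Sketch of phase II mediation between producer and consumer:
# validate -> enrich (from an "external" source) -> transform.
# All field names and data here are hypothetical.

CUSTOMER_DIRECTORY = {"c-17": {"name": "Acme Corp", "region": "EU"}}

def validate(event):
    """Reject malformed events before they reach consumers."""
    if "customer_id" not in event or event.get("amount", 0) <= 0:
        raise ValueError(f"invalid event: {event}")
    return event

def enrich(event):
    """Add reference data the producer does not carry."""
    event = dict(event)
    event.update(CUSTOMER_DIRECTORY.get(event["customer_id"], {}))
    return event

def transform(event):
    """Reshape to the consumer's expected syntax (e.g. cents -> units)."""
    return {"customer": event.get("name", event["customer_id"]),
            "region": event.get("region", "unknown"),
            "total": event["amount"] / 100}

def mediate(event):
    return transform(enrich(validate(event)))

out = mediate({"customer_id": "c-17", "amount": 2500})
# out == {"customer": "Acme Corp", "region": "EU", "total": 25.0}
```

Hard-coding a pipeline like this is exactly the "cost-effective enough" option for simple cases; a COTS mediator earns its keep when the number of producer/consumer pairs and formats grows.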
- phase III -- complex event processing: this is a step-function towards looking at "event clouds" instead of processing events one by one, and is typically built on top of phase I, possibly phases I + II. I have seen cases in which phase III was implemented as the first step and subsumed phases I and II -- this makes phase III much longer, but it makes sense if the customer has a business problem that requires phase III but had not really thought in events before.
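To make the step-function concrete: where phase I reacts to each event in isolation, phase III detects patterns across many events. A minimal sketch of one such pattern -- three failed logins by the same user within a time window -- with hypothetical event shapes and thresholds:

```python
# Sketch of phase III: a composite pattern over an "event cloud" --
# raise an alert when one user accumulates THRESHOLD login failures
# within a sliding WINDOW_SECONDS window. Illustration only.

from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 3

recent_failures = defaultdict(deque)  # user -> timestamps of failures
alerts = []

def on_event(event):
    """Consume one event; emit a complex event when the pattern matches."""
    if event["type"] != "login_failed":
        return
    q = recent_failures[event["user"]]
    q.append(event["ts"])
    # Slide the window: drop failures older than WINDOW_SECONDS
    while q and event["ts"] - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= THRESHOLD:
        alerts.append({"user": event["user"], "count": len(q)})
        q.clear()  # reset after raising the complex event

for ts in (0, 10, 20):
    on_event({"type": "login_failed", "user": "alice", "ts": ts})
on_event({"type": "login_failed", "user": "bob", "ts": 25})
# alerts == [{"user": "alice", "count": 3}]
```

No single event here is interesting on its own -- the business meaning emerges only from the combination, which is the essence of the phase III step-function.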
- phase IV (optional) -- this extends the functionality towards intelligent event processing, which I have already discussed in the past.
More about maturity models and event processing - later.