This interesting picture, which I found in one of the Web albums under "Minneapolis pictures", shows both static and dynamic flows.
Continuing my previous posting on event flow and decoupling, I would like to discuss the issue of static vs. dynamic event flows.
I have already discussed the fact that event processing applications come in many types, and naturally the various types have their own properties.
There are applications whose nature is totally dynamic; one example is the dissemination of alerts about customers' activities in banking systems. There are many subscribers that can subscribe to multiple types of alerts and change their subscriptions from time to time. In this type of application, monitoring the event flow can be done for system management purposes, e.g. collecting statistics about patterns of use, tracing individual flows for exception handling, etc. However, there is no sense of a global event processing network, as there are many flow islands that are not related.
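To make the dynamic case concrete, here is a minimal Python sketch (the names and structure are invented for illustration, not taken from any product) of an alert dissemination service in which subscriptions change at runtime, so the only stable thing to observe is statistics about individual deliveries rather than a fixed network:

```python
from collections import defaultdict

class AlertDissemination:
    """Sketch of a dynamic-flow application: subscribers come and go and
    change their alert subscriptions at runtime, so there is no stable
    global flow to model."""

    def __init__(self):
        # alert type -> set of subscriber ids
        self.subscriptions = defaultdict(set)

    def subscribe(self, subscriber_id, alert_type):
        self.subscriptions[alert_type].add(subscriber_id)

    def unsubscribe(self, subscriber_id, alert_type):
        self.subscriptions[alert_type].discard(subscriber_id)

    def publish(self, alert_type, payload):
        # Each publication reaches whoever happens to be subscribed right now;
        # monitoring can collect per-delivery statistics, not a fixed topology.
        return [(sid, payload) for sid in self.subscriptions[alert_type]]

bank = AlertDissemination()
bank.subscribe("alice", "large-withdrawal")
bank.subscribe("bob", "large-withdrawal")
bank.unsubscribe("bob", "large-withdrawal")   # subscriptions change over time
print(bank.publish("large-withdrawal", {"account": "12345", "amount": 9000}))
```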
On the other hand, there are event processing applications in which the flows are relatively static: there is a relatively stable set of event processing agents with a relatively stable collection of relationships among them. Actually, many of the event processing applications I have encountered are of this type. Example: an event processing application that manages an auction. The flow here is fixed as long as the auction protocol does not change, thus the collection of event processing agents and their relationships is fixed. Of course, the run-time instances are still dynamic. This is similar to a database schema that may be relatively stable, while the data itself is dynamic.
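As an illustration, a static flow can be written down once, like a schema. The sketch below (agent and event names are invented for the example, not taken from any real auction system) declares a fixed set of agents and their relationships; only the bid events flowing through them at run time are dynamic:

```python
# A static event processing network for a hypothetical auction: the agents and
# the event types connecting them are declared once, like a database schema.
AUCTION_EPN = {
    "agents": {
        "BidValidator":   {"subscribes": ["BidSubmitted"],  "publishes": ["ValidBid"]},
        "HighBidTracker": {"subscribes": ["ValidBid"],      "publishes": ["NewHighBid"]},
        "AuctionCloser":  {"subscribes": ["AuctionTimeout"],"publishes": ["AuctionClosed"]},
        "WinnerNotifier": {"subscribes": ["AuctionClosed", "NewHighBid"],
                           "publishes": ["WinnerNotified"]},
    }
}

def downstream(epn, event_type):
    """Which agents consume a given event type - derivable because the flow is static."""
    return [name for name, spec in epn["agents"].items()
            if event_type in spec["subscribes"]]

print(downstream(AUCTION_EPN, "ValidBid"))   # ['HighBidTracker']
```

Because the network is declared explicitly, questions such as "which agents are affected by this event type?" can be answered by inspecting the model, without running the system.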
Flow modeling is helpful in the following aspects:
- "software engineering" aspect --- debugging, validation,analysis
- performance aspect --- enabling scale-out by semantic partitioning, a topic we are working on that I'll discuss in detail in one of the future postings (see the sketch after this list)
- management aspect --- provenance, tracing, monitoring
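On the performance point, here is a small hedged sketch of the general idea of semantic partitioning (the keys and names are invented for the example and do not describe our ongoing work): if the flow model shows that events interact only within a semantic context, here a single auction, they can be routed by that context to independent workers.

```python
# Route events to partitions by a semantic key (the auction id), so that all
# events of one auction land on the same worker and partitions never interact.
import hashlib

NUM_PARTITIONS = 4

def partition_for(event):
    key = event["auction_id"].encode()
    return int(hashlib.md5(key).hexdigest(), 16) % NUM_PARTITIONS

events = [
    {"type": "BidSubmitted", "auction_id": "A-17", "amount": 120},
    {"type": "BidSubmitted", "auction_id": "B-42", "amount": 75},
    {"type": "BidSubmitted", "auction_id": "A-17", "amount": 130},
]
for e in events:
    print(partition_for(e), e["type"], e["auction_id"])
```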
There are more questions that need discussion about dynamic updates to the event processing network, and I'll discuss them in the near future -- more later.
This is a simulation of an anesthesia workstation; it can simulate various cases that create a flow of events inside this configuration, e.g. what happens when there is a power failure.
I was recently asked whether there is a contradiction between two concepts:
- The decoupling concept: each event processing agent is independent; it subscribes to some events, publishes derived events, and is independent of any other agent; furthermore, it is decoupled and does not know anything about the other agents.
- The Event Flow concept, in which there is explicit modeling of event flows
My answer is that there is not really a contradiction, since these two principles apply at two different levels. The decoupling is at the execution level: event processing agents indeed do not need to communicate with one another, since there is no RPC or any other synchronous communication among them. The event flow concept exists in the modeling and management layers. In the modeling layer, there should be a view of the entire "event processing network" to ensure that the orchestra is playing together; in the management layer, there should be a possibility to trace back the provenance of a certain decision or action, or trace forward the consequences of any event. However, this still does not require violating the decoupling in the execution layer; that's the beauty of model driven architecture... more - later.
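A minimal sketch may help show why the two levels do not clash (everything here is hypothetical, not any particular product): at the execution level the agents only talk to a broker and have no references to one another, while the management layer keeps a separate record of the flow that supports provenance and tracing.

```python
from collections import defaultdict

class Broker:
    def __init__(self):
        self.handlers = defaultdict(list)   # execution level: event type -> handlers
        self.lineage = []                    # management level: provenance records

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload, caused_by=None):
        self.lineage.append((caused_by, event_type))   # trace forward/backward later
        for handler in self.handlers[event_type]:
            handler(payload)

broker = Broker()

# Agent A derives an "Enriched" event; it holds no reference to agent B.
broker.subscribe("Raw", lambda p: broker.publish("Enriched", p, caused_by="Raw"))
# Agent B acts on "Enriched"; it holds no reference to agent A.
broker.subscribe("Enriched", lambda p: print("acting on", p))

broker.publish("Raw", {"value": 42})
print(broker.lineage)   # the management view of the flow, obtained without any RPC
```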
In my previous posting about the "quantum leap" question I mentioned that one of the major areas in which innovation is needed to advance the state of the art is usability. In current products the user interfaces go from the "implementation up", which means that the primitives presented to the developer are closely related to the implementation. There are some user interfaces that move towards the concept of the business user as developer; however, they still try to make it easy to work with the existing primitives that are based on the implementation. The contrary approach is to start from a model that is closely related to the business user's concepts and map it down to the implementation. The agility requirement is to give the business user the ability to author, modify and control the behavior of an event processing system, or even more generally, a multi-technology decision system. The engineering part here is to translate it into an implementation that maintains the desired quality of service. The modeling should start from the goal of what the system wishes to achieve -- decision, monitoring, prediction and some others (see my posting about types of event processing applications). James Taylor often writes about the MDE idea in the context of decisions; however, decision is just one type of application, and this principle applies to other types of functionality as well. This is a major challenge, and I'll say more about this direction in subsequent postings.
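To give a flavor of the "top down" direction, here is a hedged sketch: a business user authors a declarative rule in business vocabulary, and a (hypothetical) translator maps it to an executable agent. The rule format, field names and thresholds are invented for illustration only, not taken from any existing tool.

```python
# Compile a business-level rule into an executable event processing agent.
import operator

OPS = {">": operator.gt, "<": operator.lt, "=": operator.eq}

business_rule = {
    "when": {"event": "Withdrawal", "field": "amount", "op": ">", "value": 5000},
    "then": {"derive": "LargeWithdrawalAlert"},
}

def compile_rule(rule):
    cond, action = rule["when"], rule["then"]
    op = OPS[cond["op"]]
    def agent(event):
        # The business user never sees this level; it is generated from the model.
        if event["type"] == cond["event"] and op(event[cond["field"]], cond["value"]):
            return {"type": action["derive"], "source": event}
        return None
    return agent

agent = compile_rule(business_rule)
print(agent({"type": "Withdrawal", "amount": 9000}))   # derives the alert
print(agent({"type": "Withdrawal", "amount": 100}))    # None
```

The point of the sketch is only the direction of the mapping: the business user edits the declarative part, while the generated part is where the engineering for quality of service would live.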