
Today I am on a short (two-day) vacation for the Purim holiday... Purim is a fun holiday for children, in which they wear costumes; here is my daughter Hadas, posing as a hippie (something she knows only from old movies). The fun takes me to a recent blog posting by Hans Gilde, who wrote about two false statements and a fun fact about CEP. So I decided to have my own version, talking about three misconceptions. Not really new stuff; I have written about all of it before, but here it is in one posting. I'll concentrate on three misconceptions: the elephant's view misconception, the complex-about-complex misconception, and the maturity misconception.


David Luckham coined the term "complex event processing"; he meant the processing of complex events, where a complex event is an abstraction or aggregation of other events that creates a higher-level event. However, the ambiguity of the phrase led other people to interpret it as "complex processing of events", and somehow it became a name that various people use to denote ANY processing of events. There is even one person who has declared a full-fledged cyber war on anybody who uses the term "complex event processing" outside what he believes to be the original intent of a past DARPA project in the area of security and military operational applications.

My view is that complexity has various dimensions: in some cases the complexity is in the event structure; in other cases it stems from the uncertain nature of the application (e.g., in fraud detection, the patterns being looked for are a constantly moving target); another type of complexity lies in the patterns themselves that need to be supported; and in some cases the complexity is not in the functionality but in non-functional requirements such as hard real-time constraints on latency, handling high throughput, the need for deterministic performance, and so on. For example: network and system management aim to find the root cause of events considered as "symptoms". The complexity is in attributing a collection of symptoms to actual problems, and in the fact that the space of symptoms may be incomplete, so stochastic models are needed; however, the pattern language needed is rather simple. On the other hand, in regulation enforcement, the patterns may be complex, but they are given, there is no uncertainty aspect involved, and stochastic models would be of little use (they are used in risk assessment, but that is another opera).
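To make the distinction concrete, here is a minimal sketch (in Python, with hypothetical names and a made-up "burst of symptoms" pattern) of a complex event in the sense Luckham intended: a higher-level event derived by aggregating lower-level ones. Note how simple the pattern itself is; as argued above, in applications like network and system management the real complexity lies elsewhere, not in the pattern language.

```python
# A sketch of a "complex event": an abstraction aggregating lower-level events.
# All names here are illustrative, not taken from any particular CEP product.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional


@dataclass
class Event:
    kind: str             # e.g. "symptom"
    source: str           # originating component
    timestamp: datetime


@dataclass
class ComplexEvent:
    kind: str             # the higher-level abstraction, e.g. "symptom_burst"
    members: List[Event]  # the lower-level events it aggregates


def detect_burst(events: List[Event], kind: str, source: str,
                 threshold: int, window: timedelta) -> Optional[ComplexEvent]:
    """Derive a higher-level event when `threshold` events of the same kind
    and source fall inside one time window -- a deliberately simple pattern."""
    matching = sorted((e for e in events if e.kind == kind and e.source == source),
                      key=lambda e: e.timestamp)
    for i in range(len(matching) - threshold + 1):
        group = matching[i:i + threshold]
        if group[-1].timestamp - group[0].timestamp <= window:
            return ComplexEvent(kind=f"{kind}_burst", members=group)
    return None


if __name__ == "__main__":
    now = datetime.now()
    symptoms = [Event("symptom", "router-7", now + timedelta(seconds=s)) for s in (0, 5, 9)]
    derived = detect_burst(symptoms, "symptom", "router-7",
                           threshold=3, window=timedelta(seconds=30))
    print(derived.kind if derived else "no complex event derived")
```

The pattern here is trivial to state; what makes such applications hard in practice (attributing symptoms to problems when the symptom space is incomplete) would sit outside this function, in the stochastic model.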

In the insect life cycle (see the picture above), moving from the young-larva phase to the mature-larva phase is progress, but the way to the adult insect is still (relatively) long. I always compare the state of event processing today to the state of databases in the hippie era, the late 1960s. Yes, there were some databases that worked, but the notions of concurrency control, transactions, distributed databases, and query optimization, to name a few, were not there; even the formal theory of the relational model did not exist yet. You can draw a direct analogy to what is missing in the event processing state of the practice.
As the fun fact, Hans Gilde talks about constructing the event processing discipline as a cohesive field, as the result of the six EPTS working groups that are currently active. It is indeed fun work, but also a huge challenge. It will take time (Rome was not built in one day either...). We need the best and the brightest to help with it, and I hope that knowledgeable people like Hans will also join and help in meeting this challenge. More - Later.