Showing posts with label maturity.

Wednesday, March 11, 2009

On misconceptions and fun in event processing


Today I am on a short (two-day) vacation, due to the Purim holiday... Purim is a fun holiday in which children wear costumes; here is my daughter Hadas, posing as a hippie (something she knows only from old movies). The fun spirit takes me to a recent blog posting of Hans Gilde, who wrote about two false statements and a fun fact about CEP. So I decided to have my own version, talking about three misconceptions. Not really new stuff -- I have written about it before -- but here I am putting it all in one posting. I'll concentrate on three misconceptions: the elephant's view misconception, the complex about complex misconception, and the maturity misconception.

I have used this picture before: several blind people touch various parts of an elephant and reach different conclusions about what an elephant is. The first misconception is that there is a single reason, a single industry, a single killer application, and a single view on what event processing is. Some people strongly identify event processing with certain trading applications in capital markets; the financial markets industry has indeed been an early adopter, but other industries (gaming, chemical and petroleum, travel and transportation, retail, healthcare, insurance and more) are getting there, and we see a wide range of industries and applications, bringing a variety of requirements. I don't foresee that in the future there will be any single dominant industry. Some people think that the only ROI for event processing is the support of high throughput, but most EP applications today don't really require high throughput support; in some cases the high throughput support is indeed the ROI, while in others it is the lower TCO that comes from using higher-level abstractions or from the use of adapters. Other voices say that the only important requirements are those relevant to certain security applications. Well -- this is true for security consultants who make their living from these applications, but not in general. We should strive to have a bird's-eye, holistic view of the entire elephant.

David Luckham coined the term "complex event processing"; he meant the processing of complex events, where a complex event is an abstraction or aggregation of other events that creates a higher-level event. However, the ambiguity of this phrase led other people to interpret it as "complex processing of events", and somehow it became a name that various people use to denote ANY processing of events. There is even one person who has declared a full-fledged cyber war on anybody who uses the term "complex event processing" outside what he believes to be the original intent of a past DARPA project in the area of security and military operational applications. My view is that complexity has various dimensions: in some cases the complexity is in the event structure; in other cases it stems from the uncertain nature of the application (e.g., in fraud detection the patterns being looked for are a constantly moving target); another type of complexity is in the patterns themselves that need to be supported; and in some cases the complexity is not in the functionality but in non-functional requirements such as hard real-time constraints on latency, handling high throughput, the need for deterministic performance, etc. For example, network and system management applications are intended to find the root cause of events considered as "symptoms". The complexity is in attributing a collection of symptoms to actual problems, and in the fact that the space of symptoms may be incomplete, thus stochastic models are needed; however, the pattern language needed is rather simple. On the other hand, in regulation enforcement the patterns may be complex, but they are given, there is no uncertainty aspect involved, and stochastic models would be of little use (they are used in risk assessment, but this is another opera).
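To make the original sense of "complex event" concrete, here is a minimal sketch in Python of a pattern that aggregates several low-level events into one higher-level event. It is an illustration only, not any particular product's API, and the event names (login_failed, suspected_intrusion) and thresholds are invented for the example.

```python
from dataclasses import dataclass
from collections import deque

# Hypothetical event types for illustration only: low-level "symptom" events
# are aggregated into one higher-level (complex) event.
@dataclass
class Event:
    kind: str        # e.g. "login_failed"
    source: str      # e.g. a user id or host name
    timestamp: float

@dataclass
class ComplexEvent:
    kind: str
    members: list    # the lower-level events this event abstracts

def detect_repeated_failures(events, threshold=3, window=60.0):
    """Emit a higher-level event when `threshold` failures from the same
    source occur within `window` seconds -- a simple aggregation pattern."""
    recent = {}  # source -> deque of recent failure events
    for ev in events:
        if ev.kind != "login_failed":
            continue
        q = recent.setdefault(ev.source, deque())
        q.append(ev)
        # drop events that fell out of the time window
        while q and ev.timestamp - q[0].timestamp > window:
            q.popleft()
        if len(q) >= threshold:
            yield ComplexEvent("suspected_intrusion", list(q))
            q.clear()

# Usage: three failures from the same source within a minute
# produce one "suspected_intrusion" complex event.
stream = [Event("login_failed", "host-7", t) for t in (0.0, 10.0, 20.0)]
for ce in detect_repeated_failures(stream):
    print(ce.kind, len(ce.members))
```
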
The third misconception is the maturity one. Some vendors claim that they have mature technology, meaning that their product is stable and is being used by customers without substantial problems. That does not mean that the event processing discipline itself is mature.
In the insect life cycle (see the picture above) it moves from the phase of young larva to the phase of mature larva, but the way to an adult insect is still (relatively) long. I always compare the state of event processing today to the state of databases in the hippie era -- the late 1960s. Yes, there were some databases that worked, but the notions of concurrency control, transactions, distributed databases, and query optimization, to name a few, were not there; even the formal theory of the relational model did not yet exist. You can draw a direct analogy to what is missing in the event processing state of the practice.

As for the fun fact, Hans Gilde talks about constructing the event processing discipline as a cohesive field, as the result of the six EPTS working groups which are currently active. It is indeed fun work, but also a huge challenge. It will take time (Rome was not built in one day either...). We need the best and the brightest to help in that, and I hope that knowledgeable people like Hans will also join and help in meeting this challenge. More - Later.

Saturday, November 1, 2008

On the head and the tail of EDA


Somehow the discussion around "EDA" and "CEP" continues in the blogosphere. Giles Nelson from Apama has published seven points, of which I quite agree with the first five. Jack van Hoof, who started this whole thread of discussion, argues that "CEP is not the beginning but finishing of EDA". So what is the answer? Head or tail? The right answer is -- it depends. Some customers work in a methodical architectural way, in which the EDA architecture is set up first and CEP tools are then invoked on top of this architecture -- this is the case that Jack is talking about. However, in other cases customers don't apply architectural thinking, but just acquire some application that is implemented with a CEP tool, and thus it introduces a kind of ad-hoc EDA for a specific application -- this is the case that Giles is talking about.

My guess is that most early adopters have applied CEP in an ad-hoc way, so it serves as a "head" for more EDA in the future, while recently we see more customers looking at EDA first, since they approach it from an enterprise architecture perspective.

If we take the CEP functionality -- finding patterns over multiple events -- it may not be implemented on EDA at all, since the same technique may also fit a "non-event" environment.
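As a small illustration of that point, the hypothetical sketch below applies the same pattern-matching function once to a simulated live feed and once to stored historical rows; the matching technique itself does not require any event-driven infrastructure. All names here are invented for the example.

```python
# A pattern matcher that only needs an iterable of (source, timestamp) pairs;
# it does not care whether they arrive on an event bus or come from storage.
def pairs_within(records, max_gap=5.0):
    """Yield consecutive records from the same source less than `max_gap` apart."""
    last = {}  # source -> previous timestamp seen for that source
    for source, ts in records:
        prev = last.get(source)
        if prev is not None and ts - prev < max_gap:
            yield (source, prev, ts)
        last[source] = ts

# "Event-driven" style: records pushed one by one as they occur.
live_feed = iter([("sensor-1", 0.0), ("sensor-1", 3.0), ("sensor-2", 4.0)])
print(list(pairs_within(live_feed)))

# "Non-event" style: the very same function over stored, historical rows.
stored_rows = [("sensor-1", 100.0), ("sensor-1", 102.5)]
print(list(pairs_within(stored_rows)))
```
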

More - Later.

Saturday, May 31, 2008

On Maturity



After a week's break -- a lot of work, plus taking care of family health issues -- I am back, catching up on the community blogs. One thing that I saw was a debate on whether CEP is mature enough. I think that both sides have a point, depending on what "maturity" means. I'll make a distinction between the maturity of an application, the maturity of a product, and the maturity of a technical area.


Maturity of an application is reflected in the fact that the application works with a relatively low number of faults and satisfies its functional and non-functional goals. Typically there is some time for stabilizing the application, followed by a period of maturity, and at some point the application becomes obsolete in some respect, which requires initiating the next generation.

Maturity of a software product is reflected in the fact that the product works, is being used in applications, has relatively few bugs, and is trusted enough by customers for them to rely on it.

Maturity of a technical area has to do with the level of understanding of the area by customers, the amount of utilization of the area relative to its potential, clear concepts, and standards support.

In the CEP area we certainly have mature applications, and we also have some maturity in the first-generation products, but we are still somewhat far from maturity of the entire area.