
Wednesday, December 23, 2009

On common misconceptions about event processing - the complexity misconception



David Luckham coined the term "complex event processing", and it has caught on as the marketing term behind many of the vendors that provide event processing platforms (comment: IBM, and recently Progress/Apama, have moved to the term "business event processing"). While this term has succeeded in gaining traction, it is also the source of one of the common misconceptions. Luckham talked about complex events and their processing; some people understand it as the complex processing of events, and some just view it as the intersection between event processing and "complex systems". A complex event is defined as "an abstraction of one or more other events", which also leads to some interpretations about the nature of abstraction, but this interpretation is easier to understand. The misconception arises because it is more intuitive to read "complex event processing" in the second interpretation, as "complex" processing of events, and this brings us to the question -- what is complexity? There can be different dimensions of complexity.

  1. The complexity may stem from the fact that we don't know exactly what we are looking for, and are generally looking for anomalies in the system (e.g., trying to find security violations); thus some AI techniques have to be applied here.
  2. In another case, we know what patterns or aggregations we are using, but they require complex computation in terms of functional capabilities.
  3. In another case, the complexity lies in some non-functional requirement, such as scalability in several directions (scale-up or scale-down), strict real-time performance constraints, a highly distributed system, etc.
  4. Another case of complexity is interoperability: the need to obtain events from many producers and deliver events to many consumers, which requires instrumentation or modification of a lot of legacy systems.
  5. Yet another case of complexity is unreliable event sources, which means handling false positives and false negatives.
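To make the first reading of the term concrete, here is a minimal, hypothetical sketch of a "complex event" in Luckham's sense: an abstraction derived from one or more simpler member events. Note that the processing itself is trivial; only the derived event is "complex". All names and the threshold here are invented for illustration, not taken from any product:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str
    payload: dict

# A complex event is "an abstraction of one or more other events":
# it carries its member events plus derived (aggregated) data.
@dataclass
class ComplexEvent(Event):
    members: list = field(default_factory=list)

def summarize_trades(events, threshold=1000):
    """Derive one abstract 'HighVolume' event from simple 'Trade' events.

    Illustrative only: the event kinds and the threshold are made up.
    """
    trades = [e for e in events if e.kind == "Trade"]
    total = sum(e.payload["qty"] for e in trades)
    if total > threshold:
        return ComplexEvent("HighVolume", {"total_qty": total}, members=trades)
    return None

stream = [Event("Trade", {"qty": 600}), Event("Trade", {"qty": 700})]
derived = summarize_trades(stream)  # one abstract event from two simple ones
```

The derivation logic is a single sum and a comparison, which is the point: nothing about producing a complex event requires complex processing.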

There are probably more complexity cases; the interesting question, however, is whether the main goal of an event processing system is to solve a "complex" problem.

With my scientist hat on, it is definitely more exciting to solve complex problems -- even better, problems of a type never seen before. From a pragmatic point of view, however, event processing applications are measured on their business value, and there may be a lot of business value in applying event processing to systems that have none of these complexity characteristics. From a complexity point of view they can be quite simple; moreover, there may be nothing exciting about the implementation, since it is similar to implementations already done, but on the measure of "business value" it brings a great deal. The value metric is thus orthogonal to any complexity metric, and indeed many of the applications for which event processing technology is very useful are quite simple (according to one analyst report, the "simple" applications are 80-90% of the potential market for event processing technology). While there is certainly a segment of applications for each type of complexity, and more work is required in those directions, the "simple" applications will be the bread and butter.

More misconceptions - later.

Sunday, December 20, 2009

On common misconceptions about event processing - the single application misconception

We start the introduction chapter of the EPIA book by stating: Some people say that event processing is the next big thing; some people say that event processing is old hat and there is nothing really new in it. Both groups may be right to a certain extent. As with any field that is relatively new there is some fog around it: some of the fog stems from misconceptions, some from confusing messages by vendors and analysts, and some arises because of a lack of standards, a lack of agreement on terms, and a lack of understanding about some of the basic issues.


In the book we don't really talk about the misconceptions, but as 2009 draws to a close, I think it is a good topic to which to dedicate some postings.

I'll start with misconception number 1: Event processing is a single-industry (some even say single-application) technology, and event processing software cannot generalize beyond this single industry/application.

The industry is, of course, capital markets, and the application is algorithmic trading.

The diagram below is taken from the ebizQ customer survey (conducted two years ago) on the business problems customers expect to solve with event processing; only 9% indicated algorithmic trading.






This misconception originates from the fact that the capital markets industry has indeed been the early adopter of event processing software and served as a proof of concept for the rest of the industries. There are indeed some vendors that focus mainly on this type of application; however, this does not show the entire picture. From the IBM experience I know of customers in various industries, most of them not in the capital markets area. Looking at the material collected by the EPTS use case work group (the material is on the EPTS members' internal site, available to EPTS members), I find quite a lot of examples of systems working in production or being developed in a variety of domains; here are some samples:
  • Border security radiation detection (Eventzero)
  • Mobile asset geofence (Rulecore)
  • Logistics and scheduling application (Starview)
  • Unauthorized use of heavy machinery (Rulecore)
  • Hospital patient and asset tracking (IBM)
  • Activity monitoring for tax and fraud detection (IBM)
  • Intelligent CRM in banking (TIBCO)
  • EDA and asynchronous BPM in retail (TIBCO)
  • Situation awareness in energy utilities (TIBCO)
  • Situation awareness in airlines (TIBCO)
  • Reduce cost in injection therapy (IBM)
  • Next generation navigation (CITT)
  • Real-time management of hazardous materials (Oracle)
  • Finding anomalies in point of sales in retail stores (CA)
  • Elderly behavior monitoring (U. of Munich)
These are only samples; I am familiar with a variety of examples in various industries: healthcare, utilities, chemical and petroleum, insurance, security, transportation, and others. At the last event processing symposium in Trento, we had a keynote address on event processing in robotics, and there are other areas as well, such as the smart house that monitors energy consumption in the home. While we are now in the first generation, and the utilization of event processing will increase over time, the coverage in terms of industries and applications has already gone far beyond algorithmic trading.

More misconceptions in subsequent postings.

Wednesday, March 11, 2009

On misconceptions and fun in event processing


Today I am on a short (two-day) vacation for the Purim holiday... Purim is a fun holiday in which children wear costumes; here is my daughter Hadas, posing as a hippie (something she knows only from old movies). The fun brings me to a recent blog posting by Hans Gilde, who wrote about two false statements and a fun fact about CEP. So I decided to have my own version, talking about three misconceptions. This is not really new material -- I have written about it before -- but here it is all in one posting. I'll concentrate on three misconceptions: the elephant's view misconception, the complex-about-complex misconception, and the maturity misconception.




I have used this picture before: several blind people touch various parts of an elephant and reach different conclusions about what an elephant is. The first misconception is that there is a single reason, a single industry, a single killer application, and a single view of what event processing is. Some people strongly identify event processing with certain trading applications in capital markets; the financial markets industry has indeed been an early adopter, but other industries (gaming, chemical and petroleum, travel and transportation, retail, healthcare, insurance, and more) are getting there, and we see a wide range of industries and applications, bringing a variety of requirements. I don't foresee any single dominant industry in the future. Some people think that the only ROI for event processing is support for high throughput, but most EP applications today don't really require high throughput; in some cases high-throughput support is indeed the ROI, while in others it is the reduced TCO of using higher-level abstractions or ready-made adapters. Other voices say that the only important requirements are those relevant to certain security applications. Well -- this is true for security consultants who make their living from those applications, but not in general. We should strive for a bird's-eye, holistic view of the entire elephant.




David Luckham coined the term "complex event processing"; he meant the processing of complex events, where a complex event is an abstraction or aggregation of other events that creates a higher-level event. The ambiguity of the phrase, however, led other people to interpret it as "complex processing of events", and somehow it became a name that various people use to denote ANY processing of events. There is even one person who has declared a full-fledged cyber war on anybody who uses the term "complex event processing" outside what he believes to be the original intent of past DARPA projects in the area of security and military operational applications. My view is that complexity has various dimensions: in some cases the complexity is in the event structure; in other cases it stems from the uncertain nature of the application (e.g., in fraud detection, the patterns being looked for are a constantly moving target); another type of complexity is in the patterns themselves that need to be supported; and in some cases the complexity is not in the functionality but in non-functional requirements such as hard real-time constraints on latency, high throughput, or the need for deterministic performance. For example, network and system management applications are intended to find the root cause behind events considered "symptoms". The complexity there is in attributing a collection of symptoms to actual problems, and in the fact that the space of symptoms may be incomplete, so stochastic models are needed; the pattern language needed, however, is rather simple. In regulation enforcement, on the other hand, the patterns may be complex, but they are given, and there is no uncertainty involved; stochastic models would be of little use (they are used in risk assessment, but that is another opera).
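To illustrate how simple the pattern language can be even when the surrounding problem is hard, here is a sketch of a basic "A followed by B within a time window" pattern, the kind of sequence detection a symptom-correlation application might use. The event names, timestamps, and window here are invented for the example; real systems express such patterns declaratively:

```python
def sequence_pattern(events, first, second, window):
    """Detect a simple 'first followed by second within window' pattern.

    events: list of (timestamp, name) tuples, assumed time-ordered.
    Returns the (first_ts, second_ts) pairs that matched.
    Illustrative sketch only.
    """
    matches = []
    pending = []  # timestamps of 'first' events awaiting a 'second'
    for ts, name in events:
        if name == first:
            pending.append(ts)
        elif name == second:
            # Pair this occurrence with every pending 'first' in the window
            matches.extend((t, ts) for t in pending if ts - t <= window)
            pending = []
    return matches

# A 'link_down' symptom followed by 'packet_loss' within 10 time units
# matches; the later, unrelated 'packet_loss' does not.
log = [(0, "link_down"), (3, "packet_loss"), (50, "packet_loss")]
hits = sequence_pattern(log, "link_down", "packet_loss", window=10)  # [(0, 3)]
```

The pattern itself fits in a dozen lines; the hard part in a real symptom-correlation system is elsewhere, in deciding which patterns to look for under uncertainty.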
The third misconception is the maturity one. Some vendors claim that they have mature technology, meaning that their product is stable and is being used by customers without substantial problems. That does not mean that the event processing discipline itself is mature.
In the insect life cycle (see the picture above), the insect moves from the young larva phase to the mature larva phase, but the way to an adult insect is still relatively long. I always compare the state of event processing today to the state of databases in the hippie era -- the late 1960s. Yes, there were databases that worked, but the notions of concurrency control, transactions, distributed databases, and query optimization, to name a few, were not there; even the formal theory of the relational model did not exist yet. You can draw a direct analogy to what is missing in the event processing state of the practice.

As for the fun fact, Hans Gilde talks about constructing the event processing discipline as a cohesive field, as the result of the six EPTS working groups that are currently active. It is indeed fun work, but a huge challenge. It will take time (Rome was not built in a day either...). We need the best and the brightest to help with this, and I hope that knowledgeable people like Hans will also join in meeting the challenge. More - Later.