Wednesday, December 23, 2009

On common misconceptions about event processing - the complexity misconception



David Luckham coined the term "complex event processing", and it has caught on as the marketing term behind many of the vendors that provide event processing platforms (comment: IBM, and recently Progress/Apama, moved to use the term "business event processing"). While the term has succeeded in gaining traction, it is also the source of one of the common misconceptions. Luckham talked about complex events and their processing; some people understand the term as the complex processing of events, and some view it as the intersection between event processing and "complex systems". A complex event is defined as an "abstraction of one or more other events", which leads to some interpretations about the nature of abstractions, but this first interpretation is easier to understand. The misconception arises because it is more intuitive to read "complex event processing" in the second way, as "complex" processing of events, and this brings us to the question: what is complexity? There can be different dimensions of complexity.

  1. The complexity may stem from the fact that we don't know exactly what we are looking for, and are generally looking for anomalies in the system (e.g., trying to find security violations), so some AI techniques have to be applied here.
  2. Another case is that we know what patterns or aggregations we are using, but they require complex computation, in terms of functional capabilities.
  3. Another case is that the complexity lies in some non-functional requirement: scalability in several directions (scale-up or scale-down), strict real-time performance constraints, highly distributed systems, etc.
  4. Another case of complexity is interoperability: the need to obtain events from many producers and deliver events to many consumers, which requires instrumentation or modification of many legacy systems.
  5. Yet another case of complexity may be unreliable event sources, requiring the handling of false positives and false negatives.

There are probably more complexity cases; however, the interesting question is whether the main goal of an event processing system is to solve a "complex" problem.
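To make the first interpretation concrete: a complex event, in the sense of an "abstraction of one or more other events", can be derived even by quite simple processing. The sketch below is illustrative Python; the event types and the fraud-flavored rule are hypothetical examples, not taken from any particular event processing product.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical raw event: a flat record emitted by some producer.
@dataclass
class Event:
    kind: str
    account: str
    amount: float

# A complex event abstracts its member events, per Luckham's definition.
@dataclass
class ComplexEvent:
    kind: str
    members: List[Event] = field(default_factory=list)

def detect_large_withdrawals(events, threshold=3, min_amount=100.0):
    """Derive a complex event when an account produces `threshold` or more
    withdrawal events of at least `min_amount`."""
    by_account = {}
    for e in events:
        if e.kind == "withdrawal" and e.amount >= min_amount:
            by_account.setdefault(e.account, []).append(e)
    return [
        ComplexEvent(kind="suspicious-withdrawals", members=evs)
        for evs in by_account.values()
        if len(evs) >= threshold
    ]
```

Note that the logic here is a plain filter-and-count over a stream; nothing about deriving a complex event is inherently "complex" processing, which is exactly the distinction the two interpretations blur.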

With my scientist hat on, it is definitely more exciting to solve complex problems, and better still, problems of a type never seen before. However, from a pragmatic point of view, event processing applications are measured by their business value, and there may be a lot of business value in applying event processing to systems that have none of these complexity dimensions. From a complexity point of view they can be quite simple; moreover, there may be nothing exciting about the implementation, as it is similar to implementations already done, yet on the measure of "business value" it brings a lot of value. Thus the value metric is orthogonal to any complexity metric, and indeed many of the applications in which event processing technology is very useful are quite simple (according to one analyst report, the "simple" applications are 80-90% of the potential market for event processing technology). While there is certainly a segment of applications for each type of complexity, and more work is required in these directions, the "simple" applications will be the bread and butter.

More misconceptions - later.

Sunday, December 20, 2009

On common misconceptions about event processing - the single application misconception

We start the introduction chapter of the EPIA book by stating: Some people say that event processing is the next big thing; some people say that event processing is old hat and there is nothing really new in it. Both groups may be right to a certain extent. As with any field that is relatively new there is some fog around it: some of the fog stems from misconceptions, some from confusing messages by vendors and analysts, and some arises because of a lack of standards, a lack of agreement on terms, and a lack of understanding about some of the basic issues.


In the book we don't really talk about the misconceptions, but I think it is a good topic, towards the end of 2009, to dedicate some postings to the major misconceptions.

I'll start with misconception number 1: Event processing is a single-industry (some even say single-application) technology, and event processing software cannot generalize beyond this single industry/application.

The industry is, of course, capital markets, and the application is algorithmic trading.

The diagram below is taken from the ebizQ customer survey (conducted two years ago) about the business problems that customers expect to solve with event processing; only 9% of respondents indicated algorithmic trading.






This misconception originates from the fact that the capital markets industry has indeed been the early adopter of event processing software, and has served as a proof of concept for the rest of the industries. There are indeed some vendors that focus mainly on this type of application; however, this does not show the entire picture. From my IBM experience I know of customers in various industries, most of which are not in the capital markets area. Looking at the material collected by the EPTS use case work group (the material is on the EPTS members' internal site, available to EPTS members), I find quite a lot of examples of systems working in production or under development in a variety of domains; here are some samples:
  • Border security radiation detection (Eventzero)
  • Mobile asset geofence (Rulecore)
  • Logistic and scheduling application (Starview)
  • Unauthorized use of heavy machinery (Rulecore)
  • Hospital patient and asset tracking (IBM)
  • Activity monitoring for taxation and fraud detection (IBM)
  • Intelligent CRM in banking (TIBCO)
  • EDA and asynchronous BPM in retail (TIBCO)
  • Situation awareness in energy utilities (TIBCO)
  • Situation awareness in airlines (TIBCO)
  • Reduce cost in injection therapy (IBM)
  • Next generation navigation (CITT)
  • Real-time management of hazardous materials (Oracle)
  • Finding anomalies in point of sales in retail stores (CA)
  • Elderly behavior monitoring (U. of Munich)
These are only samples; I am familiar with a variety of examples in various industries: healthcare, utilities, chemical and petroleum, insurance, security, transportation, and others. At the last event processing symposium in Trento, we had a keynote address on event processing in robotics, and there are other areas as well, such as the smart house that monitors energy consumption in the home. While we are now in the first generation, and the utilization of event processing will increase over time, the coverage in terms of industries and applications has already grown far beyond algorithmic trading.

More misconceptions in subsequent postings.