Tuesday, March 31, 2009

On the next generation of event processing

Several people have asked me to post the presentation I have given recently in several places. The presentation includes some views on where we should go in the next generation of event processing, the challenges in the various areas (the illustration above is a slide classifying the challenge areas), and a survey of some activities we are doing either at the IBM Haifa Research Lab or with graduate students at the Technion. The presentation is now on slideshare.


Hans said...

I notice that the definition of event in the slides is different from the one in your manuscript (the manuscript does not mention atomic). :-0

Opher Etzion said...

Hello Hans. You are right, the slide with the event definition is legacy from an older presentation. The reason I omitted "atomic" from the manuscript is its ambiguity: people tend to interpret "atomic" as a structural statement (but an event can also have a complex/composite structure), whereas my intention for "atomic" was in the sense of transactional atomicity, that is, "happens completely or not at all" (which does not mean that we always know whether the event happened or not).
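To illustrate the distinction, here is a minimal sketch (the class and field names are my own, not taken from the presentation or manuscript): an event can have a composite internal structure while its occurrence remains atomic in the transactional sense.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)  # immutable: the event either happened in full or not at all
class OrderPlaced:
    """Structurally composite event: it nests a list of line items.
    Its occurrence is nevertheless atomic -- there is no such thing
    as a 'partially placed order' event."""
    order_id: str
    items: List[Tuple[str, int]]  # (sku, quantity) pairs: composite structure

# A single atomic occurrence carrying a composite payload
event = OrderPlaced(order_id="o-17", items=[("sku-1", 2), ("sku-2", 1)])
```

The `frozen=True` flag enforces immutability after construction, which matches the "completely or not at all" view: the event is recorded as a whole fact, not mutated piecemeal.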



Hans said...

To me, the most important quality of events (in a model) is that they are atomic (in the transaction sense). This property sets a very fundamental and important guide in modeling. Were I to offer only one piece of advice to a new event modeler, it would be to keep events atomic.

I also agree that it is very important to understand the difference between the event model and the model of the data (class, structure, etc.) that represents the event.

Arturs said...

I really liked that this presentation has a 'Lessons from Relational Databases' slide.
The EP area is different (even though there obviously was a misconception that 'it's the same: just apply queries over streams, with some windowing to define sets'), but in terms of evolution and of getting to the right level, history might eventually repeat itself.