Saturday, October 11, 2008

More on semantics and race conditions

In a previous posting I posed the following scenario:



Given the simple application shown below:




  • There is a single event source (so no clock synchronization issues) which generates events of three types e1, e2, e3.

  • Let's also say that in our story a single event of each type is published (so no synonym issues); the table shows their occurrence time (when they occurred in reality) and detection time (when they were reported to the system) - each of them was reported 1 time unit after its occurrence, so there is no re-ordering problem.

  • Events e1, e2 serve as an input to an EPA of type "pattern detection" which detects a temporal sequence pattern "e1 before e2", and when this is detected, it derives an event e4 - some function of e1 and e2.

  • Events e3 (raw event) and e4 (derived event) serve as input to another EPA of type "pattern detection" which again detects a temporal sequence pattern, "e3 before e4"; if this pattern is detected, it creates an event e5 which triggers some action in the consumer.

I also asked: given the above, will the action triggered by e5 occur? I.e., will the pattern "e3 before e4" be evaluated to true?

I got a few answers, which you can read as comments to the original posting; as promised, I am dedicating this posting to the analysis of this simple case:

The first thing to discuss is the semantics of "temporal sequence". There are two possible types of semantics for temporal sequence, which I call "detection time semantics" and "occurrence time semantics".

  • The detection time semantics is implemented in various languages and means that the temporal order is the order of the time-stamps at which the "event processing platform" detects that the event occurred; if there is a single thread of detection, then the events are totally ordered, otherwise there may be several events with the same "detection timestamp".
  • The occurrence time semantics, also implemented in various languages, means that the temporal order is the order of the time-stamps that are provided as part of the event information and designate when the event happened in reality. There is some complexity in synchronizing time in a multi-producer environment; however, in this example we assume a single producer (I'll write about multi-producer cases in another posting).
  • Note that these two order relations may not be identical.
  • There is also a kind of hybrid solution ("total order semantics") -- the semantics is really "detection time" semantics, but in order to allow events that arrive a bit late to take their proper place, the events are queued in a buffer (and not considered detected) until a time-out lets "out of order" events arrive and the buffer is re-ordered; the events are then sent according to the buffer order.
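The difference between the two order relations can be sketched in a few lines of Python. Note that the timestamps below are illustrative only (they are not the ones in the table of the figure): e2 is reported late on purpose, so that the two orders diverge.

```python
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    occurrence_time: int  # when the event happened in reality
    detection_time: int   # when the platform detected it

# Illustrative timestamps: e2 occurs second but is reported last.
events = [Event("e1", 1, 2), Event("e2", 2, 5), Event("e3", 3, 4)]

by_occurrence = [e.name for e in sorted(events, key=lambda e: e.occurrence_time)]
by_detection = [e.name for e in sorted(events, key=lambda e: e.detection_time)]

print(by_occurrence)  # ['e1', 'e2', 'e3']
print(by_detection)   # ['e1', 'e3', 'e2']
```

The same three events, two different temporal sequences - which is exactly why the choice of semantics matters.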

Getting back to the example - in the small table on the bottom left-hand side of the figure above, there are occurrence and detection times for e1, e2, e3. For e4 there is only a detection time - e4 differs from {e1, e2, e3} in that it is a derived event and not a raw event like the other three. The question is: "what is the occurrence time of a derived event?" There is no single clear answer; there are several possible ones:

  • In the derived event case, occurrence time = detection time, since this event is not a real event but a virtual one; thus its source is the EPA that creates it, and it occurred when created. In our case this means occurrence-time (e4) = 4.
  • Its occurrence time is the occurrence time of the last event that completed the pattern - since the events participating in the creation of e4 are {e1, e2} and e2 was the last to complete the pattern, occurrence-time (e4) = occurrence-time (e2) = 2.
  • Interval semantics: the event e4 occurs in the interval in which all the participants occur, which in this case means occurrence-time (e4) = [1, 2].
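The three interpretations can be sketched as a small Python function (the policy names are mine, not standard terminology), using the values from the example: e4 is derived from {e1, e2} with occurrence times 1 and 2, and is detected at time 4.

```python
def derived_occurrence_time(participants, detection_time, policy):
    """Occurrence time of a derived event under three possible policies.

    participants: list of (name, occurrence_time) pairs of the events
    that matched the pattern; detection_time: when the event was derived.
    """
    if policy == "detection":            # occurrence time = creation time
        return detection_time
    if policy == "last-participant":     # time of the event that completed the pattern
        return max(t for _, t in participants)
    if policy == "interval":             # interval spanning all participants
        times = [t for _, t in participants]
        return (min(times), max(times))
    raise ValueError(f"unknown policy: {policy}")

participants = [("e1", 1), ("e2", 2)]  # the events that created e4
print(derived_occurrence_time(participants, 4, "detection"))         # 4
print(derived_occurrence_time(participants, 4, "last-participant"))  # 2
print(derived_occurrence_time(participants, 4, "interval"))          # (1, 2)
```

The three printed values correspond to the three bullets above: 4, 2, and the interval [1, 2].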

The phenomenon of multiple semantic interpretations applies to various other semantic decisions in event processing languages. The preferred solution is to provide the user with semantic "fine tuning" policies, under which the user can choose the desired semantics, instead of hard-coding a certain semantics (using the most common one as a default). This is one of the benefits of using COTS for event processing, since it is quite difficult to think about such issues when developing EP manually using a conventional language.

The semantics of the second "temporal sequence" (e3, e4) is thus:

  • According to "detection time" semantics -- both have a detection-time of 4, so the sequence condition is not satisfied. However, if we impose total order by a single thread, this may create race conditions between the two events. In this case it is recommended to use a consistent priority policy - either breadth first (the raw event always comes first) or depth first (the derived event always comes first) - to ensure a deterministic result.
  • According to the "occurrence time" semantics -- it depends on the policy chosen, but according to all interpretations e4 occurs before e3 - thus the temporal sequence is not satisfied.

Bottom line: the temporal sequence (e3, e4) is satisfied only if:

  • The temporal semantics is detection time
  • It is implemented by total order
  • The total order policy is "breadth first" - namely priority for the raw events.

In all other cases the temporal sequence will not be satisfied and the corresponding action will not execute.
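This bottom line can be encoded as a small decision function - a sketch that simply restates the analysis above for this specific example, not a general-purpose sequence evaluator:

```python
def e3_before_e4_satisfied(semantics, total_order=False, priority=None):
    """Does the pattern "e3 before e4" match, per the analysis above?"""
    if semantics == "detection":
        # Both events carry detection time 4; only a total order with
        # raw-event ("breadth-first") priority places e3 first.
        return total_order and priority == "breadth-first"
    # Under the occurrence-time interpretations considered here,
    # e4 occurs before e3, so the sequence never matches.
    return False

assert e3_before_e4_satisfied("detection", total_order=True, priority="breadth-first")
assert not e3_before_e4_satisfied("detection", total_order=True, priority="depth-first")
assert not e3_before_e4_satisfied("occurrence")
```

Only one combination of policies fires the e5 action - a good illustration of why "fine tuning" policies need to be explicit rather than implicit in the engine.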

Wednesday, October 8, 2008

On HITC and some small stuff


About a month ago I wrote about the Arab Israeli High Tech Center, whose first goal will be to prepare Arab Israeli engineers and mathematicians to work in the very demanding Israeli high-tech industry. Earlier this week there was a very impressive kick-off ceremony for the center (whose name somehow migrated from AITC to HITC). In the two pictures above, the top one is a group photo of the management and industry advisory committee of the center, and in the bottom one you can see me sitting in the crowd with my eldest daughter, Anat, who decided to come and watch. This project is highly supported by the Israeli high-tech industry: some companies - like HP, Oracle, BMC, EDS and Matrix (an Israeli services company) - were represented by their country general managers, and other companies - IBM, Motorola, Intel, Microsoft, Checkpoint and more - were represented by a senior person. Several hundred people were present, and it was quite impressive. Studies will start in February 2009, and there is still a lot of work to be done to make it happen, but people came with a feeling of history in the making.


And back to event processing -- in the next posting I'll talk about the semantic question I posted last week; meanwhile, just some short comments:

  • Mark Tsimlezon from Coral8 tries to define what a "CEP engine" is, stating that there is some confusion in the market about this. I almost agree with what he has written and wondered if I should react, since my reaction might further confuse people... So I'll just remark that the term "platform" is starting to become very popular, but with somewhat different meanings. I'll write more about platforms in the future.
  • Marc Adler is blogging about MSFT Oslo and his CEP application - without going into further detail now, I believe that the direction of having the ability to interface in the user's domain terminology and way of thinking, and then map it automatically to an execution language (directly or through an intermediate representation), is a correct idea, somewhat beyond the state-of-the-art today; there will probably be several ways to do it, but it is a good topic to work on.
  • Marco from RuleCore is blogging about the pain of their SaaS model and mentions some obstacles; this is a good topic for further discussion, and it was also presented in the EPTS meeting by Bob Marcus. Clearly, there are some applications that can be served by this model and some (e.g. distributed applications) that cannot. I will discuss this issue at length in one of the coming postings.

More topics - Later

Saturday, October 4, 2008

On Event Processing Network and Transaction Processing






It is a holiday period, a time in which we have four holidays during three weeks, and it is quite a lazy time here, with many people taking vacations (like the second part of December and beginning of January in countries with a Christian majority). Due to the holidays and some other events I'll see my office in the coming week only on Tuesday, but I am working a bit from home now...


In an IBM internal Email exchange this week, a person who does not really understand event processing saw an illustration of an EPN (Event Processing Network) and wondered: this seems like regular transaction processing - what is the difference?

Indeed - from a bird's-eye view everything looks like a directed graph, such as the one shown at the top of this page; both transactional flows and EPNs, as well as many other things, are expressed using directed graphs. However, there is a major difference in the semantics of the graph.

In order to refer to a concrete example, let's take an EPN example taken from an application of remote patient monitoring.






The semantics of an EPN is that a node in the graph creates events, and these events are then consumed by other nodes in the graph. For example, the "enrich" node takes a blood pressure reading and enriches it with an indication of whether the patient is diabetic, thus creating a derived event; this derived event is consumed by the node that looks for a pattern in order to alert the physician. Without going into the application's details too much, we can also state that unlike a control flow, the pattern detection node does not start its execution when all of its predecessors have finished: since the pattern may look at multiple blood pressure measurements of the same patient, it may exist for a longer period relative to the enrich node, which processes measurements anytime there is a blood pressure reading of a patient. So the graph does not show the control flow; moreover, these two nodes don't know each other and communicate through a router (channel) node. There are thus some differences between an event processing network and a transactional flow:

  1. The EPN graph does not represent control flow, but event flow.
  2. In a control flow graph, the relationship between predecessor and successor nodes is typically "finish to start" (either "meets" or "after" in Allen's operators, which I'll discuss in a separate posting), meaning that the predecessor node must terminate in order for the successor node to start; in an EPN, this may not be the case.
  3. An EPN is not necessarily atomic (one node in the EPN may fail while others continue) - no "atomic commitment protocol" (e.g. 2PC) is applied.
  4. It also may not be isolated -- a node can emit events while continuing to work; even if it fails later, its emitted events may still be valid, if atomicity is not required.
  5. An EPN can be restricted to behave in a transactional way - this is an interesting observation, as transaction support violates the decoupling principle; however, there are cases in which it is required (again, this deserves some more discussion). More - Later.
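The decoupled, flow-through character of an EPN can be mimicked with a publish/subscribe channel. This is a minimal sketch only, not the actual patient-monitoring application: the channel class, the diabetic registry, and the alerting threshold are all hypothetical. The point is that the enrich and detect nodes never reference each other - they only know topics on the channel.

```python
from collections import defaultdict

class Channel:
    """A toy router node: nodes publish to and subscribe on topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)
    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)
    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

channel = Channel()
diabetic_registry = {"alice": True, "bob": False}  # hypothetical reference data
alerts = []

def enrich(reading):
    # Enrich the raw reading with the diabetic indication -> derived event.
    enriched = dict(reading, diabetic=diabetic_registry.get(reading["patient"], False))
    channel.publish("enriched", enriched)

def detect(reading):
    # Hypothetical pattern: high reading for a diabetic patient.
    if reading["diabetic"] and reading["systolic"] > 140:
        alerts.append(reading["patient"])

channel.subscribe("raw", enrich)
channel.subscribe("enriched", detect)
channel.publish("raw", {"patient": "alice", "systolic": 150})
channel.publish("raw", {"patient": "bob", "systolic": 150})
print(alerts)  # ['alice']
```

Note that nothing here is atomic or isolated: each event flows through independently, and a failure in one handler would not roll back events already emitted.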

Friday, October 3, 2008

On the Genesis and Exodus in Event Processing


One of the greatest scientists I have had the honor to meet in person (at a conference in France, 1991) is Lotfi Zadeh, the inventor of "fuzzy sets", which is one of the major ways to formalize inexact thinking. When I was an undergraduate student, there was an urban legend that Zadeh came up with the fuzzy notion when his wife went out of town and left him a cooking recipe; trying to formalize the recipe, he came up with the notion of fuzziness. Later in life I met another great scientist and wonderful person, the late Manfred Kochen, who told me over a lunch in Ann Arbor that he had been a graduate student together with Zadeh at Columbia University; so I told him the urban legend and asked him if it was true. He was quite amused to hear it, and said that the problem that actually started the thinking about fuzzy theory was formalizing the process of parking a car between two parked cars - assuming Manfred Kochen told me the truth, this was the genesis of fuzzy logic. It was interesting to observe that Tim Bass, in a couple of his latest Blog postings, has returned to the genesis of "complex event processing", citing topics that emerge from the papers that David Luckham's group at Stanford published in the late 1990s - the list contained:
  • Network Level Monitoring and Management;
  • Cyber Security: Network Intrusion Detection;
  • Enterprise Monitoring and Management;
  • Modelling and Simulation of Collaborative Business Processes;
  • Business Policy Monitoring;
  • Analysis and Debugging of Distributed Systems.

These applications are all still very much alive and kicking in the event processing space.

It is interesting to note that the genesis of data stream management, in one of the earliest papers of the "stream" project, was - surprise, surprise -- "network traffic management". It should also be noted that David Luckham and Jennifer Widom reside in the same building.

Since the area of event processing has many ancestors, it has some more genesis stories. For example, the term "active database" was first coined by Morgenstern in his VLDB paper from 1983, and Morgenstern's genesis was consistency and integrity applications; we still see compliance and governance (our current names) as major applications. Other ancestors are in the area of system management, whose genesis was the "root cause analysis" application - i.e. diagnosing problems out of symptoms. We in the AMiT project in IBM Haifa Research Lab also started by looking at system management applications, and at what is now called "business services management" - impact analysis of events in the IT on business processes. I think that at least some of the pub/sub companies started with distribution of new versions of software to subscribers, and of course some of the current event processing vendors started with applications like algorithmic trading in capital markets.

If we have used the biblical term genesis, we may also remember that the successor of "genesis" is "exodus" - in our terms, moving on and not staying just where we happened to start. While some of the software industry is based on niche players, where the niche may be quite big (one of the biggest IT companies in Israel has concentrated for many years mostly on the area of Telco billing, probably big enough to enable niche companies of several thousand employees), for more basic software like event processing tools there is a big benefit in the ability to generalize beyond the genesis. Indeed, we now see some vendors going after other markets that may seem beyond their "comfort zone" and needing to make some adjustments (this phenomenon may be one of the drivers for standardization in this area, but I'll discuss this issue another time). Thus, we are watching a growing list of applications and business problems that event processing can be part of solving, both in the infrastructure area (which should grow to internet-scale infrastructure) and in the enterprise application area. To conclude this posting by citing another great speaker, Professor Stu Madnick from MIT, whom I remember giving an amusing talk about theoretical computer science, saying something like: a bunch of people went into a closed room taking with them some problems from the outside world, and since then they are still in the same closed room, still working on the same problems, and sometimes inventing new problems. Well - we shall still solve the original problems, but also look around to find new ones; we are just in the early days of the event processing area, and probably have not discovered much of its power to impact the business world. More - Later.

Monday, September 29, 2008

On Semantics and Race Conditions - introduction





In this Blog posting I'll touch upon an issue that requires some attention to the exact semantics.

I'll introduce the topic today -- wait a few days to see if there are comments - and then post the analysis of this case.


Given the simple application shown below:



Let me explain this simple example. Since I would like to concentrate on a single issue, I'll simplify everything else to eliminate any noise.


  • There is a single event source (so no clock synchronization issues) which generates events of three types e1, e2, e3.

  • Let's also say that in our story a single event of each type is published (so no synonym issues); the table shows their occurrence time (when they occurred in reality) and detection time (when they were reported to the system) - each of them was reported 1 time unit after its occurrence, so there is no re-ordering problem.

  • Events e1, e2 serve as an input to an EPA of type "pattern detection" which detects a temporal sequence pattern "e1 before e2", and when this is detected, it derives an event e4 - some function of e1 and e2.

  • Events e3 (raw event) and e4 (derived event) serve as input to another EPA of type "pattern detection" which again detects a temporal sequence pattern, "e3 before e4"; if this pattern is detected, it creates an event e5 which triggers some action in the consumer.

The question is -- given the above, will the action triggered by e5 occur? I.e., will the pattern "e3 before e4" be evaluated to true?


Before getting to the analysis -- I wonder what the results would be in current EP solutions:



  1. The action will always be triggered.

  2. The action will never be triggered.

  3. The behavior is non-deterministic (sometimes yes and sometimes no)

  4. Any other possibility (specify).

Please send your answer as a comment to this post; I'll publish an analysis of this interesting case next week.


Happy New Year.


Sunday, September 28, 2008

On the scope of event processing as a discipline again


Back home... short work week due to the Jewish New Year holiday (tomorrow is the holiday eve).
One of the topics that was not discussed in the EPTS meeting is "what is CEP?", and indeed EPTS looks at "Event Processing" as a discipline, where "Complex Event Processing" - no matter how it is defined - is a subset of a larger whole. One of the discussion points is to define the scope of the "event processing" discipline (some people prefer to call it "event-based systems", but we are talking about the same thing); I have already written in this Blog about event processing as a discipline before, talking about some interesting subsets.

As one interesting source, let's look at the scope of DEBS 2009:

Event-based systems are rapidly gaining importance in many application domains ranging from real time monitoring systems in production, logistics and networking to complex event processing in finance and security. The event based paradigm has gathered momentum as witnessed by current efforts in areas including publish/subscribe systems, event-driven architectures, complex event processing, business process management and modelling, Grid computing, Web services notifications, information dissemination, event stream processing, and message-oriented middleware. The various communities dealing with event based systems have made progress in different aspects of the problem. The DEBS conference attempts to bring together researchers and practitioners active in the various sub communities to share their views and reach a common understanding.
The scope of the conference covers all topics relevant to event-based computing ranging from those discussed in related disciplines (e.g., coordination, software engineering, peer-to-peer systems, Grid computing, and streaming databases), over domain-specific topics of event-based computing (e.g., workflow management systems, mobile computing, pervasive and ubiquitous computing, sensor networks, user interfaces, component integration, Web services, and embedded systems), to enterprise related topics (e.g., complex event detection, enterprise application integration, real time enterprises, and Web services notifications).
While this is not a definition that I have phrased, it shows that the discipline is diverse and has touch points with some other disciplines (software engineering, databases, sensor networks, embedded systems, etc.). It is also interesting to note that the applications presented in the EPTS use cases group were also diverse: we have seen applications from finance and defense (not surprising), but also from media and entertainment, chemical and petroleum, Telco, and emergency management.
The event processing discipline crosses several aspects - modeling, architectures, languages, engineering aspects, performance and optimization, user interfaces, intelligent components, and domain-specific additions - again, all of these in the context of creating specific platforms and tools for building event processing applications. In the next few postings I'll return to some micro-level issues I have faced in the last few weeks.
Happy New Year.

Tuesday, September 23, 2008

event processing meets artificial intelligence




Bedford, MA, USA.




At the EPTS symposium last week, Alan Lundberg from TIBCO, who moderated the "business panel", made an analogy to AI, especially to "expert systems", saying that there was hype in the beginning and people believed it would solve many of the world's problems, but in reality it did not recover from sliding down the hype cycle. This triggered the (somewhat surprising to some) response of Brenda Michelson that actually EP is under-hyped, and its place in the hype cycle is much lower in the climbing phase than the Gartner analysts draw it; this is the diagram that Brenda presented, with "event processing" in orange, way below SOA (in blue), BPM (in red), and Web 2.0 (in green).






Anyway - this is not the topic of today's Blog, so going back to the AI issue. The term AI is interesting in the sense that it has spawned several disciplines (e.g. robotics, image processing, information retrieval, data mining and more) which are based on AI principles, but when they mature they stop being AI and become disciplines of their own. This is the same phenomenon we have with philosophy - the mother of all arts and sciences - many disciplines have emerged from philosophy, but when they depart, they are not considered philosophy anymore. Event processing, as a young discipline, is a descendant of multiple disciplines, as stated in the past; AI is certainly one of them.




What are the current topics in which AI touches event processing?




1. Modeling: the basic terms "situation" and "context" have been taken from AI (situation calculus); conceptual modeling is important for the design of EP applications, and AI techniques can help here.



2. Discovery: prediction of events and mining of patterns - these are all derivatives of machine learning in AI.




3. Reasoning: defining precise semantics of both event processing languages and execution models. As is evident from the recent discussions in the community, this is becoming an important topic - again, precise reasoning about both the regular case of event processing and the extended case of handling uncertain events.


As my colleague Guy Sharon described in the research session of the EPTS meeting, we in IBM Haifa Research Lab (together with some colleagues in IBM Watson Research Center) are engaged in the "Intelligent Event Processing" project, which currently concentrates on the discovery aspects; however, the idea is to extend the activity, probably through collaborative work with academia. As part of this collaboration we are organizing the "Intelligent Event Processing" workshop, which will take place as one of the AAAI spring symposium series at Stanford University, March 2009. The idea is to have the EP community meet the AI community and create partnerships to deal with these issues... so - target this conference for paper submission and/or attendance. More - later.