Wednesday, January 30, 2013

Event model - what comes first - the event data model or the event-driven logic model?



As promised, I'll write about some issues related to event modeling, which is my current focus of interest.
The current thinking on event processing, and actually on some related areas (such as BRMS), also reflected in Luckham and Schulte's article on event models, is that the first step is to build the data model, and then model the logic on the basis of the data model.  According to this methodology there is a need to model the data and events being used, sometimes also modeling "business objects" that may be created from the original data or events.  The logic modeling then always relies on the events and data already modeled.  This is a "bottom up" approach.

A major issue with this methodology is that it moves the modeling immediately into the IT domain, since business users relate better to goals, processes, and business logic than to data models.  Keeping the modeling at the business level requires reversing the process: model the business logic first, and leave the data and events as "place holders" to be completed later in the process.

Top-down modeling is also reflected in that article, where the phase of "identify the complex events" precedes the phase of "identify base events".  I think that in the business user's terminology it is better to talk about situations that need to be reacted upon than about "complex events", which is a more technical (and somewhat ambiguous) term.  Of course, modeling is an iterative process, since when getting back to the base events or data, a gap (e.g. an event that is not being observed) might be discovered.  Thus it seems that the methodology should be: start top-down (goal oriented) and tune bottom-up.
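The top-down-then-bottom-up iteration described above can be sketched in a few lines of code. This is a minimal illustration, not anything from the article; the situation name, event names, and source names are all hypothetical. The idea is that a situation is modeled first with placeholder events, concrete event sources are bound to it later, and any placeholder left unbound surfaces as a gap to resolve in the next iteration.

```python
class Situation:
    """A business situation to react upon, modeled before its base events."""

    def __init__(self, name, required_events):
        self.name = name
        # Top-down step: placeholders for events the situation needs,
        # not yet bound to any observed event source.
        self.placeholders = {event: None for event in required_events}

    def bind(self, event_name, source):
        """Bottom-up step: attach a concrete event source to a placeholder."""
        if event_name in self.placeholders:
            self.placeholders[event_name] = source

    def gaps(self):
        """Events the situation needs but nothing observes yet."""
        return [e for e, src in self.placeholders.items() if src is None]


# Top-down: model the situation first (hypothetical example).
churn_risk = Situation("CustomerChurnRisk",
                       ["complaint_filed", "usage_dropped", "contract_expiring"])

# Bottom-up: tune by binding events that existing systems already emit.
churn_risk.bind("complaint_filed", "CRM ticket feed")
churn_risk.bind("contract_expiring", "billing system")

# The remaining gap identifies an event that is not yet observed.
print(churn_risk.gaps())  # → ['usage_dropped']
```

The gap report is what drives the next modeling iteration: either an observer is added for the missing event, or the situation is redefined without it.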

More on event modeling issues -- later. 

Event processing with Storm-Cassandra from Health Market Science

I came across a recent SlideShare presentation entitled "Complex Event Processing W/Cassandra", prepared by Brian O'Neill and Taylor Goetz from Health Market Science.  It describes a project integrating Cassandra and Storm.  The presentation analyzes previous failed attempts using Hadoop and C* with AspectJ.
Then it explains the architecture of the solution using Storm for the event processing part.   
Interesting presentation.  It would have been even more interesting if it said more about the actual application (there is one slide explaining in very general terms what their products are).
It seems that a lot is going on in the open source space.  
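For readers unfamiliar with Storm, the spout/bolt pipeline pattern it is built around can be sketched roughly as follows. This is plain Python, not the actual Storm API (which is Java-based), and the event names and the in-memory dictionary standing in for Cassandra are my own illustrative assumptions, not details from the presentation.

```python
def spout(raw_records):
    """Event source: emits one raw event at a time, like a Storm spout."""
    for rec in raw_records:
        yield rec


def parse_bolt(events):
    """Intermediate bolt: normalizes raw records into (key, value) tuples."""
    for rec in events:
        key, _, value = rec.partition(":")
        yield key.strip(), int(value)


def count_bolt(tuples, store):
    """Terminal bolt: aggregates per key; 'store' stands in for the
    Cassandra write that a real topology would perform."""
    for key, value in tuples:
        store[key] = store.get(key, 0) + value
    return store


# Wire the pipeline together and run it over some hypothetical events.
store = {}
raw = ["rx:1", "claim:2", "rx:3"]
count_bolt(parse_bolt(spout(raw)), store)
print(store)  # → {'rx': 4, 'claim': 2}
```

In Storm proper, each stage runs as a distributed component and the framework handles tuple routing and replay; the sketch only shows the shape of the data flow.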

Monday, January 28, 2013

Event model - the next frontier

I am writing this time from Falls Church, Virginia, where I arrived yesterday for a business trip in the Washington DC area, to be followed by Toronto.  So far I have not been affected by the weather, but we'll see what happens later in the week.

The slide above is an old one which I used seven years ago to show the relationship between events and the rest of the universe.  I was reminded of it when reading a new article by David Luckham and Roy Schulte entitled "Why companies should develop event models?".  In this article they explain the motivation and provide a high-level methodology for building an event model.  Following what I recently wrote about "event oriented thinking", it seems that the major obstacle to adoption of event-driven systems is people's ability to think in an event-driven way rather than the traditional request-driven way.  Being able to create an event model and relate it to enterprise computing modeling is an essential step.  This is the direction I am working on these days (very much in the direction that Luckham and Schulte indicate), and I will surely write more about it in the sequel.