Paul Vincent, who has taken it upon himself to be the record keeper, posted a revised version of his famous map of market players with some additions. Over the last year I have encountered several more players in this area, so if you are a player and want to be on the map, notify Paul and get into the picture.
Cloud Computing Journal published some citations from a podcast in which the analyst Dana Gardner interviews Mahesh Kumar from AccelOps about correlating streaming events with transient data to get real-time analysis in the context of big data and cloud implementations. The applications that AccelOps targets are mostly availability and performance management and security. The idea is not new; there are various approaches built on the realization that data needs to be analyzed in real time, before being written to disk, in order to drive decisions.
This is one of the most famous visual paradoxes: in this picture one can see a young girl or an old woman. Some people see only one of them; with some concentration one sees both. This is a kind of relative view, and in event processing there are some relative terms as well. I have written a while ago about the fact that the terms raw and derived events are relative: a derived event in one sub-system can be a raw event of another sub-system.
There are other cases of relative views (an entity may be both consumer and producer, for instance).
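This relativity can be illustrated with a minimal sketch (all class and event names here are hypothetical, not taken from any specific product): an event that one subsystem derives is received by the next subsystem as one of its raw inputs.

```python
# Sketch: the same event is "derived" in one subsystem and "raw" in the next.
# All names are illustrative only.

class Subsystem:
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, other):
        self.subscribers.append(other)

    def emit(self, event):
        # From this subsystem's view, `event` is a derived event;
        # each subscriber receives it as one of its raw inputs.
        for sub in self.subscribers:
            sub.receive(event)

class Aggregator(Subsystem):
    def receive(self, raw_event):
        # Derive a new event from a raw sensor reading: flag high temperature.
        if raw_event["temp"] > 30:
            self.emit({"type": "high-temp", "source": raw_event["sensor"]})

class Alerter(Subsystem):
    def __init__(self, name):
        super().__init__(name)
        self.alerts = []

    def receive(self, raw_event):
        # To the alerter, the aggregator's derived event is simply raw input.
        self.alerts.append(f"alert from {raw_event['source']}")

sensors = Aggregator("sensors")
alerter = Alerter("alerter")
sensors.subscribe(alerter)

sensors.receive({"sensor": "s1", "temp": 35})  # a raw reading enters the network
print(alerter.alerts)
```

The same dictionary object is a derived event from the aggregator's perspective and a raw event from the alerter's perspective; the label depends entirely on where you stand in the event processing network.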
I was reminded of relativism while reading an article about the keynote talk of Jeff Shulman (Gartner's manager of the application integration and web services analyst team). Shulman talks about cloud, mobile, and CEP as the leading trends for application development and integration.
As a remark on Shulman's keynote, the article brings an interesting response from Chris Dressler, VP of Technology at Cablevision, who notes that CEP can be used to find and correct issues before the end user feels the need to call and complain.
This is an interesting example. From the service provider's point of view (a cable TV company in this case), this is a reactive application: tracking events that have already happened and reacting to them. From the home consumer's point of view it might be proactive, since the consumer may not yet have felt the impact of the problem; from that perspective, it is the elimination of a problem that has not really happened. More on this distinction later.
I am holding in my hand a copy of the brand new book by David Luckham, entitled EVENT PROCESSING FOR BUSINESS. His first book, "The Power of Events", opened the current era of event processing and made David Luckham the prophet and elder statesman of the event processing community. My first meeting with David in early 2004 inspired me to think about the future of this area, and gave me a label and framework for what I was doing at that time. Some of David's ideas, like event processing networks and event patterns, found their way into the area's foundations, though not exactly in the form defined in "The Power of Events".
The new book is aimed at being a non-technical book, intended for people in business and IT departments who want to understand what event processing is and what its uses are. It serves a similar audience to the book by Chandy and Schulte. In comparison, our EPIA book is aimed at a more technical audience that would like to understand the building blocks of constructing event processing applications.
The book starts with chapter 1, which has the ambitious title "event processing and the survival of the modern enterprise"; it explains what event processing is and gives six principles for how it should be used by enterprises. Chapter 2 moves to a history lesson, surveying the ancestors of event processing: simulation, networks, active databases and more, arriving at event-driven architectures.
Chapter 3 surveys the concepts that Luckham used in his first book, with definitions and some modifications to the original concepts. Chapter 4 is back to a history lesson, this time from the point of view of the commercial world. Here Luckham repeats the evolution classification he has talked about in the past: simple event processing, creeping CEP, CEP as recognized technology, and unseen CEP. According to Luckham we have just recently moved to the third phase (CEP became a recognized IT technology). In the fourth and last phase CEP is unseen: it goes behind the scenes since it is ubiquitous and exists everywhere; it also becomes holistic, in fact part of the infrastructure of every system, from household automation to national cyber security. Chapter 5 views the markets, existing and emerging, and talks about industries and applications, with 13 examples (it seems the author is not superstitious!). Chapter 6 explains the notion of event patterns; here it gets more technical, but stays mainly at the example level, neither trying to define a pattern language nor discussing the implementation of patterns in current languages. Chapters 7 and 8 are entitled "making sense of chaos in real time: part 1 and 2", with some examples and methodological insights. Chapter 9, the last chapter, entitled "the future of event processing", talks about the phases of evolution to the next phase, and some futuristic applications like solving gridlock in the metropolis. The EPTS glossary is reproduced as an appendix.
Overall -- a good source of material and insights about event processing, especially for the non-technical reader, and a good summary of the various talks that David has presented over the last decade. A must read for anybody interested in event processing.

I have been sick and stayed home for a few days, which gave me an opportunity to disconnect from professional work and read "Empire" and its sequel "Hidden Empire", both by Orson Scott Card, one of my favorite writers, whose books are always thought provoking. The story revolves around a second USA civil war, between right and left, in which a history professor from Stanford dreams of an American empire following in the steps of the Roman empire. He concludes that, like the Roman republic, American democracy must be eliminated to achieve the empire, so he orchestrates chaos, civil war, and some other actions, and takes over as an agreed-upon president whom both Democrats and Republicans nominate to restore order; later, in the second book, he takes advantage of a crisis to reshape parts of the world and take another step toward the empire vision. There are some questions behind the plot: whether the goal justifies the means, since many of the book's hero characters lose their lives as a result of some of the actions, and whether democracy is indeed a value (and if it is, whether today's political system is really democratic). I think the plot reflects the clash between Plato's outlook, that the individual exists to serve society (Orson Scott Card's president is modeled after Plato's philosopher king, as he writes in his epilogue to the books), and Aristotle's outlook, which puts the individual in the middle and makes society a means to serve the individual.
My own observation is that we are gradually moving from Plato's universe to Aristotle's universe, and that the current young generation puts the individual at the center. Societies of all types (including high-tech companies) will need to adjust to this new world. I have already written about the distinction between these two Greek philosophers, and will revisit this topic again.
Philip Howard, one of the analysts who has followed the event processing area for many years, recently wrote about "CEP and big data", emphasizing the synergy of data mining techniques on big data as a basis for real-time scoring against a predictive model created by data mining; his inspiration for writing this piece was reviewing the Red Lambda product. It is certainly true that creating event processing patterns off-line using mining techniques and then tracking these event patterns on-line using event processing is a valid combination, although the transfer from the data mining part to the event processing part typically requires some more work (in most cases also involving some manual work). In general, getting a model built in one technology to be used by another technology is not smooth and requires more work.
The synergy between big data and event processing has more patterns of use, as big data is in many cases manifested in streaming data that has to be analyzed in real time; Philip mentions InfoSphere Streams, which is the IBM platform for managing high-throughput streaming data. Data mining on transient data as a source for on-line event processing, and real-time processing of high-throughput streaming data, are orthogonal topics that relate to two different dimensions of big data; my posting about the four Vs summarizes those dimensions.
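The offline-mining-to-online-tracking combination can be sketched roughly as follows. This is a toy illustration only: a simple three-sigma threshold stands in for a real mined predictive model, and all data and names are invented.

```python
# Toy sketch of the offline/online combination: a "model" is mined from
# historical data offline, then used to track a pattern on a live stream.
# The statistics here are deliberately simple stand-ins for real data mining.

from statistics import mean, stdev

# --- Offline phase: learn a threshold from historical transaction amounts ---
history = [12.0, 15.5, 14.2, 13.8, 16.1, 15.0, 14.4]
threshold = mean(history) + 3 * stdev(history)  # beyond ~3 sigma counts as anomalous

# --- Online phase: apply the mined pattern to streaming events ---
def detect(stream, threshold):
    """Yield a derived 'anomaly' event for each streaming event over the threshold."""
    for event in stream:
        if event["amount"] > threshold:
            yield {"type": "anomaly", "id": event["id"]}

live_stream = [{"id": 1, "amount": 14.9}, {"id": 2, "amount": 95.0}]
alerts = list(detect(live_stream, threshold))
print(alerts)  # only the out-of-range transaction triggers a derived event
```

Even in this toy form, the seam between the two phases is visible: the mined model has to be translated into something the online engine can evaluate per event, which is exactly where the manual work mentioned above tends to appear.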

The last shipment from Amazon brought me (together with the usual collection of science fiction and fantasy books) the book co-authored by Vivek Ranadive, the founder and CEO of TIBCO. I purchased this book to try to get a glimpse into what's going on at TIBCO, it being one of the major players in the event processing market. However, this is not a book about TIBCO products; it is of a type I would call popular science, making the thesis that the recipe of success for people who have excelled in several areas (the first example is Wayne Gretzky, claimed to be the greatest hockey player of all time) was their ability to predict something that is about to happen before it actually happens and behave accordingly. Part I of the book analyzes several such cases in humans. Part II talks about the use of the same ideas in computerized systems and their utilization in several areas: making better wine, ending traffic jams, and explaining why nothing should ever break. In part III the authors discuss the concept of "the two-second advantage", connecting it to event-driven technologies and claiming that it will make both the world and the human brain better. The book states a vision; I think that two seconds is more a metaphor than a real time interval, because different scenarios need different amounts of lead time. It also does not say much about how to utilize the two-second advantage, since just knowing about things is only part of the picture. The idea somehow reminds me of the movie NEXT, which actually talks about seeing two minutes into the future.
On the whole I found the vision in the book quite consistent with our own vision of a proactive world, a major theme I have been trying to cope with recently, including understanding the cultural change aspects.
In any event -- interesting reading; it is refreshing to see that the CEO of an IT vendor can mentally release himself from daily life to write visionary books. Maybe this is an implementation of Steve Jobs' seventh rule: sell dreams, not products.