Thursday, August 19, 2010

On event processing as a technology and as a business

Philip Howard, an analyst who has covered event processing for several years now, has posted a blog entry entitled "what's happening with event processing", observing that event processing is being integrated with other areas such as BPM, data integration, and more. This is not a new phenomenon; in the EPIA book we note that one of the event processing trends is the move from standalone to integrated and even embedded use, and this trend is evident in the evolution of event processing from a start-up universe to one in which the bigger software vendors are the dominant forces.

However, is event processing as a technology going to disappear? I don't think so. There is common functionality across event processing uses in various industries, applications, and hosting technologies: in all of them there are functions of filtering, event transformation, aggregation, pattern detection, and routing. It is not cost-effective to re-invent the wheel for each individual use (although there are variations). The situation is similar to databases: a database can be used for various purposes and can also be embedded within various other technologies and products (application servers, BPM, system management products, and messaging systems all use databases). While there are variations, it is not the case that each of these areas develops database technology in an ad-hoc fashion. Thus, I see event processing continuing to evolve as a technology, with both research and development activities that build generic event processing tools.

From the business point of view, there will always be some niche for standalone event processing applications, but as Philip writes, most of the market will indeed be in the integrated area. This is already reflected in event processing technology, in the need for standards, interoperability features, and the ability to offer an embeddable collection of building blocks and components. More about this - later.
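To make this common functionality concrete, here is a minimal, product-neutral sketch of such a pipeline: filtering, transformation, aggregation over a window, a simple threshold pattern, and routing of the derived event. All class names, event types, and thresholds are illustrative, not tied to any specific engine.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Predicate;

public class MiniEventPipeline {

    // A toy event: source, type, numeric value, timestamp.
    static final class Event {
        final String source, type;
        final double value;
        final long timestamp;
        Event(String source, String type, double value, long timestamp) {
            this.source = source; this.type = type; this.value = value; this.timestamp = timestamp;
        }
    }

    private final Predicate<Event> filter;                   // filtering: drop irrelevant events
    private final Function<Event, Event> transform;          // transformation: normalize / enrich
    private final Deque<Event> window = new ArrayDeque<>();  // aggregation over a count-based window
    private final int windowSize;
    private final double threshold;                          // "pattern": windowed average above threshold
    private final Consumer<String> route;                    // routing: deliver derived events to a sink

    public MiniEventPipeline(Predicate<Event> filter, Function<Event, Event> transform,
                             int windowSize, double threshold, Consumer<String> route) {
        this.filter = filter; this.transform = transform;
        this.windowSize = windowSize; this.threshold = threshold; this.route = route;
    }

    public void onEvent(Event raw) {
        if (!filter.test(raw)) return;                       // filtering
        Event e = transform.apply(raw);                      // transformation
        window.addLast(e);
        if (window.size() > windowSize) window.removeFirst();
        double avg = window.stream().mapToDouble(ev -> ev.value).average().orElse(0.0); // aggregation
        if (window.size() == windowSize && avg > threshold) {                           // pattern detection
            route.accept("ALERT " + e.source + ": windowed average " + avg);            // routing
            window.clear();
        }
    }

    public static void main(String[] args) {
        MiniEventPipeline p = new MiniEventPipeline(
            ev -> "temperature".equals(ev.type),                                  // keep only temperature readings
            ev -> new Event(ev.source, ev.type, ev.value - 273.15, ev.timestamp), // Kelvin -> Celsius
            3, 80.0,
            alert -> System.out.println(alert));
        long now = System.currentTimeMillis();
        p.onEvent(new Event("machine-7", "temperature", 360.0, now));
        p.onEvent(new Event("machine-7", "pressure", 5.0, now));                 // filtered out
        p.onEvent(new Event("machine-7", "temperature", 365.0, now + 1000));
        p.onEvent(new Event("machine-7", "temperature", 370.0, now + 2000));
    }
}
```

Whether such a pipeline lives in a standalone engine or is embedded inside a BPM suite or an integration product, the building blocks stay the same, which is exactly the argument for a generic technology.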

3 comments:

Rainer v. Ammon said...

CEP has no business value "per se", just like CORBA or J2EE etc.; it is a specification of a middleware platform or the like, and you can develop such a platform in different ways and with different quality - see also the discussions about the different EPL approaches.

As I and some colleagues have been saying for some time (taken from the paper [http://www.citt-online.com/downloads/Criteria%20for%20measuring%20a%20BV%20CEP.doc] and from the EPTS Business Value Working Group website [http://members.ep-ts.com/index.php?title=Business_Value]):

"...

CEP has no Business Value per se; like electricity, we must adopt it for a domain or an application.
- edBPM has also no Business Value
- Internet of Things and Services ditto

But:
- Smart Healthcare has a Business Value (Smart always means "based on edBPM")
- SmartFraudManagement in domains like Banking, Insurance, Retail etc. has a specific Business Value (bring numbers)
- SmartPlant/SmartSupplyChainManagement ditto
- SmartNavigation, SmartTransportation, SmartCity ditto
- SmartEmergencyManagement has a Value, but a Business Value?
- Smart AlgoTrading aka HFT (high-frequency trading) has a Business Value, but a value/sense? (for how long? http://seekingalpha.com/article/151173-hft-the-high-frequency-trading-scam)
- etc.

Problem:
- We need a domain specific metric for measuring the business value
- Can we abstract a general metric or formula for all domains/applications? (based on Greek letters, it would be a nice subject for a PhD thesis...)

..."

We intended to discuss this at DEBS 2010 in Cambridge, but unfortunately we could not make it, for various reasons on the contributors' side: no travel approval, missing visas, no funding because of the crisis, sickness :-(

In my next comment I'll provide a typical statement on the (business) value of CEP from the industry, from the perspective of a potential adopter...

Rainer v. Ammon said...

(cont...)

I just tried to win Bull, as one of the super-computing vendors, as a panelist for our edBPM/U-CEP workshop in Ghent [http://www.citt-online.com/downloads/EDBPM-UCEP-ServiceWave10-proposal.doc] and/or as a contributor to our U-CEP textbook/course, chapter 14 [http://www.citt-online.com/downloads/Book-Ubiquitous%20Complex%20Event%20Processing.pdf]. One of their directors gave me a typical answer (the mail was in German; here is a hopefully good-enough translation from my translation system):

"Hello Rainer,

a lot has indeed been written about CEP all over the world by now, and much of it is spin; unfortunately there is also an unusual amount of charlatanism.

Can you provide a piece of executable software for this, and if so, what principles underpin the practical approach? If there is something, then that would be marketable. But so far nothing really well thought out has turned up.

The purely mathematical-logical or informatics approaches will not lead to usable results in the medium term. At the very least, some non-linear statistical and normalized phase-space approaches for real-valued data streams from physical systems must be added; otherwise none of this will work. So far it is mainly computer scientists and mathematicians working on this stuff, surpassing each other with ever more abstract logical formulations. That is admittedly difficult enough, and even necessary. But the world, at our primary level of observation, is neither a logical nor a purely discrete construction. (Yes, I know that is not right from the quantum-mechanics perspective, but in general I do not count that as part of our "primary level of observation".)

A machine, for example, runs through the phase space of its operating states with quite continuous transitions. The prediction of a complex event would in this case be, e.g., the simple need for maintenance, even for conditions that have never before occurred in this exact combination.

Even this simple example could be solved only by a combination of methods.

And even if you analyze a bank and try to predict when the next disaster is likely to occur, this is not possible with a purely logical-mathematical approach.

You can always tinker together a numerically substantiated procedure that solves the machine-maintenance problem, or answers, for an individual case, the question of exactly when the optimal or right time is to lubricate the switchblade.

That is not what would be marketable. Marketable would be a solution that, after being fed with data, solves the configuration task on its own - that is, adapts itself to the case. _THAT_ is a market, I'm sure. Something I have not yet seen, with the exception of ONE thing that I happened to look at a little more closely, which is however in the "classified" category. I do not know what that thing really is in detail, but I will know soon.

So, do you have something in that direction??
..."

It should not be a problem to build such an _application_ based on CEP; we would "only" have to model the event patterns and which processes should be started when and how, perhaps along the lines of our paper [http://www.citt-online.com/downloads/62750370.pdf], because it is actually a combined challenge in the sense of Event-Driven Business Process Management. Or it could be done with the help of your EPIA book [http://epthinking.blogspot.com/2010/08/book-event-processing-in-action-is-now.html], based on an available edBPM platform like ... we know them, and some of them would already work, without "charlatanism". A rough sketch of what the machine-maintenance case could look like is below. But it would not be a product; it would be a project, in this case specifically for Bull, though more or less easily customizable.
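Just to make the machine-maintenance example from the mail concrete: a rough sketch of how a simple statistical condition over the machine's operating state (a crude stand-in for the non-linear phase-space methods the director asks for) could be combined with an event pattern that starts a maintenance process. All names, dimensions, and thresholds here are invented for illustration; a real edBPM platform would start an actual process instance instead of logging.

```java
import java.util.ArrayList;
import java.util.List;

public class MaintenancePredictor {

    private final double[] mean;     // mean operating state, assumed learned from history
    private final double[] stdDev;   // per-dimension standard deviation
    private final double distanceThreshold;
    private int consecutiveAnomalies = 0;

    public MaintenancePredictor(double[] mean, double[] stdDev, double distanceThreshold) {
        this.mean = mean; this.stdDev = stdDev; this.distanceThreshold = distanceThreshold;
    }

    // Normalized distance of the current state vector from the "normal" operating region.
    private double normalizedDistance(double[] state) {
        double sum = 0.0;
        for (int i = 0; i < state.length; i++) {
            double z = (state[i] - mean[i]) / stdDev[i];
            sum += z * z;
        }
        return Math.sqrt(sum);
    }

    // Event pattern: three consecutive out-of-region states -> derive a maintenance event.
    public void onStateReading(String machineId, double[] state) {
        if (normalizedDistance(state) > distanceThreshold) {
            consecutiveAnomalies++;
        } else {
            consecutiveAnomalies = 0;
        }
        if (consecutiveAnomalies >= 3) {
            startMaintenanceProcess(machineId);   // the edBPM side: kick off a process
            consecutiveAnomalies = 0;
        }
    }

    private void startMaintenanceProcess(String machineId) {
        // Placeholder for starting a process instance on an edBPM platform.
        System.out.println("MaintenanceRequired event for " + machineId + " -> starting maintenance process");
    }

    public static void main(String[] args) {
        // Two-dimensional state (temperature, vibration) with assumed historical statistics.
        MaintenancePredictor predictor = new MaintenancePredictor(
                new double[]{70.0, 0.2}, new double[]{5.0, 0.05}, 3.0);
        List<double[]> readings = new ArrayList<>();
        readings.add(new double[]{71.0, 0.21});   // normal
        readings.add(new double[]{88.0, 0.40});   // anomalous
        readings.add(new double[]{90.0, 0.42});   // anomalous
        readings.add(new double[]{92.0, 0.45});   // anomalous -> process started
        for (double[] r : readings) predictor.onStateReading("machine-7", r);
    }
}
```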

SJ said...

Hi Rainer v. Ammon!

I fully agree with your post. I just recently added some thoughts on this topic to Marco's last blog entry:

http://rulecore.com/CEPblog/?p=554#comments

The missing piece in the puzzle (also around USPs) is surely some kind of metric, in order to be able to present the added value of CEP in a domain.

My personal opinion is that it is possible to solve various problems in the domains CEP is targeting with other technologies as well. Therefore my ad-hoc guess would be a metric set up around development, customization, or deployment speed. This ad-hoc idea is a long shot, especially because this kind of metric would require some kind of empirical analysis (which is not so easy...).

I could also imagine some kind of metric around low-latency decision making.
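For instance, such a latency metric could be as simple as recording the time from event occurrence to decision and reporting percentiles; a minimal sketch (the recorded values are invented, and "nearest-rank" is just one way to compute a percentile):

```java
import java.util.Arrays;

public class DecisionLatencyMetric {
    private final long[] latenciesMicros;

    public DecisionLatencyMetric(long[] latenciesMicros) {
        this.latenciesMicros = latenciesMicros.clone();
        Arrays.sort(this.latenciesMicros);
    }

    // Nearest-rank percentile over the recorded detection latencies.
    public long percentile(double p) {
        int rank = (int) Math.ceil(p / 100.0 * latenciesMicros.length) - 1;
        return latenciesMicros[Math.max(0, rank)];
    }

    public static void main(String[] args) {
        long[] observed = {120, 95, 300, 110, 105, 2500, 130, 98, 115, 102}; // microseconds, illustrative
        DecisionLatencyMetric metric = new DecisionLatencyMetric(observed);
        System.out.println("p50: " + metric.percentile(50) + " us");
        System.out.println("p99: " + metric.percentile(99) + " us");
    }
}
```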

In the past I have usually set up CEP business cases around cost reduction or value generation, based on scenarios like "What would I save if I could detect situations faster?", and put on the cost side the licensing costs and the effort for the implementation/customization phase.
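In numbers, such a business case boils down to very simple arithmetic; a back-of-the-envelope sketch of the kind of calculation described above, where every figure is an invented placeholder rather than data from any real project:

```java
public class CepBusinessCase {
    public static void main(String[] args) {
        double situationsPerYear = 200;          // situations the application would detect
        double savingPerSituation = 1_500.0;     // assumed saving from detecting one situation faster
        double annualLicenseCost = 80_000.0;     // assumed licensing cost per year
        double implementationCost = 120_000.0;   // assumed one-off implementation/customization effort
        int horizonYears = 3;

        double totalValue = situationsPerYear * savingPerSituation * horizonYears;
        double totalCost = annualLicenseCost * horizonYears + implementationCost;
        double netValue = totalValue - totalCost;

        System.out.printf("Value over %d years: %.0f%n", horizonYears, totalValue);
        System.out.printf("Cost over %d years:  %.0f%n", horizonYears, totalCost);
        System.out.printf("Net value:           %.0f (ROI %.0f%%)%n",
                netValue, 100.0 * netValue / totalCost);
    }
}
```

The hard part, of course, is not the arithmetic but getting defensible numbers for the savings per detected situation, which is exactly where the domain-specific metric would be needed.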

However, I don't think it will be possible to extract some kind of generalized metric or formula... but never say never. ;-)