In the Dagstuhl seminar on event processing that ended 10 days ago we launched the "event processing grand challenge" activity. You may be familiar with the DARPA Grand Challenge of driving driverless cars through the desert; you can see the two winning cars in the pictures above. We are not DARPA and have no money to distribute, but we would like to take the opposite direction: first define the grand challenge, and then convince funding authorities that they want to support it.
Why do we need this grand challenge? The research community incubated the "event processing" area as we know it today --- some research projects in the 1990s, such as David Luckham's Rapide at Stanford, Mani Chandy's Infospheres at Caltech, John Bates' Apama at Cambridge, and our own Amit project at the IBM Haifa Research Lab, followed by the various stream processing projects, like Jennifer Widom's STREAM project at Stanford and later the Aurora project, to name a few (there are many more, of course).
The state of the practice is now in the hands of the software vendors, and event processing is becoming part of the mainstream of enterprise computing. However, software vendors by nature advance technology incrementally. On the other hand, there is a strong feeling that "event processing" has barely scratched the surface of its potential to impact society; this goes beyond current applications, and even beyond enterprise computing as we know it.
It is now the role of the research community to jump-start and incubate the step-function advance needed to achieve this kind of impact. We would like to issue a call to the event processing research community to focus on such a grand challenge, and, as an incentive (and enabler), to get funding agencies worldwide to adopt it.
Substantial brain power was invested at Dagstuhl and in the follow-up work; some people outside the community, working on socio-technical systems and bioinformatics, have also been involved.
As a metaphor, we can view many types of IT systems, social systems, and biological systems as a "live ecology", similar to a single organism with many brains, eyes, ears, hands, and feet.
Event processing serves as the nervous system, and events are what flows between the different players. This subsumes the "Internet of Things" vision, in which many sensors are connected, along with robots of various kinds acting as actuators, serving as hands. Such an infrastructure will enable changing life as we know it.
We are now working on various scenarios that such an infrastructure will enable; you can hear some thoughts about them in my talk next week at the OMG Event Processing Community of Practice, which is holding its first event processing symposium. Stay tuned for much more on this topic.
5 comments:
There are two areas in EP that I see as needing further development or consolidation. The first is predictive event processing: a combination of stream-based processing and data mining. You would need to set up some form of decision/weight tree that influences the rules being defined. Think along the lines of an expert system combined with an event processing / stream system.
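A minimal sketch of what I mean, with invented event types and weights (the "decision/weight tree" here is reduced to a flat weighted score feeding a rule):

```python
# Sketch of predictive event processing: a sliding window over an event
# stream feeds a weighted score (a stand-in for a learned decision/weight
# tree), and the score decides whether a predictive rule fires.
# Event types, weights, and the threshold are hypothetical.
from collections import deque

WINDOW_SIZE = 10
ALERT_THRESHOLD = 0.3

# Weights a data-mining step might have learned offline (invented here).
WEIGHTS = {"error": 0.5, "retry": 0.2, "timeout": 0.3}

window = deque(maxlen=WINDOW_SIZE)

def on_event(event_type: str) -> None:
    """Consume one event; fire a predictive alert when the weighted
    evidence in the window crosses the threshold."""
    window.append(event_type)
    score = sum(WEIGHTS.get(e, 0.0) for e in window) / WINDOW_SIZE
    if score >= ALERT_THRESHOLD:
        print(f"predicted failure (score={score:.2f}) -- act before it happens")

for e in ["ok", "retry", "error", "timeout", "error", "error",
          "retry", "error", "timeout", "error"]:
    on_event(e)
```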
The second area is a more global way of defining an EDA. An event-driven architecture takes into account all levels of events, from infrastructure event streams to business event processing and business process events (one being static rules and the other linear, time-based events). As you can see, EDA is more than one component and needs to be thought of as an end-to-end process.
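A rough sketch of that end-to-end view, with purely illustrative level names and handlers:

```python
# Sketch of routing events by architectural level through one entry point,
# so the whole flow can be reasoned about end to end. The level names and
# handlers are illustrative, not a standard taxonomy.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Event:
    level: str          # "infrastructure" | "business" | "process"
    payload: dict = field(default_factory=dict)

handlers: Dict[str, Callable[[Event], None]] = {
    "infrastructure": lambda e: print("stream-level handling:", e.payload),
    "business":       lambda e: print("business pattern matching:", e.payload),
    "process":        lambda e: print("process/time-based rule:", e.payload),
}

def dispatch(event: Event) -> None:
    handlers[event.level](event)

dispatch(Event("infrastructure", {"cpu": 0.97}))
dispatch(Event("business", {"order": 42, "status": "late"}))
```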
Additionally, there is one final area (so three instead of two) that definitely needs some work: how do we model the whole process? RUP doesn't really handle the whole event process, and neither does UML, BPEL, or other modeling methods.
Would love to hear thoughts on these.
Chris
What Chris is saying is actually one of the challenges of Ubiquitous Complex Event Processing (U-CEP, http://complexevents.com/2010/05/16/ubiquitous-complex-event-processing-u-cep/).
We must enhance UML with a new modelling notation; activity or state diagrams will not be appropriate for modelling the dynamics in fields such as epigenetics, cell biology, global emergency management, brain research, or the complex societal dynamics of billions of agents, as sketched in the European FET flagship proposals like S-Gaia, Integral Biomathics, etc. (http://cordis.europa.eu/fp7/ict/fet-proactive/flagship-ws-june10_en.html#agenda). Of course, BPMN and BPEL do not have the right features for such aims either. These would be nice subjects for some PhD theses, including defining the appropriate EPL... Because CEP is always deterministic (we react to known, defined event patterns), we must combine CEP with non-deterministic methods from AI, such as neural networks, Bayesian belief networks, and diverse combinations thereof, as is already usual in fields like fraud management and counter-terrorism (e.g., a typical university seminar: https://pi1.informatik.uni-mannheim.de/filepool/teaching/fraudseminar-2010/seminar-ss10.pdf).
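A minimal sketch of such a deterministic/probabilistic combination, with invented probabilities (a full Bayesian belief network would of course be much richer):

```python
# Sketch of combining a deterministic CEP rule with a Bayesian step:
# a fixed pattern flags candidate fraud, and Bayes' rule converts the
# match into a belief that fraud actually occurred. All numbers and the
# pattern itself are invented for illustration.
P_FRAUD = 0.01                  # prior: fraction of transactions that are fraud
P_MATCH_GIVEN_FRAUD = 0.9       # pattern fires on 90% of fraud cases
P_MATCH_GIVEN_LEGIT = 0.05      # pattern also fires on 5% of legitimate ones

def pattern_matches(txn: dict) -> bool:
    # Deterministic CEP rule: large amount shortly after a password change.
    return txn["amount"] > 1000 and txn["seconds_since_pw_change"] < 300

def fraud_posterior(matched: bool) -> float:
    """P(fraud | pattern outcome), by Bayes' rule."""
    if not matched:
        return 0.0  # simplification: only score matched transactions
    evidence = (P_MATCH_GIVEN_FRAUD * P_FRAUD
                + P_MATCH_GIVEN_LEGIT * (1 - P_FRAUD))
    return P_MATCH_GIVEN_FRAUD * P_FRAUD / evidence

txn = {"amount": 2500, "seconds_since_pw_change": 120}
print(f"P(fraud | match) = {fraud_posterior(pattern_matches(txn)):.2f}")
```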
Rainer,
This feeds into what I personally believe is the next big thing in EP, which, as I said earlier, is predictive event processing.
I see expert systems and the like feeding event streams into the PEP, where patterns will be looked for. Another good example is failure detection in hardware systems. But again, this becomes a difficult task when you are unable to describe or model the process from end to end.
There are kludges in BPEL that can be used to model the process, but they really don't capture what needs to be done or the events.
Another challenge I see coming over the horizon is the sheer volume of events that need to be processed. Thinking about this, I believe some form of federation would need to be set up: any real-time event processing would be done immediately, while a longer-running event process (which should feed into the PEP systems) would be done in a more batch-oriented manner.
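A rough sketch of that federated split, with illustrative names and an arbitrary batch size:

```python
# Sketch of a two-tier federation: a cheap real-time tier reacts to each
# event immediately, while raw events are also buffered for a slower,
# batch-oriented tier that mines longer-running patterns. Event names and
# the length-100 batch trigger are illustrative.
from collections import Counter
from typing import List

batch_buffer: List[str] = []

def realtime_tier(event: str) -> None:
    """Per-event check; must keep up with the full event volume."""
    if event == "critical":
        print("immediate action on critical event")

def batch_tier(events: List[str]) -> Counter:
    """Slower pass over accumulated events; stand-in for real pattern mining."""
    return Counter(events)

def ingest(event: str) -> None:
    realtime_tier(event)
    batch_buffer.append(event)
    if len(batch_buffer) >= 100:          # periodic batch run
        print("batch patterns:", batch_tier(batch_buffer).most_common(3))
        batch_buffer.clear()

for i in range(250):
    ingest("critical" if i % 50 == 0 else "routine")
```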
The long-term PEP system looks at the patterns in the data, not the raw data. (Being autistic, this is how my brain processes things, so it makes sense to me, but I'm not sure about others.)
Chris
Hello Chris and Rainer.
There are various areas that need further work; some of them relate indeed to the use of AI/statistical reasoning, such as pattern discovery and the discovery of event causalities for proactive reasoning. There is already some work in the area of inexact/uncertain event processing, e.g., the PhD thesis done by Segev Wasserkrug at the Technion, which we intend to continue working on at IBM Research. Modeling is another area that needs work (and standards). In my talk next week at the OMG virtual conference I will talk about some of these areas; as Rainer notes, they are all required to realize the grand challenge we are trying to push.
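As a minimal sketch of the uncertain-event idea (illustrative only, not the actual approach of that thesis), each event can carry an occurrence probability from which a pattern's probability is derived:

```python
# Sketch of uncertain event processing: each reported event carries a
# probability that it really occurred, and the probability that a pattern
# (all member events) occurred is derived from those, here under a naive
# independence assumption. Numbers and event names are illustrative.
from dataclasses import dataclass
from math import prod

@dataclass
class UncertainEvent:
    name: str
    probability: float   # belief that the event actually happened

def pattern_probability(events: list[UncertainEvent]) -> float:
    """P(all member events occurred), assuming independence."""
    return prod(e.probability for e in events)

reading = [UncertainEvent("smoke_detected", 0.8),
           UncertainEvent("temperature_spike", 0.9)]
p = pattern_probability(reading)
if p > 0.5:
    print(f"raise 'fire' event with certainty {p:.2f}")
```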
cheers,
Opher
CEP as offered today misses some core concepts. Complex events are not standalone data patterns in some environment. They consist of trigger, signature, relevance, context and meaning.
Therefore, the current ideas about handling events in process management have nothing to do with 'complex events.' They execute simple if/then trigger rules and interrupt an ongoing process. No more.
Complex events can only be sensibly handled by pattern-matching machine learning that identifies all five components and modifies existing processes to deal with such events.
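A minimal sketch of these five components as a data structure, with illustrative field choices (not a definition from any product or standard):

```python
# Sketch of the five-component view of a complex event named above
# (trigger, signature, relevance, context, meaning), plus a check that
# goes beyond a bare if/then trigger. All values are invented.
from dataclasses import dataclass

@dataclass
class ComplexEvent:
    trigger: str      # what set the detection off
    signature: str    # the data pattern that was matched
    relevance: float  # how much this matters to the observer (0..1)
    context: dict     # where/when/for-whom the pattern holds
    meaning: str      # the interpretation: what should be understood

def worth_acting_on(ce: ComplexEvent) -> bool:
    # A simple trigger fires on the pattern alone; a complex event is only
    # actionable when relevance and context support the interpretation.
    return ce.relevance > 0.5 and ce.context.get("in_scope", False)

ce = ComplexEvent(trigger="threshold crossed",
                  signature="3 failed logins within 60s",
                  relevance=0.8,
                  context={"in_scope": True, "system": "payments"},
                  meaning="possible credential-stuffing attempt")
print(worth_acting_on(ce))
```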
The Papyrus Platform User-Trained Agent (UTA) has that capability within the process/application environment, and it supports UML with an embedded state/event mechanism, so it fulfills all the requests mentioned.
As such, I don't see the need for a PhD thesis to research this ...