In response to my posting about stand-alone or part of a bigger picture, Marco commented that in the event-driven world there is an opportunity to provide building blocks that may be used in various platforms and as part of multiple bigger pictures. This is a valid point. Event processing is indeed a collection of capabilities of various types -- transformation, aggregation, pattern detection and more -- and each of these can itself come in various flavors: a variety of aggregation functions, a variety of transformations, multiple patterns, and so on. Add to this the fact that event-driven computing is decoupled, so the interfaces to these components can be quite simple -- receive events and send events -- and there is an opportunity to offer a variety of such components: a kind of plug-and-play event processing. The key enabler, and currently the main obstacle, is standardization, which today is lacking in several areas:
- Interoperability standards are needed to define standard interfaces between components.
- Event meta-data standards are needed so that the events exchanged between components can be interpreted consistently by all of them.
- Language and meta-modeling standards are needed so that people can model and program across components in a seamless way.
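To make the plug-and-play idea concrete, here is a minimal sketch of what such building blocks might look like once interfaces are standardized. All names here (`Component`, `Transformer`, `ThresholdDetector`, the event fields) are hypothetical illustrations, not any existing standard: each block exposes the same simple contract -- receive an event, emit zero or more events -- so transformation and pattern-detection blocks can be chained interchangeably.

```python
from typing import Any, Dict, Iterable, List

# An event is just a bag of attributes; a meta-data standard would
# pin down how these attributes are named and typed.
Event = Dict[str, Any]

class Component:
    """Uniform interface: consume one event, emit zero or more events."""
    def process(self, event: Event) -> List[Event]:
        raise NotImplementedError

class Transformer(Component):
    """Transformation block: enrich each event with a derived attribute."""
    def process(self, event: Event) -> List[Event]:
        out = dict(event)
        out["celsius"] = (event["fahrenheit"] - 32) * 5 / 9
        return [out]

class ThresholdDetector(Component):
    """Pattern-detection block: emit an alert event above a threshold."""
    def __init__(self, limit: float):
        self.limit = limit
    def process(self, event: Event) -> List[Event]:
        if event["celsius"] > self.limit:
            return [{"type": "alert", "source": event}]
        return []

def run(pipeline: List[Component], events: Iterable[Event]) -> List[Event]:
    """Push each event through the chain; any block can be swapped out
    for another block honoring the same interface."""
    results: List[Event] = []
    for event in events:
        batch = [event]
        for component in pipeline:
            batch = [e for ev in batch for e in component.process(ev)]
        results.extend(batch)
    return results
```

Because every block honors the same receive/send contract, composing a pipeline is just list construction, e.g. `run([Transformer(), ThresholdDetector(30)], stream)` -- which is exactly the decoupling that makes standardized interfaces so valuable here.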
I believe that this is the right direction going forward, and it is a direction we at the Haifa Research Lab are investigating; it might be the key to making event processing a pervasive technology. More about this later.