Saturday, December 24, 2011
Chris Carlson has written in the Informatica Blog that there is a growing segment of applications that use event processing but are not really real-time, and that their share of the event processing market is growing. Four years ago I wrote in this blog about the term real-time and its abuse in daily usage. When some people (and vendors' marketing messages) talk about real-time, they mean "very fast", while real-time really means "within time constraints"; the time constraint can be a microsecond, a second, 5 minutes, or 2 hours.

Indeed, the early adopters of event processing, trading applications in capital markets, are based on low latency and fast reaction. However, many other types of applications use event processing for the functionality of filtering, transformation, and pattern matching (or continuous queries in the stream-oriented programming style), and the non-functional aspects are secondary.

The area I am working on these days, proactive computing, has some applications with real-time constraints, but typically not on the order of microseconds; rather, seconds to minutes. This is the case where there is a forecast of a future problem (e.g., a traffic jam will occur in 5 minutes) and a time constraint on activating an action (e.g., within 30 seconds the traffic light policies must be changed to mitigate the jam). This is a real-time application, but it has to react within 30 seconds in order to have an impact in 5 minutes.

The interesting thing is that low latency applications may be "best effort" and have no real-time constraints at all. Thus there are low latency applications, real-time applications, those that have both, and those that have neither. Interestingly, event processing applications can be found in all four groups.
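To make the distinction concrete, here is a minimal sketch (in Python, with hypothetical names such as `react_within_deadline` and `change_light_policy`) of a real-time constraint in the sense used above: what matters is not raw speed but whether the reaction completes within its deadline, be it 30 seconds or 2 hours.

```python
import time

def react_within_deadline(handler, event, deadline_s):
    """Run handler on event; report whether the real-time constraint was met.

    'Real-time' here means 'within a time constraint', not 'very fast':
    deadline_s could be 0.001, 30, or 7200 seconds.
    """
    start = time.monotonic()
    result = handler(event)
    latency = time.monotonic() - start
    return result, latency, latency <= deadline_s

# Hypothetical mitigation action: change the traffic light policy
# before the forecast jam materializes (forecast horizon: 5 minutes,
# reaction deadline: 30 seconds).
def change_light_policy(event):
    return f"policy updated for {event['junction']}"

result, latency, met = react_within_deadline(
    change_light_policy, {"junction": "J42"}, deadline_s=30.0)
print(met)
```

A best-effort low-latency application, by contrast, would simply minimize `latency` and never compare it against a `deadline_s` at all; that comparison is what makes the constraint a real-time one.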
Monday, December 19, 2011
The annual IBM "five in five" predictions have been published. You can read more about them in the IBM Smarter Planet Blog.
This year’s predictions are: