This is a blog describing some thoughts on issues related to event processing, as well as thoughts related to my current role. It is written by Opher Etzion and reflects the author's own opinions.
Saturday, April 14, 2012
On lack of monitoring
Recent financial news indicates that the Financial Industry Regulatory Authority has fined one of the analyst firms for failing to supervise equity research analysts' communications with traders and clients, and for failing to adequately monitor trading in advance of published research changes to detect and prevent possible information breaches by its research analysts. This is interesting, since it seems that the regulator now expects firms to monitor the consequences of their actions. This calls for real-time monitoring: monitoring causalities between various events, and eliminating increased trading based on unpublished information. Recently I participated in a meeting with stock exchange people in a certain country, and heard about their efforts to detect undesired phenomena in trade. They were also looking at real-time monitoring, and even proactive behavior, trying to detect undesired phenomena before they happen. I guess that we'll see more of these applications from different sides: regulators, traders, and in this case analysts.
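To make the monitoring requirement a bit more concrete, here is a minimal sketch (in Python) of the kind of causality check involved: flagging unusually heavy trading in a symbol shortly before a research change is published. The event types, window, and threshold are my own illustrative assumptions, not anything taken from the regulator or from any product.

```python
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)   # assumed look-back window before publication
VOLUME_THRESHOLD = 3.0         # assumed multiple of average trading volume

class PrePublicationMonitor:
    """Keeps recent trades per symbol so that a later research-change
    event can be checked against trading that preceded its publication."""

    def __init__(self):
        self.trades = {}  # symbol -> deque of (timestamp, volume_ratio)

    def on_trade(self, symbol, ts, volume_ratio):
        q = self.trades.setdefault(symbol, deque())
        q.append((ts, volume_ratio))
        # evict trades that have aged out of the look-back window
        while q and ts - q[0][0] > WINDOW:
            q.popleft()

    def on_research_change(self, symbol, published_ts):
        # flag any unusually heavy trading that happened before publication
        suspicious = [
            (ts, ratio)
            for ts, ratio in self.trades.get(symbol, ())
            if ratio >= VOLUME_THRESHOLD and ts < published_ts
        ]
        if suspicious:
            print(f"ALERT: {len(suspicious)} unusual trade(s) in {symbol} "
                  f"before research change at {published_ts}")

monitor = PrePublicationMonitor()
monitor.on_trade("ACME", datetime(2012, 4, 13, 10), volume_ratio=4.2)
monitor.on_research_change("ACME", datetime(2012, 4, 13, 15))
```

A real deployment would of course correlate many more event types (communications, order flow, cancellations), but the temporal pattern - unusual activity preceding an unpublished change - is the essence of what the regulator is asking firms to detect.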
Monday, April 9, 2012
On event server as the 21st century application server
Paul cites TIBCO CEO Vivek Ranadivé in TIBCO's quarterly earnings report, and concludes that an event server will be a requirement in many applications that process events in various ways.
Getting to the notion of an application server (see the illustration below, taken from an article on WebSphere Application Server):
Application servers are intended to provide services to applications such as: transactions, storage, database access, security, high availability, administration, and more.
In the event-driven world there are flowing events, and with the Internet of Things, most data in the universe will be in the form of events. In the event processing manifesto (to whose creation Paul was one of the contributing members) we talked about an "event fabric" which will enable Internet-scale sharing of events and will support many applications. Some of the fabric properties mentioned were services of privacy, security, interoperability among fabric instances, provenance, energy efficiency, autonomic computing support (self-tuning etc.), availability, scalability, anonymity, non-repudiation, and QoS with multiple criteria. These are some of the services; there are of course also functional services, like a context service, adapter and transformation services, a filtering service, an aggregation service, a pattern matching service, and more, that should be built into the server and can be used by applications from various application areas and types (BPM, CRM, social computing, track and trace, and many more).
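As a rough illustration of how such functional services might look when exposed by an event server and composed by an application, here is a minimal Python sketch. The service names and interfaces are my own assumptions, not taken from any product or from the manifesto.

```python
from typing import Callable, Iterable

Event = dict  # an event is modeled here as a plain dictionary

def filter_service(events: Iterable[Event],
                   predicate: Callable[[Event], bool]):
    """Filtering service: pass through only events matching a predicate."""
    return (e for e in events if predicate(e))

def transform_service(events, fn):
    """Transformation service: derive a new event from each input event."""
    return (fn(e) for e in events)

def aggregate_service(events, key, agg):
    """Aggregation service: group events by a context key and aggregate."""
    groups = {}
    for e in events:
        groups.setdefault(e[key], []).append(e)
    return {k: agg(v) for k, v in groups.items()}

# Composing the services, as an application built on the server might:
readings = [
    {"sensor": "s1", "temp": 21.5},
    {"sensor": "s1", "temp": 35.0},
    {"sensor": "s2", "temp": 34.1},
]
hot = filter_service(readings, lambda e: e["temp"] > 30)
by_sensor = aggregate_service(hot, key="sensor",
                              agg=lambda es: max(e["temp"] for e in es))
print(by_sensor)  # {'s1': 35.0, 's2': 34.1}
```

The point of the analogy to the application server is exactly this composition: the application supplies only the predicates, transformations, and aggregations, while the server supplies the services themselves, together with the non-functional properties (security, availability, scalability) listed above.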
Paul rightly notes that standards have a key role in establishing such an event server; indeed, Paul wrote the standards chapter in the manifesto.
I think that the equivalent of an app server based on events is inevitable, since events will be at the heart of all applications that take sensor data as input. Work on standards in this area is an old dream, and I hope that we'll be able to advance towards it.
Sunday, April 8, 2012
On two-tier analytics
Will Cappel from Gartner has written about two-tier analytics and went back to Immanuel Kant (in the picture above) as support for his thesis. Kant argued that human cognition works at two levels: a first level that grasps objects and raw facts about them, and a second level that captures causality between these objects over space and time. Applying some simplification to what Kant said, he is right. Cappel makes the analogy to the analytics world, and says that the first level is satisfied by event processing, which processes events by filtering, transformation, and pattern detection to identify higher-level situations. The second level is satisfied by pattern discovery engines that work on top of the first level.
This is an interesting observation; I think that the picture is somewhat more complicated, as there are more tiers. Event processing detects patterns in real time, and indeed one of the ways to obtain these patterns is pattern discovery over historical data, which may include the results of event processing systems. But it should also include many other data items that describe the impact on the environment: since situation detection triggers actions, and actions impact the environment, pattern discovery needs feedback from the outcome along with feedback from the process itself.
The interesting part comes when we add real-time adaptation to the picture; here, similarly to how cognition works, the causality relations may change on the fly. Consider traffic management systems: studies show that these systems are chaotic in nature, and one cannot forecast patterns of behavior based on past experience with sufficient accuracy. Forecasting is limited to about 15 minutes into the future in some cases, and the control policies for highways should constantly adapt. Here we need four-tier analytics:
The first tier is the off-line tier, which changes the settings of the system based on historical learning.
The second tier is the event processing tier, which observes and monitors.
The third tier is the real-time forecasting tier, which adapts the causalities and makes the short-term forecast.
The fourth tier is the real-time decision making tier, which makes the best decision possible within the time frame allocated for the decision (which may not be the globally optimal solution); a minimal sketch of how these tiers compose appears below.
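As a minimal sketch, assuming a hypothetical traffic-management setting, the four tiers might compose like this. All function names, thresholds, and data are illustrative stubs; the structure of the loop, not the logic inside each tier, is the point.

```python
# Illustrative four-tier analytics loop for a traffic-management system.

def offline_learning(history):
    """Tier 1 (off-line): derive system settings from historical data."""
    return {"congestion_threshold": 0.8}   # assumed learned setting

def event_processing(stream, settings):
    """Tier 2: observe and monitor; detect situations in real time."""
    return [e for e in stream
            if e["occupancy"] > settings["congestion_threshold"]]

def short_term_forecast(situations):
    """Tier 3: adapt causalities and forecast roughly 15 minutes ahead."""
    return {"congestion_in_15min": len(situations) > 2}

def real_time_decision(forecast):
    """Tier 4: best decision within the allotted time frame,
    not necessarily the globally optimal one."""
    return "reroute" if forecast["congestion_in_15min"] else "no_action"

settings = offline_learning(history=[])
stream = [{"occupancy": 0.9}, {"occupancy": 0.85}, {"occupancy": 0.95}]
situations = event_processing(stream, settings)
decision = real_time_decision(short_term_forecast(situations))
print(decision)  # reroute
```

Note that the loop closes: the decisions and their observed outcomes feed back into the history that the off-line tier learns from, which is exactly the feedback-from-the-outcome point made above.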
Bottom line: I agree with Cappel about the multi-tier approach, while pointing out that reality is somewhat more complicated...