Showing posts with label event processing application development. Show all posts

Friday, December 17, 2010

Who is the developer of event processing applications?



One of the topics frequently discussed recently is: who should be the developer of event processing applications? A computer programmer (the top picture), or a business analyst? The bottom picture, taken from a site called "business analysts mentor", shows the business analyst's skills. Will a future list also include event processing development?


In the BRMS area, one of the claims is that business analysts can develop, maintain, modify, and manage rules.
There is still a need for programmers to set up the data, connect it, deploy it, etc.


The desire to have business analysts develop event processing applications exists in the industry; there are surveys indicating that most users want it, and it fits a general trend in enterprise computing.


Turning to event processing: there are still challenges in getting business analysts to develop event processing applications. The challenge stems from the fact that the possibilities in event processing are quite wide, and thus there are two main options:



  1. Restrict the expressive power and provide the business analyst with an "easy" sub-language. This may be enough for some classes of applications, but not for others.
  2. Provide assisting tools that let business analysts cope with the entire capabilities of the event processing language (e.g. patterns, policies). This requires both the right abstractions (an intention language) and tools to validate that the system is working properly.


Why is it not easy? First, the semantics of the various event processing functions need to be understood; while this is not a technical difficulty, it requires understanding and training. Second, understanding the flow, i.e. the interactions among the various functions, adds to the complexity.
It is doable, but requires more work on setting up the right tools. More on this later.

Friday, June 4, 2010

On business users' computing

Friday is our shopping day (our working week is Sunday to Thursday, so Friday and Saturday are the weekend days here). Today, in a coffee shop in our neighborhood shopping center, I noticed a sign: "We don't accept checks or credit cards". They meant, obviously, that they accept neither checks nor credit cards, but if we fed this sentence to a computerized program that parses logical propositions, I am not sure what the result would be. There is a famous experiment in which people were asked to formalize a query about people who live in New York or in Philadelphia, and many formalized it as people who live in New York and Philadelphia, which reduces the answer to a person who happens to have addresses in both cities. I mention it because I recently had some discussions about the ability of business users to write their own {rules, patterns, queries} - choose your favorite terminology. There are users, whom I call "semi-technical users", who do not know how to program in any programming language but can be taught to formalize their requirements in an accurate way. However, there are users who are not able to do it, and giving them natural language interfaces will lead to unexpected results.

So, should we give up on that? Not necessarily; there are several ways to advance here. One of them is to have parameterized rules/queries/patterns and let the user instantiate them by entering parameters. This, of course, limits the scope of programming to the predefined patterns created by the IT developer. Another way is to use some kind of dialog to interview the user about the precise meaning. This area needs more work, but it is vital in extending the scope of those who can program event processing applications (and some tangent areas, like business rules). More about it later.
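The parameterized approach can be sketched in a few lines; a minimal Python illustration (the `ThresholdPattern` class and its parameters are hypothetical, not taken from any particular product):

```python
# A parameterized pattern prepared by an IT developer; a business user
# "programs" only by supplying parameter values, never by writing logic.
class ThresholdPattern:
    """Emit a match when a numeric attribute exceeds a threshold."""
    def __init__(self, attribute, threshold):
        self.attribute = attribute    # parameter chosen by the user
        self.threshold = threshold    # parameter chosen by the user

    def match(self, event):
        return event.get(self.attribute, 0) > self.threshold

# The user instantiates the predefined pattern with parameters only:
large_withdrawal = ThresholdPattern(attribute="amount", threshold=5000)

events = [{"type": "withdrawal", "amount": 200},
          {"type": "withdrawal", "amount": 9000}]
alerts = [e for e in events if large_withdrawal.match(e)]
print(alerts)  # only the 9000 withdrawal matches
```

The trade-off described above is visible here: the user can only vary `attribute` and `threshold`; any pattern shape the IT developer did not predefine is out of reach.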

Sunday, June 28, 2009

On hard coding event processing functionality

Today, again, I did not celebrate my birthday; I am not in the habit of celebrating a reminder of the fact that I am getting older. Interestingly, I got many more birthday greetings today (relative to recent years) through various communication channels: an e-card through the Internet, phone calls, email, and even real-time messaging. The increase could be attributed to the fact that I made many friends in the last year, but the more realistic reason is that certain social networks send people reminders about the birthdays of others in their network, which makes the knowledge of the birthday more accessible. Well, none of my family remembered my birthday; here, at least, no change from previous years.

One of the many meetings I had last week was a teleconference with some IBM customers (I shall not expose their identity). My role in IBM is not in sales or sales support, but I am called from time to time to participate in meetings with customers, at the request of the people handling such meetings. The major issue they wanted me to discuss was the benefit of using middleware software for event processing (in the large sense) vs. hard coding it into the application code itself, as this customer is used to doing.

Indeed, this is not a clear-cut issue; there are two cases in which hard coding the event processing functionality makes sense: either the functionality is very simple, and thus it is not cost-effective to purchase, learn, and deploy a specific product, or the functionality is too specific and not covered by products, and furthermore does not represent a ubiquitous requirement that would make it cost-effective for vendors to support it.

There are also cases in which it is reasonable to write some functions in assembly language (even I did some of this in the past), but typically most people prefer to code in a higher-level language.

However, many applications satisfy neither of these conditions, and for these applications it is cost-effective to use generic software. The reason is that there are various common functions (e.g. filtering, routing, enrichment, transformation, pattern matching) that recur. Hard coding them means re-inventing the wheel over and over again, instead of reusing existing implementations as "services" and enjoying other people's work, which is upgraded and optimized over time. This is similar to the reasoning behind using other generic products: messaging, workflows, databases, adapters, development environments, and others.
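As a rough sketch of why these common functions are generic, here is what reusable filtering, enrichment, and transformation could look like; all names are illustrative, not taken from any specific middleware:

```python
# Generic, reusable event processing functions. Hard coding these per
# application means re-implementing the same logic again and again.
def filter_events(events, predicate):
    """Keep only events satisfying the predicate."""
    return [e for e in events if predicate(e)]

def enrich(events, lookup, key, new_field):
    """Add reference data (e.g. a customer name) to each event."""
    return [{**e, new_field: lookup.get(e[key])} for e in events]

def transform(events, fn):
    """Apply a per-event transformation."""
    return [fn(e) for e in events]

customers = {42: "Acme Corp"}
events = [{"customer_id": 42, "amount": 120.0},
          {"customer_id": 42, "amount": 3.5}]

# A pipeline composed from the generic functions:
pipeline = transform(
    enrich(filter_events(events, lambda e: e["amount"] > 100),
           customers, "customer_id", "customer_name"),
    lambda e: {**e, "amount_cents": int(e["amount"] * 100)})
print(pipeline)
```

The application-specific parts are only the predicates, lookup tables, and transformation functions passed in; the plumbing is the reusable "service".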

The reaction of this particular customer was interesting: "What you are saying makes much sense; currently we are not used to thinking in terms of separating the event processing from the rest of the application logic, and we need to digest the idea". So there will be a follow-up meeting to continue discussing it. I have seen this kind of reaction before; I think that a challenge for event processing vendors may be the competition with potential customers who do not understand the benefits over hard coding. But this was also the case for other technologies that succeeded in crossing this bridge. The growing number of event processing customers indicates that this thought is gaining traction.

I think that this is also a topic for a community effort, that may be pursued by EPTS. More on this topic -- later.

Friday, March 6, 2009

On event processing engines and platforms


Today, Friday, is part of our weekend, so it is a good time to do shopping and other arrangements.
My wife and I went to our local friendly bank to open a new account for some purpose. The lady who handles our account said that they have new software for opening an account that is extremely difficult to operate, with many screens whose questions one has to understand, and suggested she would do it offline and call us when it was ready, so we could come to sign the papers. Once, opening an account was simple and lasted a few minutes, just signing some forms; the more sophisticated software becomes, the more difficult it is to operate, and sometimes it becomes an obstacle to the business. Often, developers don't really care about the human engineering aspects. Hans Gilde wrote recently about the fact that CEP software is not smart. I agree. On several occasions I have given a talk to an audience of high-school students that gives a rough introduction to AI, under the title: can a computer think? While there are some works in AI that strive for it, today's software cannot really think and is not really smart. One can use software to do things that look smart, but the wisdom is not in the software itself; it is in the way it is used. In the bank's case, the software does not even look smart...

This week I had three visitors from Germany, Rainer von Ammon and two of his CITT colleagues, and we made some progress towards defining the EDBPM project that we plan to submit as an EU project. They asked me to pose in my office under my "wall of plaques" (half of them are in Hebrew, so they could not really read them...). So this is my most recent picture.




One short clarification, following my posting entitled "event processing platforms - yes, but...": I received some private communication claiming that there is confusion between the terms "platforms" and "engines". The claim is that there are vendors who refer to their engines as platforms; moreover, some people refer to any run-time software as an engine. So I thought it worth clarifying how I see the distinction:
  • An Event Processing Platform is software that enables the creation of an event processing network, handles the routing of events among agents, and takes care of management and other common infrastructure issues.
  • An Event Processing Engine is software that implements the actual processing function - in EPN terms, implementing agents.
This is similar to the difference between an application server and a single component.
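The distinction can be sketched in code; a toy Python illustration of the definitions above, where `Platform` owns the routing among agents and each `Engine` implements one agent's function (both classes and all names are hypothetical):

```python
# Engine: implements the processing function of a single agent.
class Engine:
    def process(self, event):
        raise NotImplementedError

class DoublingEngine(Engine):
    """A trivial agent function: double the event's value."""
    def process(self, event):
        return {"value": event["value"] * 2}

# Platform: hosts engines and handles the routing among them.
class Platform:
    def __init__(self):
        self.engines = {}   # agent name -> engine instance
        self.routes = {}    # agent name -> next agent name (or None)

    def register(self, name, engine, next_agent=None):
        self.engines[name] = engine
        self.routes[name] = next_agent

    def submit(self, name, event):
        """Route an event through the chain of agents starting at `name`."""
        while name is not None:
            event = self.engines[name].process(event)
            name = self.routes[name]
        return event

platform = Platform()
platform.register("double", DoublingEngine(), next_agent="double_again")
platform.register("double_again", DoublingEngine())
print(platform.submit("double", {"value": 3}))  # {'value': 12}
```

Note that `DoublingEngine` knows nothing about routing, and `Platform` knows nothing about what any engine computes; that is exactly the application-server vs. component analogy.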

What is the connection between them?
  • At one extreme there are closed platforms, i.e. platforms that can run only one type of engine; in this case the distinction becomes fuzzier.
  • At the other extreme there are open platforms, which can run multiple engines; in this case the two concepts are totally separate. The main issue here is that different engines may come with a collection of different languages, and this may make the development of an application more difficult.
The first generation of event processing started with stand-alone engines; the emergence of platforms, and making them open, are signs of the second generation. I'll say more about the challenges of constructing the next generations later.

Friday, February 20, 2009

On static and dynamic event flows


This picture, which I found in one of the Web albums under "Minneapolis pictures", shows static and dynamic flows.

In continuation to my previous posting on event flow and decoupling, I would like to discuss the issue of static vs. dynamic event flows.

I already discussed the fact that event processing applications can be of many types, and naturally various types have their own properties.

There are applications whose nature is totally dynamic; an example is the dissemination of alerts about customers' activities in banking systems. There are many subscribers who can subscribe to multiple types of alerts and change their subscriptions from time to time. In this type of application, monitoring the event flow can be done for system management purposes, e.g. collecting statistics about patterns of use, tracing individual flows for exception handling, etc. However, there is no sense of a global event processing network, as there are many unrelated flow islands.

On the other hand, there are event processing applications in which the flows are relatively static: there is a relatively stable set of event processing agents with a relatively stable collection of relationships among them. Actually, many of the event processing applications I have encountered are of this type. Example: an event processing application that manages an auction. The flow here is fixed as long as the auction protocol does not change; thus the collection of event processing agents and their relationships is fixed. Of course, the run-time instances are still dynamic. This is similar to a database schema that may be relatively stable, while the data itself is dynamic.
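A static network of this kind can be declared once, like a schema, and then analyzed; a minimal sketch in which the agent names and event types are invented for illustration:

```python
# A static event processing network, declared once like a database schema.
# The agents and their relationships are fixed; the run-time event
# instances that flow through them are dynamic.
AUCTION_EPN = {
    "bid_validator":    {"subscribes": ["bid_submitted"],
                         "publishes": ["valid_bid"]},
    "high_bid_tracker": {"subscribes": ["valid_bid"],
                         "publishes": ["new_high_bid"]},
    "auction_closer":   {"subscribes": ["auction_timeout"],
                         "publishes": ["auction_closed"]},
}

def consumers_of(epn, event_type):
    """Static analysis over the fixed network: who consumes this event type?"""
    return sorted(name for name, agent in epn.items()
                  if event_type in agent["subscribes"])

print(consumers_of(AUCTION_EPN, "valid_bid"))  # ['high_bid_tracker']
```

Because the network is a fixed, declared artifact, questions like "who consumes this event?" can be answered by static analysis, exactly the kind of property that the flow-modeling benefits listed below rely on.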

Flow modeling is helpful for:
  • the "software engineering" aspect --- debugging, validation, analysis
  • the performance aspect --- enabling scale-out by semantic partitioning, a topic we are working on and that I'll discuss in detail in a future posting
  • the management aspect --- provenance, tracing, monitoring
There are more questions that need discussion about dynamic updates to an event processing network, and I'll discuss them in the near future -- more later.

Wednesday, February 18, 2009

On Event Flow and Decoupling


This is a simulation of an anesthesia workstation; it can simulate various cases that create a flow of events inside this configuration, e.g. what happens when there is a power failure.

I was recently asked if there is no contradiction between two concepts:
  • The decoupling concept: each event processing agent is independent; it subscribes to some events and publishes derived events, is independent of any other agent, and, being decoupled, does not know anything about the other agents.
  • The event flow concept, in which there is explicit modeling of event flows.
My answer is that there is no real contradiction, since these two principles live at two different levels. The decoupling is at the execution level: event processing agents indeed do not need to communicate with one another, since there is no RPC or any other synchronous communication among them. The event flow concept exists in the modeling and management layers. In the modeling layer, there should be a view of the entire "event processing network" to ensure that the orchestra plays together; in the management layer, there should be a possibility to trace back the provenance of a certain decision or action, or to trace forward the consequences of any event. This still does not require violating the decoupling at the execution layer; that's the beauty of model-driven architecture... more later.
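The separation of levels can be sketched as follows: agents interact only through a broker (execution-level decoupling, no agent holds a reference to another), while the broker's log gives the management layer a provenance trail. All names here are illustrative:

```python
# Execution-level decoupling: agents publish and subscribe to event types
# via a broker and never reference each other. The broker's log serves the
# management layer (provenance), without violating the decoupling.
class Broker:
    def __init__(self):
        self.subscribers = {}   # event type -> [(agent name, callback)]
        self.log = []           # provenance trail for the management layer

    def subscribe(self, event_type, agent_name, callback):
        self.subscribers.setdefault(event_type, []).append((agent_name, callback))

    def publish(self, event_type, payload, source="external"):
        self.log.append((source, event_type))
        for agent_name, callback in self.subscribers.get(event_type, []):
            callback(payload)

broker = Broker()

# Agent A derives an event; it knows only the broker, not agent B.
def agent_a(payload):
    broker.publish("derived", payload * 2, source="agent_a")

results = []
def agent_b(payload):          # Agent B consumes the derived event.
    results.append(payload)

broker.subscribe("raw", "agent_a", agent_a)
broker.subscribe("derived", "agent_b", agent_b)
broker.publish("raw", 21)
print(results)        # [42]
print(broker.log)     # [('external', 'raw'), ('agent_a', 'derived')]
```

Tracing provenance back from the derived event is a pure read of `broker.log`; neither agent had to know about the other for that capability to exist.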

Wednesday, September 17, 2008

On the Gartner EPS 2008


Early morning in Stamford. The "hype cycle" has become Gartner's best-known artifact, and its constant flow of TLAs - SOA, BAM, EDA, RTE, XTP - are all Gartner's words. The Gartner EPS ended last night, and in less than two hours we'll start the EPTS 4th event processing symposium (so Roy Schulte will be able to relax and I'll start sweating).
Some short impressions from the conference (besides the networking and meeting again old friends).
  • The Gartner analysts came with new slides, but very little new insight relative to their past messages.
  • Mani Chandy gave an interesting talk, saying that EDA is a natural continuation of SOA and not a paradigm shift (a good topic to discuss). He also said that one of the big benefits of EP is saving people time by filtering and aggregating flowing information.
  • David Luckham talked about "holistic event processing" as the future, which from a technology point of view means dynamic, big event processing networks. He also talked about the need for formally defined, clear semantics for event processing languages (well, that is what we are trying to do in EPDL).
  • Marc Adler gave a great talk (in my opinion, the best one) about his experience in developing a CEP application, which seems to be a success story. He talked about criteria for selecting a vendor, difficulties in the development itself, and shortcomings of the state of the art. One of his insights: unlike the "stream SQL" fans, who say that since people know SQL anyway it is a good basis, he claims that while the syntax looks like SQL, it requires a totally different type of thinking, and knowledge of SQL does not help in getting it.
  • Richard Brown gave a talk about collecting events from text - news, blogs, etc. I think that getting events from unstructured data has a lot of potential. I also talked with somebody who told me about a start-up that extracts events from video cameras; the scenario of tracing the behavior of people who fit the shop-lifting pattern (discussed recently in blogs) may become reality soon - it seems that the technology is getting there.

That is all my time permits me to write today -- more later.

Tuesday, September 2, 2008

On flow oriented and component oriented development of EP applications




I got an invitation from some company I have not heard of, for the kickoff of a product in the area of "sitting in front of computers". I don't have time to go, but since they also attached a picture of their product, I have copied it; I wonder if this is the current trend in ergonomics...
Anyway, today, some thoughts following recent discussions here about development tools for event processing applications.
There are two possible ways, from the developer's point of view, to build an event processing network.
  • The first, which I'll call "component oriented" (maybe there is a better name?), in which the developer defines the different components (patterns, rules, queries - use your favorite language style), each of them individually, and then some compilation process builds the network implicitly. This is a kind of "bottom-up" approach.
  • The second, which I'll call "flow oriented", in which the developer has a graphical representation of the flow; when building the flow, one can put boxes inside it and zoom into each box to define the component. This is a kind of "top-down" approach.

It seems that each of them has benefits under different assumptions. If the application is dynamic and mainly subscription based, then the first approach is probably better, since the notion of flow is not a stable one; if the application is relatively static, then there is a benefit to using the second approach, since it can provide more visibility into what the application as a whole is doing and help with validation - the "decoupling" principle may give developers a feeling of chaos (this is indeed one of the barriers to the use of event processing...). As said, the "flow oriented" approach can ease the validation of an event processing application; there are also some tools that help validate the "implicit" flow, but the validation issue deserves a discussion in its own right. More - later.
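The "component oriented" approach can be sketched as follows: components are defined individually with their input and output event types, and a compilation step derives the network implicitly (a toy Python illustration; all names are invented):

```python
# "Component oriented" development: each component is declared on its own,
# naming only the event types it consumes and produces. No flow is drawn.
components = [
    {"name": "filter_large", "in": "trade",       "out": "large_trade"},
    {"name": "aggregate",    "in": "large_trade", "out": "daily_total"},
    {"name": "alert",        "in": "daily_total", "out": "alert"},
]

def compile_network(components):
    """The implicit compilation step: connect each producer of an event
    type to every component that consumes that type."""
    producers = {c["out"]: c["name"] for c in components}
    return sorted((producers[c["in"]], c["name"])
                  for c in components if c["in"] in producers)

print(compile_network(components))
# [('aggregate', 'alert'), ('filter_large', 'aggregate')]
```

A "flow oriented" tool would start from the edge list that `compile_network` produces here and let the developer zoom into each box; the bottom-up approach recovers that same graph only after the fact, which is why the whole-application visibility argument above favors the flow-oriented style for static applications.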