Showing posts with label ROI.

Sunday, October 7, 2012

On big data, small things and events that matter

In a recent post on the Harvard Business Review blog entitled "Big Data Doesn't Work if You Ignore the Small Things that Matter", Robert Plant argues that in some cases organizations invest a lot in "big data" projects, trying to get insights around their strategy, while failing to notice the small things, like customers leaving due to bad service. Indeed, big data and analytics are now fashionable and somewhat over-hyped. There is also a belief, fueled by the buzz, that big data solves all the problems of the universe, as argued by Sethu Raman in his DEBS'12 keynote address. Events play in both the big data game and the small data game, observing a current happening -- such as a time-out on a service or a long queue -- when it relates to service, and other phenomena in other domains. Sometimes the small things are the most critical.
I'll write more about big data and statistical reasoning in a subsequent post.

Sunday, November 8, 2009

On the Event Processing book by Chandy and Schulte



Today I got a package of books from Amazon that included two new event processing related books. I'll review the first of them today. This is the book by Mani Chandy and Roy Schulte, called "Event Processing - Designing IT Systems for Agile Companies". The title itself ("agile companies") indicates that the book is business oriented, and indeed it primarily answers the questions: why use event processing, and how does it relate to other concepts in enterprise architecture. The book is non-technical and fits the level of managers, CIOs and business analysts. It starts with an overview and the business context of event processing, discusses business patterns of event processing (another type of pattern, besides all the other types of event processing patterns), and covers the costs and benefits of event-processing applications and the types of event processing applications. After the ROI part, it moves to a more architectural discussion, taking a top-down approach: EDA, events, and employing the architecture. Next there are two chapters positioning event processing against the rest of the universe: SOA, BPM, BAM, BI and rule engines (I'll write about these positioning attempts in later postings). Towards the end there is a chapter of advice on how to handle event processing applications (this chapter reads like an analyst report). The last chapter talks about the future of event processing, again from a business perspective: future applications, barriers and dangers (again a topic to which I should dedicate a complete discussion), and drivers for adoption.

In conclusion: a good book for anybody who wants to know what event processing is and what its business value is. Things that I thought such a book might also include: some reference to what currently exists in the industry, how the state of the practice relates to the theoretical concepts presented in the book, when COTS event processing should be used vs. hard-coded, and what the practical considerations of event processing applications are (maybe in the second edition?).

For those who asked me what the relationship is between this book and the book Peter Niblett and I are writing, the answer is that our book has a totally different focus: it explains, step by step, what is needed to build an event processing technology, and gives the reader an opportunity to experience the various approaches in the state of the practice by providing free downloadable versions of various products and open source. The target population is also different: we aim for designers, architects, developers and CS students, while the book by Mani and Roy is aimed at managers, business analysts and MBA students. The review of the second related book - later.

Sunday, April 5, 2009

On the "Return on Investment" in Event Processing



Part of the orders I got from my physician, with which I humbly comply, is to spend around one hour walking every day; if I have time I do it outside, if not I do it at home on an electric walker. Yesterday I walked around not far from home and decided to take a shortcut through a woodland that was nearby but not familiar. I saw a trail that seemed to go up the hill, whose top was supposed to lead me back to somewhere in my neighborhood. There was a split in the trail and I chose one branch at random, and after five minutes realized that it led to a dead end; however, I did not feel like going back down, so I continued to climb and pave my way among bushes, fallen trees, etc. -- quite irresponsible of me, especially as it was getting dark. After 40 minutes of wandering I saw the back yard of a house, navigated there, and got back safe and sound, indeed to somewhere in the neighborhood. Somehow it felt like a return to childhood -- but not for long.

Anyway, today I would like to write something about the "Return on Investment" in event processing.


Mark Palmer, the current CEO of Streambase, has recently blogged that CEP is not about "feeds and speed" but about "ease of use". It is actually refreshing to see this from a Streambase person, since in the past some Streambase people claimed that the only reason to use a CEP engine is its scalability properties. Actually, I wrote about this in one of my first postings on this blog, entitled "the mythical event per second". I agree that there are some applications for which satisfying high throughput or other QoS metrics is a crucial requirement, but this is a secondary type of ROI. The major one is providing abstractions that reduce the cost of developing, and consequently maintaining, event-driven applications. This is similar to what the DBMS discipline provides us: as a grey-bearded old timer who is not completely senile, I still remember the times we worked with file systems; the DBMS provided many abstractions that made data-oriented applications much easier to develop. The same goes for event processing. I constantly say to people who ask "is there something new in event processing?" that the answer is: not really, event processing was hard-coded within regular programming for ages; however, since traditional programming languages and environments were not created to process events, the manual work required is quite substantial. The reduction in cost relative to hard coding can be substantial, and some customers have estimated it at a 75% reduction. It would be interesting to do an empirical study about it, probably a challenge for our EPTS use case work group. More about ROI -- in later posts.

Sunday, February 1, 2009

On Off-Line Event Processing



A comment made by Hans Glide on one of my previous postings on this blog prompted me to dedicate today's posting to Off-Line Event Processing. Well - as a person who is constantly off any line, I feel at home here...

Anyway -- some people may wonder and think that the title above is an oxymoron, since they put "real-time" as part of the definition of event processing. I have used this picture before, as it best describes some of what is written about event processing - by everybody:



This, of course, illustrates a collection of blind people touching an elephant; each of them will describe the elephant quite differently. The phenomenon of people saying "event processing is only X", where X is some subset of the area, is quite common. In our case X = "on line".

The best thing here is to describe a concrete example of a customer application I am somewhat familiar with. The customer is a pharmaceutical company which monitors its supplier-related activities. It looks at events related to those activities and checks them against its internal regulations. The volume of such events is several thousand per day, and from a business point of view there are no real-time requirements: a regulation violation can be observed, and action taken, on the next day. The way this system works is to accumulate events during the day and activate the event processing system at the end of each day, which is actually batch processing done off-line.

An interesting question is why this customer chose to use an event processing system rather than the more traditional approach of putting everything in a database and using SQL queries. The answer is quite simple: this application has some interesting properties:
  • The number of regulations is relatively high (in the higher range of three digits);
  • Many of the regulation rules are detections of temporally oriented patterns that involve multiple events;
  • Regulations are inserted or modified frequently.
Given all this, it turned out that using an event processing system off-line was the most cost-effective solution. While using SQL is nominally possible, writing these regulations in SQL is not easy, and their sheer number makes the investment in development and maintenance quite high.
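Purely as an illustration (the customer's actual rule language and event schema are not shown here, so the event names and the 48-hour window below are hypothetical), here is a minimal Python sketch of one such temporally oriented, multi-event pattern, run as an end-of-day batch over the accumulated events:

from datetime import timedelta

# Hypothetical regulation: every "shipment_received" event must be followed
# by a "quality_inspection" event for the same supplier within 48 hours.
WINDOW = timedelta(hours=48)

def check_inspection_regulation(events, run_time):
    """events: the accumulated supplier events (today's batch plus enough
    recent history to cover the 48-hour window), each a dict with
    'type', 'supplier' and 'timestamp' (datetime) keys.
    run_time: the time of this end-of-day batch run."""
    events = sorted(events, key=lambda e: e["timestamp"])
    violations = []
    for i, e in enumerate(events):
        if e["type"] != "shipment_received":
            continue
        deadline = e["timestamp"] + WINDOW
        if deadline > run_time:
            continue  # window still open; this shipment is decided in a later run
        followed = any(
            f["type"] == "quality_inspection"
            and f["supplier"] == e["supplier"]
            and e["timestamp"] < f["timestamp"] <= deadline
            for f in events[i + 1:]
        )
        if not followed:
            violations.append({"type": "regulation_violation",
                               "supplier": e["supplier"],
                               "shipment_time": e["timestamp"]})
    # A real deployment would also avoid re-reporting violations already
    # raised in earlier runs; that bookkeeping is omitted here.
    return violations

Expressing the same "A not followed by B within a window, per supplier" pattern in SQL typically requires a self-join plus a NOT EXISTS subquery with timestamp arithmetic; multiplied by several hundred such rules, that is what drives the development and maintenance cost mentioned above.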

So - the benefit of using event processing here is neither the real-time aspect, nor high throughput support, but simple TCO considerations.

This is not the only application of this type; in fact, I have seen several other cases in which event processing has been used off-line. There is also another branch of off-line processing which combines on-line and off-line together, but I'll write about that in another posting...

More - Later.

Sunday, June 29, 2008

On cost-effective EP when high performance is not required


I have already referred to this issue in the past. There are several concrete examples that come to mind - here is one (real system):
An enterprise has internal regulations for handling suppliers. These regulations relate to the interaction with suppliers, which may be represented as events, and involve actions that must be done, or must not be done, within a certain amount of time or until some event happens. The application monitors compliance with these internal regulations in audit mode, meaning that the results are alerts (which are also treated as events, since there are time-outs on the alerts that are sent) and not direct interference in the business processes.
The throughput is far below what is considered "high throughput": several thousand events per day. The latency requirement is also not extremely low -- if the alert is issued within a minute or two, that is still very fast auditing. The application also does not require any analytics or intelligent procedures, since the regulations are given and deterministic.
What are the benefits for the customer of using EP software rather than any other solution?
First - some of the regulations are fairly complex; writing them in hard code can be time- and resource-consuming. Putting everything in a database and using SQL was also considered, but some of the regulations are not easy to express in SQL.
Second - regulations tend to change frequently, the users wish to control these changes, and getting them through the slow IT development cycle would delay the introduction of each change.
So in this case the customer's motivations are agility and reduced complexity; more later.
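To make the "time-bound obligation plus alert time-out" shape concrete, here is a small hedged sketch in the same spirit as the earlier one; the event names ("purchase_order_sent", "order_confirmation", "alert_ack") and the deadlines are invented for illustration and are not taken from the real system:

from datetime import timedelta

# Hypothetical regulation: a "purchase_order_sent" event must be followed by an
# "order_confirmation" from the same supplier within 3 days, otherwise an alert
# event is emitted. The alert itself has a time-out: if no "alert_ack" arrives
# within 1 day, an escalation event is emitted.
CONFIRM_DEADLINE = timedelta(days=3)
ALERT_TIMEOUT = timedelta(days=1)

def audit(events, now):
    """events: the audit log, dicts with 'type', 'supplier', 'timestamp' keys
    (alerts emitted by earlier runs are fed back in as events).
    now: the time of this audit run. Returns newly derived events."""
    derived = []
    for e in events:
        if e["type"] == "purchase_order_sent":
            confirmed = any(f["type"] == "order_confirmation"
                            and f["supplier"] == e["supplier"]
                            and f["timestamp"] > e["timestamp"]
                            for f in events)
            if not confirmed and now > e["timestamp"] + CONFIRM_DEADLINE:
                derived.append({"type": "alert",
                                "supplier": e["supplier"], "timestamp": now})
        elif e["type"] == "alert":
            acknowledged = any(f["type"] == "alert_ack"
                               and f["supplier"] == e["supplier"]
                               and f["timestamp"] > e["timestamp"]
                               for f in events)
            if not acknowledged and now > e["timestamp"] + ALERT_TIMEOUT:
                derived.append({"type": "escalation",
                                "supplier": e["supplier"], "timestamp": now})
    # De-duplication of alerts and escalations across runs is omitted for brevity.
    return derived

Even in this toy form, the checks cut across multiple events and time windows; hand-coding hundreds of them, and letting business users change them without a full IT development cycle, is exactly where the agility argument above comes in.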

Tuesday, February 12, 2008

Hype vs. value -- a constructive view


Besides free advertising for an e-book, the illustration above, together with the title, indicates that I am going to write something constructive. In the previous posting, on "bitter pills", I provoked the evil spirits and got sick with the flu, so after some almost sleepless nights I have been too tired to blog -- today is much better, so back in business. Several people complain that we need to get from hype to more impact on business, that something is lacking today, etc. -- all true! I also said that we need to take a constructive view: not only complain, but see what should be done. So here is an initial attempt to do so; actually nothing new here, just a list of observations made by various people:
  1. More packaged applications based on EP technology should be constructed -- the model of using an EP tool as an application development tool covers only a small part of the potential market, since, at the end of the day, business executives are interested in applications and not in technology.
  2. Enable business users (non-IT developers) to control the behavior (e.g. define, modify and compose patterns/rules), since agility is an important expectation of the customers.
  3. Learn how to articulate the business value -- yes, we all know "threats and opportunities", but this is somewhat too abstract for the decision makers. The business value is also not unique: there are various types of applications with different business values, and explaining the right one is crucial. More thoughts about the various types of business value -- in one of the next postings.
  4. Advance on standards -- customers don't like proprietary.

Of course, this is only an initial list, not even touching on advances in technology, but it is enough to start the discussion - more later.