Showing posts with label Roy Schulte. Show all posts

Friday, June 21, 2013

Event processing platforms - reboot?

Doug Henschen, the editor of InformationWeek, wrote a commentary entitled "Big data reboots real-time analysis". Henschen says that event processing was at the height of its hype in 2008, but that the economic crisis stopped the growth of the area. He sees indications of a "reboot" in the recent acquisitions of Apama by Software AG and StreamBase by TIBCO, and attributes the reboot to big data's need to evolve from its batch origins toward detecting patterns in moving data.
As I have written before, the barriers to growth stem partly from external factors (certainly the general financial situation), but also from the over-hype of request-response and batch-oriented analytics (see my post on Sethu Raman's keynote at DEBS 2012). Another reason, as Roy Schulte observed last year, is that many enterprises developed in-house solutions. I assume Henschen is right in the sense that big data gives additional opportunities to event processing technology, and that the recent acquisitions will create waves of interest in the market. As I have written before, the next frontier is not improving the technology but making it accessible to business users and converting enterprises to thinking in an event-driven way. Jeff Adkins and I will discuss this issue in the coming DEBS'13 tutorial on June 30. More later.

Sunday, July 29, 2012

Event processing in practice


My old friend Roy Schulte from Gartner, who was the first analyst to publish analyst reports about event processing and is one of the founding fathers of the event processing community and of EPTS, has published a "personal blog" on the complexevents site with the provocative title "Does anybody care about event processing?". Roy expresses his opinion (which I have heard from him several times before) about the event processing concept and market. Roy's short answer to the question he asks is YES, but many of those who are doing event processing are not aware that they are doing event processing. He mentions two main issues:
  • Terminology: the term "event" is manifested in many different ways, while the community centered around the current products has a narrow interpretation.
  • Market: most event processing is custom code, or is built into vertical products, rather than general-purpose event processing platforms; thus he had to scale down his revenue predictions, which count only the event processing platform market.
Roy suggests that EPTS decouple itself from its focus on product-oriented functionality and concentrate on event processing as a technology in the broad sense.

I concur with Roy's observations (we discussed them last year when I visited him at the Gartner office). I have written before about the build-vs-buy trade-off, based on chapter 1 of the event processing manifesto, a chapter Roy co-authored. I have also noted in some of the feedback I got on my book that developers told me the book helped them a lot with architectural thinking and clarity of concepts, but they also said that they would continue to develop event processing applications in custom code.

Roy's blog post was written in the context of the discussions about EPTS and the online magazine, and I have expressed my opinion in the same spirit as Roy: the online magazine (and EPTS) should appeal to anybody doing anything with any kind of event in any form. This should capture all the "event processing in practice" in the universe. More news about the online magazine soon (the editorial board is now being formed, and it contains both veteran members of the community and new blood from among practitioners).




Thursday, December 16, 2010

Where does the edBPM term come from?

Writing earlier today about a German friend brings me to another German friend: Rainer von Ammon has written on the complexevents forum about the source of the term edBPM. Rainer is also the one who drew the nice illustration above showing the reference model; he is the person promoting this term, and he has organized several workshops around the concept of pairing event processing and BPM technologies. Rainer wrote that the term came from Gartner and quoted me. This is almost true: the original term Gartner used was "event-based BPM", and I slightly modified it to "event-driven BPM" when I asked Rainer to write an entry about it for the database encyclopedia (I was the editor of the event-processing-related terms).
Here is Gartner's original slide from 2005.




This is the original slide,  source:  "Event-driven applications make event-driven businesses work better", a presentation by Roy Schulte from Gartner in 2005. 


As you can see, Gartner classified the event processing functions into simple event processing, mediated event processing, event-based BPM, and complex event processing. Roy Schulte later realized that "event-based BPM" is a dimension orthogonal to the other three, and changed the positioning. So this is the source, for all history lovers.



Sunday, November 8, 2009

On the Event Processing book by Chandy and Schulte



Today I got a package of books from Amazon that included two new event-processing-related books; I'll review the first of them today. This is the book by Mani Chandy and Roy Schulte, called "Event Processing: Designing IT Systems for Agile Companies". The title itself (agile companies) indicates that the book is business-oriented, and indeed it primarily answers the questions of why to use event processing and how it relates to other concepts in enterprise architecture. The book is non-technical and fits the level of managers, CIOs, and business analysts.

The book starts with an overview and the business context of event processing, discusses business patterns of event processing (another type of pattern, besides all the other types of event processing patterns), and covers the costs and benefits of event-processing applications and the types of event processing applications. After the ROI part, it moves to a more architectural discussion, taking a top-down approach: EDA, events, and employing the architecture. Next there are two chapters positioning event processing against the rest of the universe: SOA, BPM, BAM, BI, and rule engines (I'll write about these positioning attempts in later postings). Towards the end there is a chapter of advice on how to handle event processing applications (this chapter reads like an analyst report). The last chapter talks about the future of event processing, again from a business perspective: future applications, barriers and dangers (again a topic to which I should dedicate a complete discussion), and drivers for adoption.

In conclusion: a good book for everybody who wants to know what event processing is and what its business value is. Things I thought such a book might also include: some reference to what currently exists in the industry, how the state of the practice relates to the theoretical concepts presented in the book, when COTS event processing should be used versus hard-coded solutions, and the practical considerations of event processing applications (maybe in the second edition?).

For those who asked me about the relationship between this book and the book Peter Niblett and I are writing, the answer is that our book has a totally different focus: it explains, step by step, what is needed to build event processing technology, giving the reader an opportunity to experience the various approaches in the state of the practice through free downloadable versions of various products and open source software. The target population is also different: we aim at designers, architects, developers, and CS students, while the book by Mani and Roy is aimed at managers, business analysts, and MBA students. The review of the second related book will follow later.

Wednesday, September 3, 2008

On event processing as a paradigm shift


Readers are probably familiar with this picture, which shifts between two black faces facing each other and a white vase. I came across a (relatively) new blogger in this area, Pern Walker, blogging for Oracle's "event driven architecture" blog. The title of the posting is "Event servers, a disruptive technology". It describes the components of the (former) BEA framework, nothing new here, but the interesting part is the conclusion: event processing COTS is a disruptive technology that displaces custom code in event processing, since it is more cost-effective.
This reminds me of a discussion we had in May 2007 at the Dagstuhl seminar on event processing. It was a night discussion over wine, led by Roy Schulte, and the question Roy posed to the participants was: "Will event processing (EDA) become a paradigm shift in the next few years or not?"
Today I don't intend to answer this question; instead I'll post part of the Dagstuhl discussion, which included observations about "paradigm shifts" (thanks to my colleague Peter Niblett, who documented the entire Dagstuhl seminar). I'll return to this topic, with my (and maybe others') opinions about the answer, after the EPTS event processing symposium.
Observations (from the Dagstuhl discussion):
  • Paradigm shifts can’t happen if there are too many barriers; have the entry barriers for "event processing" already been removed?
  • Paradigm shifts are more likely to happen when adopters decide they need a whole new avenue of applications; they are less likely to happen as a way of re-engineering existing systems. For example, the German population will reach a 1:2 old-to-young ratio by 2020, and this requires a paradigm shift in healthcare models. Can we identify new avenues of relevant applications?
  • Paradigm shifts usually happen as a result of some external change, not just because of innate strengths of the technology itself. Can we identify such external changes?
  • Standardization is not necessary for a paradigm shift, but good, appropriate standards (de facto or otherwise) certainly help

Another question is where, in essence, the "paradigm shift" lies: is it the decoupled "event-driven" paradigm? Is it "complex event processing", i.e., the ability to find patterns across multiple events? Or is it the entire processing framework, as Oracle's blog claims?
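To make the second of these senses concrete, the "pattern across multiple events" idea can be sketched in a few lines. This is a minimal illustration only; the event kinds, counts, and window sizes are invented for the example and are not taken from the Dagstuhl discussion or any product:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    kind: str         # e.g. "price_drop" -- a hypothetical event type
    timestamp: float  # seconds

class PatternDetector:
    """Fires when `count` events of one kind arrive within a sliding time window.

    A toy stand-in for what a CEP engine does declaratively."""
    def __init__(self, kind: str, count: int, window_seconds: float):
        self.kind = kind
        self.count = count
        self.window = window_seconds
        self.matches = deque()

    def on_event(self, event: Event) -> bool:
        if event.kind != self.kind:
            return False
        self.matches.append(event.timestamp)
        # Evict timestamps that have slid out of the window.
        while self.matches and event.timestamp - self.matches[0] > self.window:
            self.matches.popleft()
        return len(self.matches) >= self.count

detector = PatternDetector(kind="price_drop", count=3, window_seconds=60)
stream = [Event("price_drop", 0), Event("trade", 10),
          Event("price_drop", 20), Event("price_drop", 45)]
fired = [detector.on_event(e) for e in stream]   # the last event completes the pattern
```

No single event in the stream is interesting on its own; only the combination, within the window, carries meaning, and that is what distinguishes complex event processing from simple per-event handling.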

More - Later

Tuesday, November 20, 2007

"The only motivation to use EP COTS is to cope with high performance requirements" - true or false?



Somehow I find myself using my collection of analysts' presentations to help me make some points. This time I am showing a slide from Roy Schulte's presentation at the Gartner EPS summit; I'll return to the content of this slide shortly, but will start with a discussion I had yesterday about the reasons that enterprises use COTS for event processing. I have heard (and not for the first time) the assertion that the only reason one would want to use EP software rather than a hard-coded solution is the ability to cope with high throughput and low latency; in short, to deal with high performance requirements. If there are no high performance requirements, there are other solutions; e.g., the database folks think that in this case one can insert all events into a database and use simple SQL queries for CEP patterns, or just use good old C/Java programming for the purpose.

This is somewhat inconsistent with my own experience, where customers that did not have "high performance" requirements were eager to use CEP technologies. Indeed, high performance is a reason to use CEP COTS; however, as indicated in Roy Schulte's slide above, it is actually a minor one. According to Gartner, the high end is somewhere between 5 and 10 percent of the candidate applications, and in the prediction for 2012, use in the high end will be 8% out of 27% total use. Note also that Roy Schulte defines the high end as 250 events per second, which is really far from the usual notion of "high performance", so the numbers are even lower. It seems that the market for "non-high-performance CEP" is much larger and will grow faster. If that is so, where does the misconception that EP always equals high performance come from? I think there are two sources. The first is that the early adopters were from the capital markets industry, where some (not all!) of the candidate applications indeed have high performance characteristics. However, with the growth of the market and the use of EP software in other applications and industries, these types of applications, while continuing to grow, will not match the higher growth of non-high-performance applications. The other reason is that some vendors make high performance their main message, trying to get the market to believe that it is indeed the most important property.
So, if high performance is not the only reason to use EP COTS, what are the other reasons? This is a matter for investigation, but IMHO the main one is "high-level programming" and agility; in short, the ability to reduce the total cost of ownership (TCO).
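A hedged sketch of what "high-level programming" buys you: in a hard-coded solution, the pattern logic is buried in application code, while an EP platform lets it be stated declaratively, so changing the business rule is a configuration change rather than a rewrite. The rule format and event names below are my own invention for illustration and do not correspond to any vendor's language:

```python
# The pattern lives in data, not in code: a sequence of named events
# that must all occur, in order, within a time window.
rule = {
    "pattern": "sequence",
    "events": ["order_placed", "payment_failed", "payment_failed"],
    "within_seconds": 3600,
    "action": "alert_fraud_team",   # hypothetical downstream action
}

def matches(rule: dict, history: list[tuple[str, float]]) -> bool:
    """Check whether `history` (a list of (event_name, timestamp) pairs)
    contains the rule's event sequence, in order, inside the time window."""
    wanted = rule["events"]
    hits, idx = [], 0
    for name, ts in history:
        if idx < len(wanted) and name == wanted[idx]:
            hits.append(ts)
            idx += 1
    if idx < len(wanted):
        return False                 # sequence never completed
    return hits[-1] - hits[0] <= rule["within_seconds"]

history = [("order_placed", 0), ("login", 5),
           ("payment_failed", 100), ("payment_failed", 200)]
result = matches(rule, history)
```

The point of the sketch is not the ten-line interpreter but the division of labor: an analyst can adjust the event list or the window in the rule without touching the code, which is precisely the agility and TCO argument.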
I'll provide more insights about the TCO issue in a future post.