Friday, August 30, 2013

New market research on the event processing market by Markets&Markets

It seems that there is a new comprehensive market research report on the event processing market for the years 2013-2018 by Markets and Markets.  I don't have the report itself (it is quite expensive), but the site gives some details.  According to the report, Markets&Markets forecasts that the "CEP market" is expected to grow from $764.5 million in 2013 to $3,322.0 million in 2018.  I wonder what these figures represent; it seems that this is beyond the cumulative sales of event processing platforms. 

They also classify the market according to the following verticals: 

BFSI: algorithmic trading, electronic transaction monitoring, dynamic pre-trade analytics, data enrichment, fraud detection, and governance, risk and compliance (GRC); 

Transportation and logistics: asset management, predictive scheduling, and toll system management; 

Healthcare: self-service proactive monitoring and alerting, and governance, risk and compliance (GRC); 

Telecommunications: mobile billboards, revenue assurance, network infrastructure monitoring, and predictive CDR assessment; 

Retail: inventory optimization, shoplifting detection, and real-time marketing and customer engagement; 

Energy and utilities: oil and gas operation management, nuclear crisis management, and smart grid energy management; 

Manufacturing: shop floor automation, operational failure detection, infrastructure management, and supply chain optimization; 

Government, defense and aerospace: intelligence and security, emergency response services, and geo-fencing and geospatial analysis; 

Others: includes education and research

I hope to get more insight into this research. 


Friday, August 23, 2013

On concept computing - take one

We think in concepts.  We study concepts, and we reason about concepts.   
Now we also have "concept computing", a term coined by Mills Davis.  It does not appear in Wikipedia yet, but it is an interesting and useful idea.  Mills Davis recently uploaded his AAAI keynote talk to Slideshare, and the slides below are taken from there.   The work we are doing now is, in a way, a projection of this idea onto the event-driven world.  I'll write about it in the future.  Meanwhile -- this presentation is recommended.

Tuesday, August 20, 2013

Big data analytics will never replace creative thought


The claim expressed in the title of this posting is the title of a piece in "Data Quality News" by Richard Jones.   It argues that the "data craze" - the conception that data mining alone is sufficient for making decisions in all areas - is a misconception in some of them.  Jones provides two examples: marketing, where statistical reasoning gives great value but deals with the small details, while human creative thinking deals with the big picture, which data mining alone cannot capture; and healthcare, where, again, data mining can be of great value, but interaction with the patient and personal examination by a physician remain vital.    
I guess that research into AI should also deal with how to create artificial creative thinking.  As I've written before, Noam Chomsky has criticized the AI community for making statistical reasoning its mainstream and deserting the quest for a "solid model of the universe".  I guess that after some disillusionment with the "data craze", the industry will settle on giving data mining its right place, as a supporting technology.

More on this - later.

Thursday, August 15, 2013

On machine learning as a means for decision velocity

Chris Taylor has written a piece in the HBR Blog advocating the idea that machine learning should be used to handle the main issue of big data - decision velocity.  I have written recently on decision latency; according to some opinions, real-time analytics will be the next generation of what big data is about.
Chris' thesis is that the amount of data is increasing substantially with the Internet of Things, and thus one cannot make decisions manually by viewing all relevant data; there will also not be enough data scientists to look at the data.   Machine learning, which is goal oriented rather than oriented toward asserting hypotheses, will take this role.     I agree that machine learning will play a role in the solution, but here are some comments about the details:

Currently machine learning is an off-line technology, is case-specific, and cannot be the sole source for decisions.


It is an off-line technology: systems have to be trained, and typically they look at historical data in perspective and learn trends and patterns using statistical reasoning methods.  There are cases of applying continuous learning, which again is done mostly off-line but is incrementally updated on-line.    When a pattern is learned, it needs to be detected in real time on streaming data, and here a technology like event processing is quite useful, since what it does is indeed detect that predefined patterns occur on streaming data.  These predefined patterns can be obtained by machine learning.    The main challenge will be on-line learning -- when the patterns need to change, how fast can this be done with learning techniques?  There are some attempts at real-time machine learning (see the presentation about Tumra as an example), but it is not a mature technology yet.
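To make the division of labor concrete, here is a minimal sketch of the detection side: a predefined pattern (imagine it was produced off-line by machine learning) being matched against streaming data. The pattern itself ("three readings above a threshold within a five-event window") and all names are hypothetical, chosen only for illustration.

```python
from collections import deque

# Hypothetical pattern parameters, assumed to come from off-line learning:
# "at least 3 readings above 30.0 within a sliding window of 5 events".
THRESHOLD = 30.0
MATCHES_NEEDED = 3
WINDOW_SIZE = 5

def detect(stream):
    """Detect the predefined pattern over a stream of numeric readings.

    Returns the indices of the events at which the pattern was detected.
    """
    window = deque(maxlen=WINDOW_SIZE)  # sliding window over the stream
    alerts = []
    for i, reading in enumerate(stream):
        window.append(reading)
        if sum(1 for r in window if r > THRESHOLD) >= MATCHES_NEEDED:
            alerts.append(i)   # pattern occurred, ending at event i
            window.clear()     # restart detection after a match
    return alerts
```

The point of the sketch is the separation of concerns: the learning step produces the pattern parameters off-line, while the event processing step evaluates them incrementally, one event at a time, without revisiting history.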

Case-specific means that there is no one-size-fits-all solution for machine learning; for each case the models have to be established in a way that is very specific to that case.  Thus, the shortage of data scientists will be replaced by a shortage of statisticians - there are not enough skilled people around to build all these systems, so the state of the art needs to be improved to make the machine learning process itself more automated.

Last but not least - I have written before that making decisions merely based on history is like driving a car by looking at the rear-view mirror.  Conclusions from historical knowledge should be combined with human knowledge and experience, sometimes over incomplete or uncertain information.  Thus, besides the patterns discovered by machine learning, a human expert may also insert additional patterns that should be considered, or modify the patterns introduced by machine learning.
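One simple way to picture this combination is a pattern set in which expert-supplied definitions override or extend the machine-learned ones. This is only an illustrative sketch; the pattern names and fields are hypothetical, not taken from any particular product.

```python
# Patterns assumed to be discovered off-line by machine learning.
learned_patterns = {
    "high_temp":  {"threshold": 30.0, "source": "machine-learned"},
    "rapid_rise": {"delta": 5.0, "source": "machine-learned"},
}

# Patterns inserted or adjusted by a human expert.
expert_patterns = {
    # The expert tightens a learned threshold based on domain experience...
    "high_temp": {"threshold": 28.0, "source": "expert"},
    # ...and adds a pattern that historical data alone could not reveal.
    "sensor_dropout": {"max_gap_sec": 60, "source": "expert"},
}

# Expert knowledge takes precedence over history-derived patterns:
# merging right-over-left keeps expert overrides and additions.
active_patterns = {**learned_patterns, **expert_patterns}
```

The merged set is what the detection engine would actually evaluate, so the expert's corrections and additions sit alongside, and when needed in front of, what was learned from history.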




Tuesday, August 13, 2013

On event-driven, request-driven, stateful and stateless



This slide is taken from our DEBS 2013 tutorial, explaining the differences in thinking between the traditional request-driven way and the event-driven way.   It shows the differences by answering three questions.     This goes back to the differences between business rules and event processing, an old topic about which I first wrote around six years ago!    One of the claims I've heard several times is that the distinction between them is that business rules are stateless and event processing is stateful.     I think that the main difference is that business rules are treated as request-driven - the rule is activated on request and provides a response - while event-driven logic is driven by event occurrence, as shown in the slide above.

While it is true that there is correlation between event based/request based and stateful/stateless,  these are really orthogonal issues.

Event-driven logic can be stateless.  If we only wish to filter an event and trigger some action,  this can be stateless (most filters are indeed stateless), but it has all the characteristics of event-driven, including the fact that if the event is filtered out - no response is given.   

On the other hand -- request-driven logic may be stateful; there are many instances of session-oriented and other stateful request-response protocols.    One can also implement a stateful rule engine in a request-response way, where the invocation of a rule is based on the results of previous rules that are retained by the system.  

Bottom line:  stateful vs. stateless is not equivalent to event-driven vs. request-driven.   
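The two axes can be sketched side by side in a few lines of code. This is a minimal illustration with hypothetical names, not a reference to any particular engine: a stateless event-driven filter on one side, and a stateful request-driven handler on the other.

```python
# Event-driven AND stateless: a filter that triggers an action only when an
# event passes the predicate. A filtered-out event produces no response at
# all, which is characteristic of event-driven logic.
def on_event(event, action):
    if event.get("priority") == "high":  # stateless predicate, no memory
        action(event)                    # no else branch: silence otherwise

# Request-driven AND stateful: every call is a request that returns a
# response, but the outcome depends on state retained between requests
# (here, a simple per-session request counter).
class SessionCounter:
    def __init__(self, limit):
        self.count = 0        # state retained across requests
        self.limit = limit

    def handle_request(self):
        self.count += 1
        return "allowed" if self.count <= self.limit else "rejected"
```

The filter never answers a caller, yet it needs no state; the counter answers every caller, yet it is stateful. That is the orthogonality in miniature.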


Tuesday, August 6, 2013

On the role of chief evangelist

I thought this was an interesting title and wondered what is behind it. Today Theo Priestley published a blog post with his interpretation of the term.  According to Priestley, the three main points of this role are (in my own interpretation): telling the story, connecting the dots, and forming influential opinions. 
There have been some posts before on technology evangelism; for example, the one from which I copied the above picture says that an "evangelist is born to learn, speak, share, sell and inspire the masses towards the technology or product they are passionate about".

I have always believed that it is important for a person to work on things they are passionate about. Early in my career I took a management course in which the instructor said: "if you are not getting up every morning enthusiastic about what you are doing - you are not in the right place, do something else!".  
I realized that this assertion was true for me, and I went to do something else (study for a Ph.D.) - true story!

Some people have used the term about me, claiming that I am evangelizing "event processing" (although, unlike the official evangelists, nobody ever paid me to do it). While I find it difficult to identify with a term associated with religion, I think that I have indeed been preaching event processing, in various aspects, for years and in different ways (of course, I am not the only person doing it).  

Actually, on the last slide of the tutorial that Jeff Adkins and I delivered at DEBS 2013, we put the following statement:

BTW - Jeff has on his business card the title "connecting the dots", which is kind of part of the evangelist role according to Priestley.    I guess that this is different from the role of a chief evangelist at the software vendor level, who looks across the vendor's portfolio.  I generally believe that people with bright eyes are happier and work better - and that people who can inspire others to brighten their eyes are a great asset.

Saturday, August 3, 2013

New name and face-lift for David Luckham's site: complex event processing & real time intelligence


In this picture, taken seven years ago, you can see me (somewhat heavier than I am today) with David Luckham (in the middle) and Roy Schulte (on the right-hand side).  I talked with David Luckham recently and found out that he is still maintaining his website, and is now also giving it a face-lift.   On the website he also explains why he has changed the name.   According to Luckham, the name "complex event processing" is strongly identified with event processing platforms - dedicated software for doing event processing - but this is a tiny fraction of the market; the bigger market is event processing embedded inside other software.  This observation has been made before by various people (including myself), so I believe he is right. Luckham calls the bigger game "real time intelligence"; some people call it "real time analytics", but there is agreement that it is part of the big data game (and other games) and that event processing is its backbone.   It will be interesting to see whether such branding will catch on, and how exactly "real time intelligence/analytics" will be defined -- I'll write more about it.