Tuesday, January 21, 2014

Some simplification goals in the design of the event model

I have written in this blog about our work on "The Event Model", which is based on the search for simplification in event-based modeling.   Here are some of the simplification goals that we strive to achieve while designing the model.   These are high-level goals.

1. Stick to the basics by eliminating technical details.    Looking at designs and implementations of event-driven applications, one can observe that there are two types of logic: the business logic, which directly states how derived events are generated and how the values of their attributes are assigned, and supporting logic that is intended to enrich events or query databases as part of the processing.
2. Employ top-down, goal-oriented design.    Many design tools require logical completeness (such as referential integrity) at all times.  This entails the need to build the model in a bottom-up fashion, namely, all the meta-data elements (events, attributes, data elements) are required to be defined prior to referring to them in the logic definition.   Our second simplification design goal is to support top-down design and allow temporary inconsistency, working in a "forgive" mode in which some details may be completed at a later phase.  This design goal complements the "stick to the basics" goal by concentrating on the business logic first and completing the data aspects later.
3. Reduce the quantity of logic artifacts.  In a typical event processing application, there may be multiple logic artifacts (event processing agents, queries, or processing elements, depending on the programming model) that stand for the different circumstances in which a single derived event is derived.  Our design goal is to have a single logic artifact for every derived event that accumulates all circumstances in which this derived event is generated.  This goal reduces the number of logic artifacts and makes it bounded by the quantity of derived events.  It also eases the verifiability of the system, since possible logical contradictions are resolved by the semantics of this single logic artifact.
4. Use fact types as first class citizens in the model.  In many models, terms in the user's terminology are modeled as attributes that are subordinate to entities or relationships.  In some cases it is more intuitive to view these concepts as "fact types" and make them first class citizens of the model, where the entity or event they are associated with is secondary (and may be a matter of implementation decisions).  This is again consistent with the "stick to the basics" goal.
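To illustrate the third goal, here is a minimal sketch in Python. This is not the actual notation of The Event Model; the `DerivedEvent` class, the cold-chain-style thresholds, and the first-match semantics are all my own assumptions, chosen only to show one logic artifact accumulating every circumstance under which a single derived event is generated:

```python
class DerivedEvent:
    """A single logic artifact for one derived event: it accumulates
    all (condition, derivation) pairs under which the event is generated."""

    def __init__(self, name):
        self.name = name
        self.circumstances = []  # ordered list of (predicate, assignment)

    def when(self, predicate, assign):
        """Register one circumstance and its attribute assignments."""
        self.circumstances.append((predicate, assign))
        return self

    def evaluate(self, situation):
        # Because all circumstances live in one artifact, potential
        # contradictions are resolved by its semantics -- here, first match wins.
        for predicate, assign in self.circumstances:
            if predicate(situation):
                return {"event": self.name, **assign(situation)}
        return None  # no circumstance applies; the event is not derived


# All circumstances for one derived event, gathered in one place
# instead of being scattered across several agents or queries:
shipment_alert = (
    DerivedEvent("ShipmentAlert")
    .when(lambda s: s["temp"] > 8, lambda s: {"reason": "too warm", "temp": s["temp"]})
    .when(lambda s: s["temp"] < 2, lambda s: {"reason": "too cold", "temp": s["temp"]})
)

print(shipment_alert.evaluate({"temp": 9}))
# -> {'event': 'ShipmentAlert', 'reason': 'too warm', 'temp': 9}
```

The point of the sketch is structural: the number of such artifacts is bounded by the number of derived events, and a reader checking for contradictions needs to inspect only one place per derived event.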

These goals are high level.  I'll write more details in the future about the ways we chose to satisfy each of these goals, and discuss alternatives for doing that.  I guess that over time we'll accumulate more simplification goals. 

Sunday, January 12, 2014

On the future of wearable technology


With the start of 2014, we are in the era of predictions on "the future of X".  I came across a report and presentation on the future of wearable technology by PSFK Labs and Intel. 
It partitions wearable technologies into three types:

Connected intimacy: person-to-person - tracking devices worn by persons on other persons, baby monitoring, a hug-simulation jacket (where a parent can simulate a hug for a child remotely), and more.

Tailored ecosystem: person-to-computer - such as a bracelet that sends hot or cold thermoelectric pulses.

Co-evolved possibilities: person-as-computer - a heartbeat signature as a password, operating gadgets by blinking using smart make-up, and computers embedded inside persons.

It will take several years to make them mainstream, but wearable technologies are the sensors and actuators of the future. 


Friday, January 10, 2014

Kurzweil's predictions about how the world will change

Thanks to Rainer von Ammon, I came across a blog post that presents Ray Kurzweil's predictions for the future.   Some highlights:

In 2017 we'll have self-driving cars
In 2020 we'll print our designer clothes at home
In 2033 we'll get all of our energy from the sun
In 2040 those of us who are still alive will stay young forever (Kurzweil's ultimate goal).

Some of us may be able to watch whether all these predictions come true... 

Tuesday, December 31, 2013

Summary of 2013 as reflected in this Blog

2013 is phasing out, and this is a time to do a short summary via this blog.

This year I have been abroad several times.  Two of the trips were vacations - a long one in New Zealand (pictures on Facebook) and a very short one in Paris.  I have also been twice in the USA (January and July), and attended DEBS 2013 in Arlington, Texas; once in Luxembourg (for the negotiation of two EU projects), once in Brussels (as a reviewer for another EU project), once in Barcelona (for the ACM Multimedia conference), and once in Hong Kong (for ER 2013).

The main activity in 2013 was around the event model: I explained some of the background early in the year, presented it first at ER 2013, and towards the end of the year we also produced a YouTube video clip.
We achieved great progress on this front, and will see which shape and direction it takes. 

The most read post on this blog this year was the post comparing S4 and Storm.  Some other well-read posts were: causality vs. correlation, web services triggered by SAP ESP, "Is philosophy dead?" (an "off topic" post, but I do have an academic degree in philosophy on my record...), and "Event Model - what comes first, the logic model or data model?"

What is coming for me in 2014?  --   stay tuned!

One more thing:   This year I 'gained' a Wikipedia entry about me (some revisions are needed, but according to Wikipedia rules, I am not allowed to update it myself).

Happy New Year. 


Wednesday, December 25, 2013

58 sensor applications


I came across a site that lists the "top 50 sensor applications for the smarter world".  It actually lists 58 applications, partitioned into the following areas:  smart cities, smart environment, smart water, smart metering, security & emergencies, retail, logistics, industrial control, smart agriculture, smart animal farming, domestic & home automation, and eHealth.

It is worth digging into the different areas to check the potential applications, and the role of event processing in each of them. 

Friday, December 20, 2013

On reversing the roles - the kid got it!

Earlier this week I spent two hours with a group of high school students who were selected to be part of the "President of Israel program to discover and cultivate the inventors and scientists of the future".   The IBM Haifa Research Lab took part in this program by conducting a sequence of sessions, each with one of our local scientists (I was the last in the sequence).
It was a very interactive session, and as part of it I described four scenarios of event processing in different areas (typical examples I use in my talks): the car theft example, the intensive care unit scenario, the never-lost luggage scenario, and the cold chain scenario that we use in our recent TEM video clip.
I asked them what they thought the common denominator among all these scenarios was -- they said many right things, but one kid said the most important thing:   "in these scenarios the roles are reversed; instead of the usual way, where the person tells the computer what to do, here the computer tells the person what to do".
This kid will definitely have a bright future... 

Gartner's recent predictions about business intelligence and analytics



A recent set of predictions by Gartner states that Business Intelligence and Analytics will remain a top focus for CIOs through 2017. 

It mentions two interesting observations.   One is that the confusion in the market about the term "big data" and its tangible results constrains spending and limits the growth of BI and analytics software.   The second is that by 2017, more than 50 percent of analytics implementations will make use of event data streams generated from instrumented machines, applications, and/or individuals.

This is consistent with Gartner's term "two-tier analytics", where event processing is the second tier, after historical data analytics.  While the need to consolidate analytics and event processing is becoming more pervasive, the utilization barrier and the need to battle complexity are still a common denominator.