Wednesday, June 1, 2011
Sapienza Università di Roma, which hosted DEBS 2008 during a hot summer week, is going to host another event-processing-related workshop, attached to DISC 2011 (the International Symposium on Distributed Computing).
The focus is theoretical aspects of event-based systems.
Today, while trying to update a presentation, I found out that PowerPoint on my laptop had somehow become corrupted. I pinged our friendly system guys, who took my laptop for 2.5 hours and returned it, after reinstalling MS-Office, in worse shape (now it does not recognize Excel, and it also has problems in Word). So tomorrow morning, on the way to the dentist, I'll drop the laptop off again with the friendly system guys, hoping they'll do better this time. In some of the spare time I had between meetings, I realized that everything I do somehow involves this laptop, so I decided to catch up on a book I started to read a long time ago and never finished: Clayton Christensen's book on the innovator's dilemma. This is a classic book, published in 1997, that somebody recommended to me several years ago.
One of my favorite topics is disruptive technologies. Christensen provides several reasons why it is difficult for big companies to create disruptive technologies, even though they are the ones most damaged by them.
Some of the reasons are:
- Companies depend on customers for resources: disruptive technologies are not developed as part of customer requirements (this reminds me that somebody in IBM said, 6-7 years ago, that event processing is a hallucination of research folks).
- Small markets don't solve the growth needs of large companies: since investors and analysts pressure each company to grow steadily, disruptive technologies are not attractive to the organization from short-term growth considerations, as they do not grow it in the short run.
- Markets that don't exist cannot be analyzed: when investment decisions require analyst support and hard figures, such data may simply not exist for a market that does not yet exist.
- An organization's capabilities define its disabilities: processes that fit "business as usual" work may not be appropriate for developing disruptive technologies.
- Technology supply may not equal market demand: companies often don't identify the gaps in their own products, or try only to improve them incrementally, thus exposing themselves to disruptive technologies that close these gaps.
These five principles are the introduction; the rest of the book deals with methodologies that enable organizations to create disruptive technologies.
Tuesday, May 31, 2011
With the help of the DBWORLD mailing list, I have learned about a planned workshop at ISWC'11 (the International Semantic Web Conference) that deals with detection, representation, and exploitation of events in the Semantic Web. I think that having an event-processing-related workshop in conjunction with other communities is a good idea; there is one that runs alongside the BPM conference, and there have been workshops, tutorials, and sessions within various other conferences in the past as well.
This specific workshop posed several questions for the submitters, and intends to hold a "data" challenge: given a dataset of events, show extensions, demos, or interesting applications. The events will be in the areas of music and entertainment.
Again, a good idea; I hope it will be a successful event, and I am giving it some "free publicity" here.
There is one concern, and this is a repeating phenomenon in the academic community: people do things that were done before, in a slightly different manner, within another community, sometimes using different terminology, because the two communities are not communicating or at least one of them is not aware of the other. Looking at the list of organizers, none of them looked familiar to me. This is not surprising, since I have never dealt with semantic web issues; if they are organizing such a workshop, they probably do something in this area that I am not familiar with, so maybe I am missing something. On the other hand, while they mention that there is work around event modeling in other communities, such as information retrieval or multimedia, they might be unaware that there is an active event processing research community, and they might also be missing something that was done before...
Monday, May 30, 2011
Titles and abstracts of the four keynote speakers at DEBS 2011 are now available; they will be posted on the DEBS website a little later, after the DEBS webmaster returns from vacation. All of them seem to be interesting talks.
Speaker: Chris Bird.
Chris is the solution architect for travel and leisure at Progress Software; until recently he was the Chief Architect of Sabre Airline Solutions.
Title: Avoiding he said/she said arguments in distributed event handling systems
Abstract: Processing critical event stream models, whether they be in chemical plants, in power stations, in process environments, in aircraft operations, all have one thing in common. You want to know that the information got to where it was supposed to go, and you want to know that proper action was taken. Innocent sounding phrases, but with surprising complexity. Expand that thinking to conceptual business events and the complexity multiplies. In process control environments, for example, the payloads are typically small (10s to 100s of bytes), often in the form “Device id, Timestamp, Value1[,Valuen]”, so transmitting that data over a network, e.g. SCADA, can be quite network efficient. However, Business Events tend to be less frequent and will often have significant associated data destined for several “downstream” handlers. Taking an example from the airline industry, the event of rescheduling a flight has significant repercussions in Passenger Bookings, Airport Slot Management, Revenue Management, Equipment Scheduling, Catering, Fuel Fleet Scheduling, Crew Scheduling, together with knock on effects to other parts of the airline’s schedule as equipment is anticipated not to be available when planned. Add to that the extra complexity of having many other “businesses” involved. A late incoming flight may have effects on car rentals, hotel reservations, dinner reservations, etc. So the single event (a reschedule) has enormous ramifications. Along with these business ramifications, there are also legal ramifications. When a charge is to be disputed (e.g. a hotel charging the first night fee because of a traveler no show), it is extremely important to know whether the hotel was notified prior to the cutoff time. The session introduces the ideas behind decoupling and situational awareness, so that efficient event handling takes place, together with appropriate situational awareness to enable policy decisions to be enacted.
Speaker: Don Ferguson
Don is Chief Technology Officer of CA.
Abstract: IT system and application management is critical to business use of IT systems. Distributed event processing is core to application and systems management, even for applications that are not "event driven." Emerging technology like virtualization and cloud computing significantly increase the central role of distributed event processing. IT systems and applications management introduces major challenges and requirements not typically seen in application centric event processing. This presentation provides an overview of IT system and application management use of distributed event processing, and the evolution for cloud computing. The presentation then provides an overview of current solutions and technology addressing the requirements. Finally, there will be a discussion of open issues and research challenges.
Speaker: Johannes Gehrke
Johannes is a professor of computer science at Cornell University.
Title: Declarative Data-Driven Coordination
Abstract: There are many applications that require users to coordinate and communicate. Friends want to coordinate travel plans, students want to jointly enroll in the same set of courses, and busy professionals want to coordinate their schedules. These tasks are difficult to program using existing abstractions provided by database systems because in addition to the traditional ACID properties provided by the system they all require some type of coordination between users. This is fundamentally incompatible with isolation in the classical ACID properties of transactions.
In this talk, I will argue that it is time for the database and event processing communities to look beyond isolation towards principled and elegant abstractions that allow for communication and coordination between some notion of (suitably generalized) transactions. This new area of declarative data-driven coordination (D3C) is motivated by many novel applications and is full of challenging research problems. I will start by surveying existing abstractions in database systems and explain why they are insufficient for D3C. I will then describe entangled queries, a coordination language that extends SQL by constraints that allow for the coordinated choice of result tuples across queries originating from different users or applications, and I will discuss algorithms for evaluating entangled queries. I will conclude with a set of research challenges for event processing in this new area.
Speaker: Calton Pu
Calton is Professor and John P. Imlay, Jr. Chair in Software in the College of Computing, Georgia Institute of Technology.
Title: A World of Opportunities: CPS, IOT, and Beyond
Abstract: The continuous evolution of computing and networking technologies (e.g., Moore’s Law) is creating a new world populated by many sensors on physical and social environments. This emerging new world goes much further than the original visions of ubiquitous computing and World Wide Web. Aspects of this new world have received various names such as Cyber Physical Systems (CPS) and Internet of Things (IOT). CPS links many physical sensor data to detailed simulation models running on large data centers. IOT brings together many appliances, making much more environmental data available and supporting control of these appliances. CPS/IOT applications are many, including personalized healthcare, intelligent transportation, smart grid, sustainable environment, and disaster recovery as representative examples. These CPS/IOT applications are motivated and strongly pushed by significant new social, economic, and human benefits. At the same time, these applications are also mission-critical with serious quality of service requirements such as real-time performance, continuous availability, high security and privacy. We will argue that the traditional process-oriented programming languages and software architectures should be augmented by distributed event-based facilities and abstractions for the construction of large scale distributed CPS/IOT applications. In addition to the focus on performance, we anticipate that other quality of service dimensions such as availability, reliability, security, and privacy will become important concerns in the research on distributed event-based systems. We will discuss research opportunities and challenges that bring the distributed event based systems technology to CPS/IOT applications.
Sunday, May 29, 2011
There are several works that attempt to use social networks to detect human sentiments. In a recent article, John Bates (Progress Software CTO and one of the founding fathers of the event processing discipline and community) reacts to claims that, using Twitter, one can predict trends in the stock market with high accuracy. While the article deals with stock market applications, I'll give some comments within the broader context of detecting human sentiments in any domain. Below I paraphrase John's claims and add my own comments to each.
John states three problems with that assumption:
- Social networks are not secure, thus the results may not be accurate, and may even be malicious or just wrong. His example: somebody tweets that the USA started a war on France. If somebody uses this to make stock investments, or even travel plans, the results may not be that good. I agree that this is a major issue.
- Social networks are unlikely to contribute anything that is not already disclosed in mainstream news. I am not sure that I totally agree with this one; sometimes individuals on Twitter or Facebook reveal news before the mainstream media detects it, as we saw in some recent examples all over the world.
- The time it takes to process the sentiments may make the results obsolete, since reality may move faster. I guess that might be true in some cases, but with real-time analytics, the results may be timely enough in other cases.
Overall, the first problem stated is the most serious one. Perhaps once there is some validation mechanism for social network postings, they can be more reliable. One may claim that if we can assume that most posts on social networks are correct, then a noise-removal filter is possible. But still, social networks may not be reliable sources (BTW, mainstream media news may not be accurate either!).
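Taking the "most posts are correct" assumption literally, one simple form such a noise-removal filter could take is requiring independent corroboration before trusting a claim. Here is a minimal sketch; the `filter_claims` function, its threshold, and the sample posts are my own illustration, not anything from John's article:

```python
def filter_claims(posts, min_sources=3):
    """Treat a claim as credible only if it is reported by at least
    `min_sources` distinct authors; single-source claims are kept out
    as unverified noise."""
    authors_per_claim = {}
    for author, claim in posts:
        # Collect the set of distinct authors backing each claim.
        authors_per_claim.setdefault(claim, set()).add(author)
    return {claim for claim, authors in authors_per_claim.items()
            if len(authors) >= min_sources}

posts = [
    ("alice",   "flight AB123 delayed"),
    ("bob",     "flight AB123 delayed"),
    ("carol",   "flight AB123 delayed"),
    ("mallory", "USA started a war on France"),  # single, unverified source
]
print(filter_claims(posts))  # -> {'flight AB123 delayed'}
```

Of course, this only filters out lone false posts; a coordinated group of accounts repeating the same falsehood would still pass, which is exactly why the reliability concern remains.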
It should be noted that I learned about John's article because he tweeted about it on Twitter. So maybe the fact that he wrote this article is itself not accurate? More on this later.