Brian Connell has asked me to provide an example in which Complex Event Processing can be used when low latency is not required and it is still better than other solutions.
I have already referred to this issue in the past. There are several concrete examples that come to mind; here is one (a real system):
An enterprise has internal regulations for handling suppliers. These regulations concern interactions with suppliers that can be represented as events, and they involve actions that must be done (or must not be done) within a certain amount of time, or until some event happens. The application monitors compliance with these internal regulations in audit mode, meaning the results are alerts (which are themselves treated as events, since there are time-outs on alert handling) and not direct interference in the business processes.
The throughput is far below what is considered "high throughput": several thousand events per day. The latency requirement is also not extreme; if the alert is issued within a minute or two, that is still very fast auditing. It also does not require any analytics or intelligent procedures, since the regulations are given and deterministic.
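To make the pattern concrete, here is a minimal sketch of one such deadline rule. The event names, the supplier identifiers, and the 2-day deadline are all hypothetical illustrations, not taken from the actual system: after an "order_sent" event, a "confirmation" event must arrive within the deadline, otherwise the audit emits an alert event.

```python
from datetime import datetime, timedelta

# Hypothetical regulation: a supplier must confirm an order within 2 days,
# otherwise the audit emits an alert event (names and deadline are illustrative).
DEADLINE = timedelta(days=2)

def audit(events):
    """events: time-ordered list of (timestamp, type, supplier_id).
    Returns the alert events the audit would emit."""
    pending = {}   # supplier_id -> time the order was sent
    alerts = []
    for ts, etype, supplier in events:
        # Before handling the new event, fire alerts for expired deadlines.
        for sup, sent in list(pending.items()):
            if ts - sent > DEADLINE:
                alerts.append((sent + DEADLINE, "alert", sup))
                del pending[sup]
        if etype == "order_sent":
            pending[supplier] = ts
        elif etype == "confirmation":
            pending.pop(supplier, None)   # deadline met, rule discharged
    return alerts

t0 = datetime(2024, 1, 1)
log = [
    (t0, "order_sent", "acme"),
    (t0 + timedelta(hours=30), "confirmation", "acme"),  # in time: no alert
    (t0 + timedelta(days=1), "order_sent", "globex"),
    (t0 + timedelta(days=5), "order_sent", "acme"),      # globex now overdue
]
print(audit(log))   # one alert, for "globex"
```

Even this toy version shows the flavor of the rules: state per supplier, time windows, and alerts that are themselves events. A real EP engine expresses this declaratively instead of in hand-written loops.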
What are the benefits for the customer in using EP software rather than any other solution?
First - some of the regulations are fairly complex, and hard-coding them can be time- and resource-consuming. Putting everything in a database and using SQL was also considered, but some of the regulations are not easy to express in SQL.
Second - regulations tend to change frequently, the users wish to control these changes, and pushing the changes through the slow IT development cycle would delay their introduction.
So in this case the customer's motivations are agility and de-complexity; more later.