The Engine for Event-Driven SOA


Don't miss the first ever virtual event on what promises to be the next leading edge of technology at ebizQ's Event Processing virtual conference. Sign up here!
Over the past few years, the IT community has recognized the benefits of Service-Oriented Architectures, wholeheartedly embracing the move away from monolithic applications to applications built from systems of loosely coupled services. Now we're on the verge of a new era for SOA, with the integration of Complex Event Processing (CEP) technology.

Complex event processing takes SOA to a new level through the introduction of decoupled services - a significant step beyond loosely coupled services. CEP enables the gathering of data from and about any services running in the enterprise. It also allows business logic to be applied to that data in order to derive insight and enable appropriate real-time response to changing conditions. In the context of SOA, the power of the event-driven model is that it allows greater flexibility since services are entirely decoupled, unaware of who is producing the events they operate on or consuming the events they produce. It also allows for better insight into current conditions and the ability to instantly respond as events occur.



So how does this work?

In a "traditional" service-oriented architecture based on the request-response paradigm, there are many distributed components, most of which provide a service to other components. These services sit on "stand-by," waiting for a service request that carries the data to be processed. The service processes the data and returns the result to the requesting application. This approach has proven far superior to monolithic applications for most business needs because it enables both reuse and business agility: an individual service can be used by multiple applications, and a service can be modified without affecting any other service.
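The coupling this paragraph describes can be made concrete with a minimal sketch. The class and method names below (InventoryService, check_stock) are invented for illustration; the point is simply that the caller holds a direct reference to the service and must know its interface before anything can happen.

```python
# Minimal request-response sketch: nothing happens until an application
# explicitly invokes a service, and the application must know which
# service to call and how. All names here are illustrative.

class InventoryService:
    def __init__(self):
        self._stock = {"widget": 5}

    def check_stock(self, sku: str) -> int:
        # The service stands by, processes the request, returns a result.
        return self._stock.get(sku, 0)

class OrderApplication:
    def __init__(self, inventory: InventoryService):
        # Tight knowledge coupling: a direct reference to the service
        # and its interface is wired in at construction time.
        self.inventory = inventory

    def can_fulfill(self, sku: str, qty: int) -> bool:
        return self.inventory.check_stock(sku) >= qty

app = OrderApplication(InventoryService())
print(app.can_fulfill("widget", 3))   # True
```

Reuse is real (any application can hold a reference to InventoryService), but so is the coupling: adding a new consumer of stock information means wiring another direct reference.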

For all its strengths, this model still has significant limitations. It requires applications to know what services are available and how to interact with them, and nothing happens unless or until an application invokes a service request. Under the request-response paradigm, every service must know which other services need to be informed about what has taken place. One implication is that adding a new function requires modifying existing services.

Enter the "event" in complex event processing. An event is something that happens: a transaction, a system occurrence, a stock trade, a request for a web page - the possibilities in today's enterprise are endless. Each event generates a message which, in an event-driven architecture, is made available to all relevant applications. With events, a service does not need to know which service is interested in what it has done - only that some service might be. To that end, it emits an event. Any other service that is interested can subscribe to the relevant event streams. This is the nature of a decoupled system: an emitter of event data does not need to know who the recipients are. Events are produced without knowing what significance they have, and they are then consumed by whatever services need to know about them.
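A minimal publish/subscribe sketch shows the decoupling described above. The EventBus class and the topic names are assumptions for illustration, not any particular messaging product: the producer's only obligation is to emit, and zero, one, or many consumers may react.

```python
from collections import defaultdict

# Minimal pub/sub sketch: the emitter publishes to a topic without
# knowing who (if anyone) is listening; subscribers register interest
# in topics, not in other services. All names are illustrative.

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver to every current subscriber; the producer is unaware
        # of how many there are, including none at all.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("trade", received.append)              # one interested service
bus.publish("trade", {"symbol": "XYZ", "qty": 100})  # delivered to subscriber
bus.publish("pageview", {"url": "/home"})            # no subscribers: no error
print(received)   # [{'symbol': 'XYZ', 'qty': 100}]
```

Adding a new consumer is now one subscribe call; the producing service is never touched - the property the article calls "decoupled" rather than merely "loosely coupled."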

Admittedly, decoupling in the form of pub/sub messaging has been around for a long time. What has been missing, however, are tools to analyze the event data in real time, applying business logic to monitor and respond to changing conditions. This is the function of CEP technology. Processing an individual event in isolation is a routine task - no different from traditional transaction processing, albeit on an asynchronous basis. But what about the need to analyze incoming events in the context of other events and past events - spotting trends or patterns, or reacting to aggregate information across a group of events? This is where CEP adds intelligence to an event-driven architecture.



One common misconception, however, is that an event-driven architecture is an alternative to SOA - that one must be chosen over the other. The reality is that they are complementary, thus the term "event-driven SOA." Clearly, it would not make sense to implement all application interaction using an event paradigm - there are many transactions that require the request-response paradigm. But for all those that don't, the advantages of the event paradigm are clear, and event processors can be deployed within a service-oriented architecture as an overlay rather than a replacement. In fact, if existing services are already producing event information, event processors can be deployed non-intrusively, without requiring any modifications to existing system components.

Within a service-oriented architecture, complex event processing may play a number of roles:

  • Gather information about individual services; normalize, filter, and aggregate the information; and deliver it to other services that only require summary data.
  • Monitor services to detect opportunities or threats (i.e., situations that require a response), then automatically initiate the response or send an alert that one is required.
  • Synchronize distributed services, ensuring that changes in the state of one service are communicated as needed to other services.
  • Capture event data across a collection of services, recording the event history and/or providing a single point for determining the current state of the system.
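The first role above - normalize, filter, and aggregate raw event data so downstream services receive only summary information - can be sketched in a few lines. The field names and the aggregation rule are assumptions for illustration only.

```python
# Illustrative sketch of the first role listed above: raw per-service
# events are filtered and rolled up so that services which only need
# summary data never see the raw stream. Field names are assumptions.

def summarize(events, service):
    # Filter to one service's events, then aggregate to a single summary.
    relevant = [e for e in events if e["service"] == service]
    errors = sum(1 for e in relevant if e["status"] == "error")
    return {"service": service, "count": len(relevant), "errors": errors}

raw = [
    {"service": "payments", "status": "ok"},
    {"service": "payments", "status": "error"},
    {"service": "billing",  "status": "ok"},
]
print(summarize(raw, "payments"))   # {'service': 'payments', 'count': 2, 'errors': 1}
```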

In the end it is not about making a choice between SOA and EDA. The advantages of a service-oriented architecture have become clear over the past few years. However, as today's businesses grapple with the need to react faster to stay ahead of the competition, and at the same time struggle to aggregate, analyze, and act on an avalanche of data, the ability to add real-time automation and intelligence to a service-oriented architecture can deliver significant value. CEP puts it within reach.



More by Jeff Wooten

About Aleri

Aleri is the leading provider of enterprise-class complex event processing technology for financial institutions and beyond. Aleri's Streaming Platform is backed by the company's deep background and knowledge gained over 20 years of supporting mission-critical banking applications for the world's largest banks and close to 10 years of pioneering research in the field of event processing. The Aleri Platform was designed from the ground up to provide the most robust architecture available for the rapid implementation of mission-critical applications within the most demanding environments. Built for high throughput with minimal latency, Aleri's event processing technology allows customers to analyze and respond instantly to high-volume, high-speed data to minimize risk and increase competitive advantage. Aleri was the first to develop and deploy a commercial enterprise-class application built on event processing technology, the Aleri Liquidity Management System, which is used by some of the largest global bank treasuries in the world. Aleri is a global company headquartered in Chicago with offices in New Jersey, London, and Paris. For more information, visit www.aleri.com.