The Future of SOA

A month ago, I wrote about the need for Service Oriented Architecture (SOA) to be based on standards if it is to fulfil its promise, and for a debate on what those standards should be. Since then, at The Open Group conference on SOA in Houston, Texas (presentations are available at http://opengroup.org/proceedings/q405/), I have seen signs of a very healthy debate. A picture of standards-based SOA is starting to emerge, based on lessons learnt from the first real-world implementations and deployments. What are those lessons, and where will they lead SOA?

There is a clear pattern to initial SOA implementations. Existing applications are “wrapped” and plugged into a services bus. A registry may be added to enable service discovery, and there may be instrumentation for performance monitoring. This all works fine, because there has been no real change to the underlying enterprise architecture; it has just been expressed in SOA language.
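
As a concrete illustration of this wrapping pattern, the sketch below exposes a hypothetical legacy order-lookup function as a web service using the standard Java JAX-WS annotations. The class names and endpoint URL are invented for the example; a real wrapper would map richer data types, but the shape is the same.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical legacy component -- stands in for an existing application
class LegacyOrderSystem {
    String lookupStatus(String orderId) {
        return "SHIPPED"; // placeholder result
    }
}

// Thin wrapper that exposes the legacy function as a web service,
// without changing the underlying application at all
@WebService
public class OrderStatusService {
    private final LegacyOrderSystem legacy = new LegacyOrderSystem();

    @WebMethod
    public String getOrderStatus(String orderId) {
        return legacy.lookupStatus(orderId);
    }

    public static void main(String[] args) {
        // Publish the wrapped service so it can be plugged into the services bus
        Endpoint.publish("http://localhost:8080/orders", new OrderStatusService());
    }
}
```

Nothing in the legacy logic changes; the application is simply re-expressed in service terms, which is why this first stage is relatively painless.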

The core business value of SOA is in delivering enterprise agility, and it is when companies try to change that the crunch comes. As Michael Liebow, global lead for SOA and Web Services within IBM Global Services, puts it, the business benefit of SOA is in service reconfiguration flexibility, with changes done in days by business people, not in weeks by technical specialists; but this means that the business and technical architectures must be aligned, which today is not the case in most organizations. Expressing an existing application architecture in SOA terms is not enough. The services must be business-oriented if they are to be orchestrated by business people.

Also, initial use of SOA is highlighting the need for semantic interoperability. David Archer, President and CEO of the Petrotechnical Open Standards Consortium (POSC), says that although SOA provides the framework for the integrated cross-company operations and real-time information flow that current developments in the oil industry require, there is a significant semantic interoperability problem that SOA does not address directly. This has led to heavy use of information repositories.

Most SOA implementers are still at the initial wrapping stage. Jamie Cruise, Solutions Architect with Landmark Graphics, says that he would love to be at the stage where legacy applications have been migrated. He looks forward to a time – hopefully within two years – when applications will talk web services natively, and will natively deliver industry-specific functionality.

This new breed of services – we should stop calling them applications – will be very different from the monoliths in use today. They will deliver granular business functions, and fit into a fabric of SOA standards in such a way that they can be orchestrated by business people to deliver enterprise agility.

What will that standards fabric look like? It must cover five areas: the services bus; the registry or repository; event handling; instrumentation; and policy. And the standards must address the semantic level, not just protocol and syntax.

The services bus is the basic communications channel that interconnects services in an SOA. It looks like one big thing that everything else plugs into, but it can be more complex. According to Hans Jespersen, Director of Integration and Web Services with Tibco, an enterprise services bus breaks down into many components, and the current trend is to make these components plug-and-play, with multi-vendor interoperability. It is interoperability through standards that customers are looking for.
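
To see what vendor-neutral plug-and-play can look like at the messaging layer, here is a minimal sketch using the standard JMS API, which lets a component put a message onto a bus destination without binding to any particular vendor's implementation. The JNDI names and the message content are assumptions made for the example.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

// Publishes a message onto a bus destination; the JNDI names are illustrative
public class BusPublisher {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/BusConnectionFactory");
        Destination destination = (Destination) ctx.lookup("jms/OrderEvents");

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageProducer producer = session.createProducer(destination);

        // Any component that understands the destination and message format can
        // consume this, regardless of whose bus implementation carries it
        TextMessage message = session.createTextMessage("<orderStatusChanged orderId=\"42\"/>");
        producer.send(message);

        connection.close();
    }
}
```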

The registry stores descriptions of services, enabling run-time discovery. It is the control center for orchestration. Currently, its contents are usually fairly simple. Manny Tayas, Technical Director for Federal Operations with Systinet, says that enterprises today just want to define what a service is, and how to describe and organize it. But when an enterprise moves to SOA, it exposes different relationships, such as those between producers and consumers, between services and schemas, and between business processes and the services they consume. The enterprise must manage these relationships, or it will not be able to cope with change. A registry helps to define services, but does not describe relationships. For this reason, many people believe that enterprises will need to transition to repositories, which can store a wide range of semantic information, or to the Semantic Web.
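
To make the registry/repository distinction concrete, the toy sketch below (plain Java, with invented names) shows the kind of relationship information a repository would hold beyond a basic service description, and why it matters for impact analysis when something changes.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A toy repository: besides service descriptions (what a registry holds),
// it records the relationships an enterprise has to manage when things change
public class ServiceRepository {
    private final Map<String, String> descriptions = new HashMap<>();    // service -> WSDL location
    private final Map<String, List<String>> consumers = new HashMap<>(); // service -> consuming processes
    private final Map<String, List<String>> schemas = new HashMap<>();   // service -> message schemas

    public void register(String service, String wsdlLocation) {
        descriptions.put(service, wsdlLocation);
    }

    public void addConsumer(String service, String businessProcess) {
        consumers.computeIfAbsent(service, k -> new ArrayList<>()).add(businessProcess);
    }

    public void addSchema(String service, String schema) {
        schemas.computeIfAbsent(service, k -> new ArrayList<>()).add(schema);
    }

    // Impact analysis: which business processes are affected if this service changes?
    public List<String> affectedProcesses(String service) {
        return consumers.getOrDefault(service, new ArrayList<>());
    }
}
```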

Event handling middleware connects the service infrastructure with the outside world, providing real-time input and response. This connection can be made through an event-driven architecture (EDA), which interfaces to sensors and telemetry and provides filtering, aggregation, correlation, and complex event processing. EDA has sometimes been thought of as a competitor to SOA, but in reality the two are complementary. Gartner predicts an increasing role for event processing, with an “era of events” to follow the “era of services”.
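
To give a flavour of what event processing adds, the sketch below is a deliberately simple correlation rule of the kind a complex event processing engine would evaluate: it filters raw sensor readings, aggregates them over a time window, and raises a business-level event when a pattern emerges. The event names and thresholds are invented for illustration.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// A toy complex-event-processing rule: correlate low-level sensor events
// and raise a higher-level business event when a pattern is detected
public class TemperatureCorrelator {
    private final Deque<Long> recentAlarms = new ArrayDeque<>();
    private static final int THRESHOLD = 3;        // alarms needed
    private static final long WINDOW_MS = 60_000;  // within one minute

    public void onSensorReading(double celsius, long timestampMs) {
        if (celsius < 80.0) {
            return;                                // filtering: ignore normal readings
        }
        recentAlarms.addLast(timestampMs);         // aggregation: keep the interesting ones
        while (!recentAlarms.isEmpty() && timestampMs - recentAlarms.peekFirst() > WINDOW_MS) {
            recentAlarms.removeFirst();            // drop events outside the correlation window
        }
        if (recentAlarms.size() >= THRESHOLD) {
            raiseBusinessEvent("EquipmentOverheating");
            recentAlarms.clear();
        }
    }

    private void raiseBusinessEvent(String name) {
        // In a real EDA this would be published to the services bus for orchestration to act on
        System.out.println("Complex event raised: " + name);
    }
}
```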

If business executives are to adapt their IT infrastructure through service orchestration, they need information about what that infrastructure is doing and how it is performing. This is provided by instrumentation. Ease of instrumentation is one of the great benefits of the SOA approach, and the provision of management dashboards is an exciting capability. Standardization at the semantic level is needed to enable design of generic dashboards that will instrument the business-oriented services as well as the infrastructure.
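
As a rough illustration – not any particular monitoring product – the wrapper below shows how little code is needed to instrument a service call and feed a dashboard with call counts and average latency. The class and operation names are assumptions for the example.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;
import java.util.function.Supplier;

// A toy instrumentation wrapper: records call counts and cumulative latency
// per service operation, which a dashboard could then display
public class InstrumentedInvoker {
    private final Map<String, AtomicLong> callCounts = new ConcurrentHashMap<>();
    private final Map<String, AtomicLong> totalMillis = new ConcurrentHashMap<>();

    public <T> T invoke(String operation, Supplier<T> serviceCall) {
        long start = System.currentTimeMillis();
        try {
            return serviceCall.get();
        } finally {
            long elapsed = System.currentTimeMillis() - start;
            callCounts.computeIfAbsent(operation, k -> new AtomicLong()).incrementAndGet();
            totalMillis.computeIfAbsent(operation, k -> new AtomicLong()).addAndGet(elapsed);
        }
    }

    public double averageLatencyMillis(String operation) {
        long calls = callCounts.getOrDefault(operation, new AtomicLong()).get();
        return calls == 0 ? 0.0 : (double) totalMillis.get(operation).get() / calls;
    }
}
```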

Policies are the means by which design-time decisions about security, service levels, etc. are enforced in the runtime environment. For enterprise agility, definition of policies must be separated from their implementation, so that the user does not need to understand the technology. Anjan Mitra, Senior Product Manager with AmberPoint, says that there must be standard policy formats, with composite policies interpreted and enforced by management intermediaries; the issue is how to enable management policies across multi-vendor, multi-product-type implementations.
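
As a sketch of the principle, the toy enforcement point below separates a declaratively defined policy – a required caller role and a response-time target, with invented names – from the code that enforces it. A real deployment would express the policy in a standard format such as WS-Policy and enforce it in a management intermediary rather than in application code.

```java
// A toy policy: declarative settings defined at design time, kept separate
// from the code that implements them; field names are invented for illustration
class ServicePolicy {
    final String requiredRole;
    final long maxResponseTimeMillis;

    ServicePolicy(String requiredRole, long maxResponseTimeMillis) {
        this.requiredRole = requiredRole;
        this.maxResponseTimeMillis = maxResponseTimeMillis;
    }
}

// A toy enforcement point, standing in for a management intermediary
public class PolicyEnforcementPoint {
    private final ServicePolicy policy;

    public PolicyEnforcementPoint(ServicePolicy policy) {
        this.policy = policy;
    }

    public boolean admit(String callerRole) {
        // Security: enforce the declared role; the caller need not know how
        // authentication is implemented underneath
        return policy.requiredRole.equals(callerRole);
    }

    public boolean withinServiceLevel(long observedMillis) {
        // Service level: compare observed latency with the declared target
        return observedMillis <= policy.maxResponseTimeMillis;
    }

    public static void main(String[] args) {
        PolicyEnforcementPoint pep =
                new PolicyEnforcementPoint(new ServicePolicy("partner", 2000));
        System.out.println(pep.admit("partner"));          // true
        System.out.println(pep.withinServiceLevel(3500));  // false -- service level breached
    }
}
```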

These five areas – services bus, registry or repository, event handling, instrumentation, and policy – are where standardization is needed if SOA is to fulfil its promise. As Sam Ceccola, Federal CTO and Chief Federal Architect for Government Systems at BEA Systems, puts it, a single-vendor situation is not healthy; standards are the equalizer.

But standards alone are not enough. The point is to use them to meet business goals. The fundamental architecture in SOA is the business architecture: the services themselves, and the way business processes are codified as services. The real key to successful SOA is not just good technology; it is good architects.

Given a standards-based framework, a good architect can define granular, business-oriented services and, through the infrastructure, give the business people the ability to monitor their operation, and configure them to deliver maximum value in an agile enterprise. That is the future that we look for from SOA.

For more information, please contact Dr. Chris Harding at c.harding@opengroup.org

About the Author

Dr. Chris Harding leads the SOA Working Group at The Open Group, an open forum of customers and suppliers of IT products and services. In addition, he is a Director of UDEF Forum, and manages The Open Group's work on semantic interoperability. He has been with The Open Group for over ten years.

Dr. Harding began his career in communications software research and development. He then spent nine years as a consultant, specializing in voice and data communications, before moving to his current role.

Recognizing the importance of giving enterprises quality information at the point of use, Dr. Harding sees information interoperability as the next major challenge, and frequently speaks or writes on this topic. He is a regular contributor to ebizQ.

Dr. Harding has a PhD in mathematical logic, and is a member of the British Computer Society (BCS) and of the Institute of Electrical and Electronics Engineers (IEEE).


About The Open Group

The Open Group is a vendor-neutral and technology-neutral consortium, whose vision of Boundaryless Information Flow will enable access to integrated information within and between enterprises based on open standards and global interoperability. The Open Group works with customers, suppliers, consortia and other standards bodies. Its role is to capture, understand and address current and emerging requirements, establish policies and share best practices; to facilitate interoperability, develop consensus, and evolve and integrate specifications and open source technologies; to offer a comprehensive set of services to enhance the operational efficiency of consortia; and to operate the industry's premier certification service. Further information on The Open Group can be found at http://www.opengroup.org.