By John Michelsen, Founder and Chief Architect, iTKO
There has been a great deal of industry buzz around "virtualization"
as it relates to SOA. This is with good reason, as business continually drives
IT to accomplish more with fewer resources.
Huge rewards are being reaped by eliminating servers from enterprise IT operations.
But outside the data center, for the rest of the SOA application lifecycle,
our ability to benefit from virtualization only goes so far. When you consider
test and development resources, and the availability of actual implementations
for validating and developing SOA applications, you realize that we need to
extend virtualization into the distributed software components and services
running in those environments.
Let's define how virtualization techniques and technologies can be applied
in today's enterprise development and deployment lifecycles.
The first and most often mentioned type of virtualization is hardware virtualization.
This is not an SOA-specific thing. This is when you're running many copies of
the operating system within one physical hardware device so that you can get
independence of those several virtual machines from each other.
I certainly like this technology, especially in an SOA, as this can help with
deployment efficiencies and versioning of services. The more I can isolate the
configurations for all the individual applications running on a server to support
my services, the better.
It would be ideal if I could share one server-class machine among, let's say,
10 different services. But if I tried to put them all on one operating system,
I would inherently face configuration challenges due to the co-location of the
services on one box. This creates conflict when different teams want the optimal
environment for their service implementations. In fact, sometimes the
configurations they need are downright mutually exclusive.
That's where virtualization of hardware comes in. I still get that one server
box, but within it I set up 10 different operating system installs on one physical
infrastructure, and I am able to independently manage all of those different
machines, each having its own configuration.
SOA testing consultants will often find customers are able to move forward
with a lot more confidence by validating their deployments using automated testing
in conjunction with virtualized hardware. If the new configuration or build
fails the battery of tests in production, they simply roll back to the previous
configuration and treat the failed configuration as a virtualized test bed for
further refinement.
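To make the roll-back pattern concrete, here is a minimal Python sketch. The hook functions (`deploy`, `rollback`) are illustrative placeholders for whatever provisioning tooling manages your virtual machines; none of these names come from a real product API.

```python
def run_test_battery(endpoint, tests):
    """Run every automated test against the endpoint; True only if all pass."""
    return all(test(endpoint) for test in tests)

def promote_or_rollback(new_snapshot, previous_snapshot, tests, deploy, rollback):
    """Deploy the new VM snapshot; keep it if the tests pass, else restore
    the previous known-good configuration (hypothetical hooks)."""
    endpoint = deploy(new_snapshot)
    if run_test_battery(endpoint, tests):
        return new_snapshot          # new configuration is now live
    rollback(previous_snapshot)      # tests failed: restore the old config
    return previous_snapshot         # the new snapshot stays a test bed
```

The key design point is that the previous snapshot is never destroyed until the new one has survived the full battery of tests.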
The second type of virtualization is virtual endpoints. In a sense, what you're
doing is creating a virtual location that your consumers access to invoke the
service, while they remain completely shielded from the actual endpoint of the
service itself. So there is more decoupling between the consumer and the
producer with that kind of virtual service endpoint.
There are a number of upsides to this approach. First, no one wants hard-coded
URLs from machine to machine within the client side of their applications. As
a service producer, I may need to start scaling up to multiple hardware devices
or server clusters, or I may need to create geographically distributed servers
to support the growing consumption of my service over time.
The intermediary can then use runtime policy or infrastructure availability
rules to determine which of the potential endpoints the request is actually
delivered to. In doing this, I'm able to manage and model how I want my consumers
to hit the services that I produce. I can actually put policy around how, how
frequently, and when that access happens. I can even start to distribute it.
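As a rough illustration of what such an intermediary does, here is a minimal Python sketch in which the routing policy is simple round-robin. Real intermediaries apply much richer runtime policies; the class and names here are purely hypothetical.

```python
import itertools

class VirtualEndpoint:
    """Illustrative intermediary: consumers call one stable address while a
    policy decides which real producer endpoint handles each request."""

    def __init__(self, producers):
        # producers: callables standing in for real service endpoints
        self._producers = itertools.cycle(producers)

    def invoke(self, request):
        producer = next(self._producers)   # policy: simple round-robin
        return producer(request)
```

Because consumers only ever see the `VirtualEndpoint`, producers can be added, moved, or geographically distributed without touching any client code.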
SOA testers will want this same level of flexibility in how they invoke and
verify the behavioral and performance integrity of these services. By tying
into UDDI repositories, SOA tests are dynamically pointed to the appropriate
virtual service location, ensuring that the test workflow remains supported.
In addition, the service provider can specify a "test channel" or
test version for their services, so a test can identify itself as such to the
registry and be provided access to that specialized channel, in cases where
simulated transactions may not cause the same effects as live transactions.
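A hedged sketch of the registry idea: the dictionary below stands in for a UDDI-style registry, and the service names and URLs are invented for illustration. A test identifies itself as such and is resolved to the test channel instead of the live endpoint.

```python
# Hypothetical registry: maps each service to its live and test channels.
# Structure and URLs are illustrative, not an actual UDDI data model.
REGISTRY = {
    "OrderService": {
        "live": "http://orders.example.com/v1",
        "test": "http://orders-test.example.com/v1",
    },
}

def resolve(service_name, is_test=False):
    """Return the endpoint for a service; tests are routed to the test
    channel so simulated transactions never hit the live service."""
    channels = REGISTRY[service_name]
    return channels["test"] if is_test else channels["live"]
```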
So virtual endpoints are a great idea as you think about the scalability and
the loose coupling we want in an SOA.
Virtual Services: Eliminating Constraints from SOA
The third type is virtual services themselves: services that don't actually
exist -- you construct them without actually implementing them in a development
tool. This newer use of virtualization offers extensive utility for SOA testing,
as well as providing value by streamlining development and deployment practices
as a whole.
Virtualized services are especially important to achieving the dream of agile
SOA testing: shorter, iterative, requirement-driven test cycles with testing
every step of the way. Why? Because if you want to test earlier, you will need
to test incomplete components, or "in progress" integrations. SOA
applications are particularly prone to change, so if you have to wait for a
finished app to test, that wait becomes a bottleneck to agility.
For instance, iTKO's LISA Virtualize product incorporates this approach. You
can start by analyzing a Web Services Description Language (WSDL) and from that,
generate a running service endpoint. What this means is that before your development
group has even shown up to build the actual service, the product virtualizes
the service and creates a simulated version of the real thing (which doesn't
exist yet) -- such that your consumers can invoke this service as if it already existed.
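The idea can be sketched in miniature. Assuming the WSDL has already been parsed down to a list of operation names (real WSDL parsing with a SOAP toolkit is considerably more involved), a stub responder can answer any declared operation before the real service exists. Everything here is illustrative, not the LISA product's API.

```python
def make_virtual_service(operations):
    """Given operation names parsed from a WSDL, return a callable that
    answers any declared operation with a simulated acknowledgement and
    faults on undeclared ones -- a stand-in until the real service ships."""
    ops = set(operations)

    def service(operation, payload=None):
        if operation not in ops:
            return {"fault": f"unknown operation {operation}"}
        return {"operation": operation, "status": "simulated-ok"}

    return service
```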
Now, in a simplistic world this can be done in a variety of ways with mock objects,
or with other methods that provide a specific hard-coded response whenever they
are stimulated. But in reality you are going to need a fairly dynamic service,
one that captures a lot of variability in terms of data and responses, even
in a simulation. You can't spit back the same silly response to every single
request, because your consumers are going to need much more richness and variability
in the way that the service works.
So the ideal approach to SOA testing allows you to not only virtualize the
service, but to make the actual behavior of the service very dynamic (i.e.,
reading from database tables or spread sheets for values, or doing look-ups
based on monitoring all input requests and output responses). While it sounds
like you are actually building the service (which in some ways you are), in
reality you are providing just enough high-productivity logic within the
virtualized service that the consumers get what they need, without having
to wait for the actual service to be completely developed and deployed.
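As a toy illustration of data-driven responses, the table below is an in-memory stand-in for a database table or spreadsheet; the incoming request fields key into it, so different requests get different, realistic answers rather than one hard-coded reply. The operation names, accounts, and figures are all invented.

```python
# Illustrative response table; in practice this would be loaded from a
# database table or spreadsheet, or recorded from live request/response traffic.
RESPONSE_TABLE = {
    ("getBalance", "acct-1001"): {"balance": 2500.00, "currency": "USD"},
    ("getBalance", "acct-1002"): {"balance": 75.10, "currency": "USD"},
}

def virtual_service(operation, account_id):
    """Return a simulated response keyed on the request fields, falling
    back to a fault the way a real service would for unknown data."""
    try:
        return RESPONSE_TABLE[(operation, account_id)]
    except KeyError:
        return {"fault": "unknown account"}
```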
In fact, there's a great "test first" model here where developers
of the actual service will know they're done when tests written against the
virtualized service work on the actual service. The consumers are confident
that they will get what they need from the actual service when the virtualized
service has been sufficiently executed against. And with that virtualized service
testing out of the way, you can now point the actual service consumer to the
actual service producer, and therefore have a much more successful integration
from the very first touch.
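The "test first" hand-off can be expressed as a small sketch: the same test suite runs against the virtualized service first, and then against the actual one once it exists. The function names here are hypothetical.

```python
def run_suite(service, tests):
    """Execute each test against the given service implementation."""
    return [test(service) for test in tests]

def ready_for_integration(virtual_service, actual_service, tests):
    """Developers know they're done when the tests written against the
    virtualized service also pass against the actual service."""
    assert all(run_suite(virtual_service, tests)), "virtual model incomplete"
    return all(run_suite(actual_service, tests))
```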
About the Author
John has over fifteen years of experience as a technical leader at all organization levels, designing, developing, and managing large-scale, object-oriented solutions in traditional and network architectures. He is the chief architect of iTKO's LISA automated testing product and a leading industry advocate for software quality. Before forming iTKO, Michelsen was Director of Development at Trilogy Inc., and VP of Development at AGENCY.COM. He has served as Chief Technical Architect at companies like Raima, Sabre, and Xerox while working as a consultant. Through work with clients like Cendant Financial, Microsoft, American Airlines, Union Pacific, and Nielsen Market Research, John has deployed solutions on technologies ranging from the mainframe to the handheld device.