Cloud Computing

Is it better to use virtualization for some business apps than the cloud?

According to a recent survey quoted on this blog by DevX, companies are more likely to virtualize their apps than move them to the cloud. Does it make sense to do this, and if so, for what types of applications?

7 Replies

  • It makes sense, since the move to the Cloud requires virtualization to prepare the ground. The transition to the Cloud affects people, and as such it should be done carefully.

  • A lot has been said about this topic over the past months on this very forum, so I won't rehash the details. Suffice it to say that in most cases virtualization is the stepping stone to a Cloud. Typically a virtualized data center leads to what is known as a "private" Cloud, which ultimately helps companies figure out realistically what makes sense to farm out to a "public", "hybrid", or "community" Cloud and what to keep in house.

    The above analogy is akin to saying that "babies are more likely to crawl than to walk and run". Motherhood and apple pie, anyone? :)

  • Well, first define "the cloud". Let's assume for a moment you mean a general cloud computing environment without regard to location. Based on that premise, yes, there are some business applications that would not in any way benefit from being deployed in what is essentially an elastic environment, but may perhaps benefit from being virtualized.

    It may also be the case that there is no reason for the application to be virtualized other than to afford it mobility within a specific environment - not because *it* needs the mobility, but because *other* applications may need its resources while it is not using them. Business apps that are only utilized on a periodic basis, for example, may provide value in being virtualized simply because they are easier to shut down, start up, and move around without the headaches typically associated with a deployment. The resources such an app consumes (those dedicated to it via a virtual machine) may be needed to serve other applications during the "down times" in which the periodic application simply isn't used, so being unavailable is not an issue (a minimal sketch of this pattern closes out this reply).

    Applications are not islands of business capability; they operate within a much larger operational and business context that should be given more consideration than just whether or not the app itself would benefit from virtualization and/or cloud computing.
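
    To make that periodic-workload point concrete, here is a minimal sketch, assuming a libvirt-managed hypervisor and a hypothetical VM named "monthly-reporting", that starts the VM only for its processing window and shuts it down afterwards so its dedicated resources are free for other applications. The library calls are standard libvirt-python; the domain name and schedule are placeholders, not anything from this discussion.

        # periodic_vm.py - start/stop a periodic-workload VM around its usage window
        # Assumes libvirt-python is installed and a domain named "monthly-reporting"
        # is already defined on the local hypervisor (both are hypothetical here).
        import sys
        import libvirt

        DOMAIN_NAME = "monthly-reporting"  # hypothetical periodic business app VM

        def set_vm_state(should_run: bool) -> None:
            conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
            try:
                dom = conn.lookupByName(DOMAIN_NAME)
                if should_run and not dom.isActive():
                    dom.create()       # boot the VM for its processing window
                elif not should_run and dom.isActive():
                    dom.shutdown()     # graceful shutdown frees CPU/RAM for other apps
            finally:
                conn.close()

        if __name__ == "__main__":
            # e.g. from cron: "start" before the monthly run, "stop" once it finishes
            set_vm_state(sys.argv[1] == "start")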

  • Talking about cloud and virtualisation together usually invites people to draw a false equivalence between two things that - at least in my mind - are entirely different in their impact and importance. Virtualisation is a technology that can increase efficiency in your data centre and which might be leveraged by some cloud providers as well. That's nice, and it can reduce the costs of hosting all your old cack in the short term. Cloud, on the other hand, is a disruptive shift in the value proposition of IT and the start of a prolonged disruption in the nature and purpose of businesses.

    In essence, cloud will enable organisations to share multi-tenant business capabilities over the network in order to specialise in their core value. Whilst virtualisation can help you improve your legacy mess (or make it worse if done badly), it does nothing significant to help you take advantage of the larger disruption; it just reduces the costs of hosting applications that are going to be increasingly unfit for purpose due to their architecture rather than their infrastructure.

    In this context I guess it's up to people to decide what's best to do with their legacy apps - it may make sense in the short term to move them onto virtualised platforms for efficiency's sake (should it cost out) in order to clean up the mess during the transition stage, or perhaps (as Lori suggests) to manage many periodic workloads more easily.

    In the longer term, however, people are going to have to codify their business architecture, make decisions about their core purpose, and then build new cloud services for key capabilities whilst integrating 3rd party cloud services for non-differentiating capabilities. In this scenario you need to throw away your legacy and develop cloud-native, multi-tenant services on higher-level PaaS platforms to survive - in which case VMs have no place as a unit of value, and the single-tenant legacy applications deployed within them will cease to be necessary. In that context the discussion becomes a strategic one: how aggressively will you adopt cloud platforms, what does this mean for the lifespan of your applications, and how will it impact the case for building a virtualised infrastructure (I'm assuming it's a question of internal virtualisation rather than IaaS, due to the nature of the original question)? If it doesn't pay back, or you're left with fairly stable applications already covered by existing kit, then don't do it.

    Either way - don't build new systems using old architectures and think that running them in a virtualised environment 'future-proofs' you; the future is about addressing a set of higher-level architectural issues related to delivering flexible, multi-tenant and mass-customisable business capabilities to partners in specialised value webs. Such architectural issues will increasingly be addressed by higher-level platform offerings that industrialise and consumerise IT to reduce the issues of managing the increasingly complex list of components required to deliver business systems (also mentioned as a growing issue in the survey). As a result, your route to safety doesn't lie in simply using less physical - but equally dumb - infrastructure.

  • It's hard to picture how it is better to use one than the other, when both could be perfectly applicable. I would expect virtualization to be a commonly accepted precursor to leveraging cloud-based environments for applications.

    There will still be some over-utilized business app resources that are also resistant to both - it doesn't make sense to try to replicate a 2TB mainframe as a VM, or stick it in a public cloud...

  • Please allow me a somewhat personal comment.
    Not only the types of applications, but also the project phases, come into my considerations.

    I have one production environment on Amazon EC2, and to make it happen, I have ten virtualized development and testing environments locally on VMware Workstation. During the development and testing phases, it has been quite convenient to move back and forth among different application versions, systems, and/or system versions.

    If something goes really wrong, I can simply leave that environment, create a fresh, clean one, and continue my work (a minimal sketch of this closes out this reply). Later, I can come back to the broken environment and spend more time analyzing it when that becomes possible.
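
    To illustrate the "fresh clean environment" step, here is a minimal sketch using the boto3 SDK, assuming the replacement environment is launched on EC2 from a known-good image; the AMI ID, key pair, and instance type are placeholders, and this is not the exact tooling described above.

        # fresh_env.py - launch a clean environment from a known-good image
        # Assumes boto3 is installed and AWS credentials are configured;
        # the AMI ID and key pair name below are placeholders.
        import boto3

        def launch_fresh_environment() -> str:
            ec2 = boto3.resource("ec2", region_name="us-east-1")
            instances = ec2.create_instances(
                ImageId="ami-0123456789abcdef0",  # hypothetical known-good AMI
                InstanceType="t3.micro",
                KeyName="dev-keypair",            # hypothetical key pair
                MinCount=1,
                MaxCount=1,
                TagSpecifications=[{
                    "ResourceType": "instance",
                    "Tags": [{"Key": "Purpose", "Value": "fresh-dev-env"}],
                }],
            )
            instance = instances[0]
            instance.wait_until_running()  # block until the new VM is up
            instance.reload()              # refresh attributes such as DNS name
            return instance.public_dns_name

        if __name__ == "__main__":
            print("New environment ready at:", launch_fresh_environment())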

  • I think that Lori and Ian dot the i’s and cross the t’s here.

    Virtualization and Cloud are adjacent and complementary technologies, but I do not consider them alternatives to each other.

    Furthermore, one of the key advantages of Cloud, besides efficiency and cost on the server side (thanks largely to elasticity), is ubiquitous deployment to users – and when it comes to business apps this would imply an RIA client.
    Both elasticity and RIA design are really a different cup of tea from Virtualization.

    This being said, Kengaku’s comment shows pragmatically how virtualization is very useful and appropriate for business apps at large.
