While the aim of cloud computing is far from new, the technologies that enable its reality finally made their debut in 2010. This year we saw the release of VMware vSphere 4.1, which added significant core capabilities that helped solution providers see some results from cloud computing's promise.
Back in 2009, Gartner identified "virtualization" and "cloud computing" as two of its top 10 strategic IT technologies. By the third quarter of 2010, Gartner had repositioned cloud computing slightly past its peak of inflated expectations on the hype cycle, suggesting that this computing approach has finally grown past the vaporware phase and into more general acceptance.
Private clouds: 2011 business opportunity or still in the works?
Yet cloud computing in general is in many ways less exciting than the impending promise of private cloud computing.
While private cloud computing still sits on the rising slope of Gartner's hype cycle, movement in this space in 2010 alone suggests the IT business market is finally recognizing its value. The hardest part, it seems, has been defining the true difference between "cloud computing" and "private cloud computing." That lack of a clear definition has, in my eyes, been the single greatest hurdle to acceptance: Customers simply haven't known what it is.
Looking toward 2011, building recognition and adoption of private cloud computing appears to be at the top of the list for solution providers. In 2010, hardware vendors finally released their first generation of equipment truly designed with virtualization in mind, which is an important first step.
Hardware vendors in cooperation with virtual platform vendors are only beginning to formalize the metrics that constitute resource supply and demand within virtual environments. These cooperative relationships in 2011 and beyond are creating a future where supplies of resources can be accurately quantified along with the demand for resources exerted by running VMs.
Call it the Economics of Resources; it's a definition of private cloud computing you'll be hearing more about in the near term. 2011 may be the year that performance and capacity management in virtual environments finally provide actionable information, and both of these activities are foundational to private cloud computing. You simply can't enable private cloud computing in your customer's environment without quantifying the supply of and demand for resources.
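To make the supply-and-demand idea concrete, here is a deliberately simplified sketch in Python. Real capacity-management tools track far richer metrics (peaks, reservations, overcommit ratios, storage and network I/O); the host and VM figures, field names and the `headroom` function below are all hypothetical, chosen only to illustrate comparing a host's resource supply against the aggregate demand of its running VMs.

```python
# Hypothetical sketch: quantify a host's resource supply versus the
# aggregate demand of its running VMs. All names and numbers are
# illustrative, not drawn from any real capacity-management product.
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    cpu_mhz: int   # average CPU demand of this VM
    mem_mb: int    # active memory demand of this VM

@dataclass
class Host:
    cpu_mhz: int   # total CPU supply of the host
    mem_mb: int    # total memory supply of the host

def headroom(host: Host, vms: list) -> dict:
    """Return remaining supply after subtracting aggregate VM demand."""
    cpu_demand = sum(vm.cpu_mhz for vm in vms)
    mem_demand = sum(vm.mem_mb for vm in vms)
    return {
        "cpu_mhz_free": host.cpu_mhz - cpu_demand,
        "mem_mb_free": host.mem_mb - mem_demand,
        "overcommitted": cpu_demand > host.cpu_mhz
                         or mem_demand > host.mem_mb,
    }

host = Host(cpu_mhz=24000, mem_mb=65536)
vms = [VM("web01", 4000, 8192), VM("db01", 8000, 16384)]
print(headroom(host, vms))
```

The point of the exercise is that once supply and demand are expressed in the same units, questions like "can this host take another VM?" become simple arithmetic rather than guesswork.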
How VDI solutions fit into your 2011 business plan
The new, important virtualization focus areas from 2010 going into 2011 are desktop virtualization and client virtualization. Both are very similar in what they aim to achieve (e.g., better management), but very different in how they accomplish it.
I saw 2010 as the year where desktop virtualization finally realized that it was a solution rather than the solution for the current problems of IT. In the years leading up to 2010, one might have gathered the notion that Virtual Desktop Infrastructure (VDI) solutions would be the definitive answer to IT's biggest problems. But, just like Terminal Services in the decade before, many IT shops and industry pundits began to realize in 2010 that the loftier goal of "application and data delivery" was more important to their business than any specific technology approach.
VDI, as the industry is now discovering, provides only one of a wide range of solutions that fulfill that goal. History, it seems, repeats itself in the IT industry just like everywhere else.
The emerging notion of client virtualization hasn't gotten much industry traction yet, and the application delivery method likely won't in 2011 either.
Client virtualization is much different from desktop virtualization; it seeks to abstract the endpoint device from itself. By doing so, you gain even greater flexibility in migrating users between devices, and in some cases between device form factors. Client virtualization promises to revolutionize IT asset management, if we can just figure out how it'll work. We're getting ever closer, in my opinion, even if we're not there quite yet. Solution providers should keep an eye on this space, even if they'll be watching it for a while before acting.
This past year has, in all honesty, been a bit (pardon the term) boring in the virtualization world. There have indeed been releases, but the vast majority of them were evolutionary, shoring up gaps in features rather than introducing radically new technologies. Pretty much everybody has virtualized to some extent by now, which has (even if not completely) taken away the fun of telling people "how" to virtualize.
The events of 2010 and what I think will happen in 2011, however, will set the stage for the medium-term years to come, where some big changes are anticipated. Those changes will occur in the realms of desktop virtualization (minimally), client virtualization (more so), hybrid cloud computing (not discussed here, but a potential game changer for service delivery) and vast near-term improvements in hardware designed with virtualization in mind.
What should you do in 2011? Where should you point your customers? You should be up to date on virtual hardware improvements and spend the time and money now to get everyone off of old desktop operating systems (OSes) and onto today's technology. That migration might involve virtualization, but it might not. Modern desktop OSes such as Windows 7 are absolutely necessary if your customer needs to keep up with exciting architectures such as desktop virtualization, client virtualization and others that are coming.
About the author
Greg Shields, MVP, vExpert, is a partner with Concentrated Technology. Get more of Greg's tips and tricks at www.concentratedtech.com.