While it’s important to stay up to date with cloud computing technologies, the underlying virtualization layer is still very much a crucial aspect of today’s solution provider business.
Virtualization started out as a technology used mostly in testing and development environments, but in recent years it has become mainstream on production servers, and major vendors are tying their virtualization and cloud offerings together.
It may be a slow process, but customers are starting to build cloud infrastructures, and Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) products largely rely on a virtualized environment. From cutting customers’ physical server costs and improving performance to serving as the foundation of private cloud computing environments, server virtualization has matured greatly. It is quickly becoming a staple of many customers’ long-term IT strategies.
Numbers from a SearchDataCenter.com IT professional survey show that first-time virtualization deployments are down from 9% in 2010 to 6% in 2011, but 59% of those IT pros are already using server virtualization. Fewer first-time deployments doesn’t mean fewer opportunities for VARs, however. Most organizations aren’t close to 100% virtualized, so VARs can still help them virtualize more workloads and improve monitoring, management, backup and security.
Server virtualization considerations
The biggest advantage to virtualizing your customers’ servers is the ability to consolidate several physical servers onto one machine. Servers frequently run with idling CPUs and plenty of memory to spare -- meaning your customers are paying for computing resources they don’t need. Running multiple virtual machines (VMs) on a server lets your customer use more of its capacity.
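To make the consolidation argument concrete, here is a back-of-the-envelope sketch. The figures and the `consolidation_ratio` helper are hypothetical, not from the article; the point is simply that lightly loaded servers leave most of a host’s capacity available for VMs:

```python
# Rough consolidation estimate: how many lightly loaded physical servers
# could share one virtualization host. All figures are hypothetical.

def consolidation_ratio(host_cores, host_ram_gb,
                        avg_vm_cores, avg_vm_ram_gb,
                        headroom=0.25):
    """Estimate how many VMs fit on one host, reserving some headroom
    (here 25%) for spikes and the hypervisor itself."""
    usable_cores = host_cores * (1 - headroom)
    usable_ram = host_ram_gb * (1 - headroom)
    # The tighter of the two resources (CPU or RAM) sets the limit.
    return int(min(usable_cores / avg_vm_cores, usable_ram / avg_vm_ram_gb))

# e.g. a 16-core, 64 GB host, where each old server averaged
# about 1 core and 4 GB of memory actually in use:
print(consolidation_ratio(16, 64, 1, 4))  # prints 12
```

In this illustrative case, a dozen underused physical boxes could collapse onto a single host, which is the kind of math that drives the consolidation pitch.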
A few of the main virtualization challenges are security, storage and VM sprawl. And for all its efficiency gains, virtualization can also be risky because it puts more of your customers’ eggs in one basket. If the host machine breaks or needs to be taken offline, several virtual servers go down with it. You can solve this by setting up a redundant server; if the primary server goes down, the secondary server runs the VMs until the primary one is fixed.
A robust virtualization project should include at least two physical servers that both have access to the same storage device, said Scott Gordon, sales engineer at ActivSupport, a San Bruno, Calif., networking consultancy. Because VMs are just data files, a shared storage system lets both physical servers, or hosts, access the same VMs, making it much quicker to recover if one server fails. A storage area network (SAN) is best for this, since it allows for nearly instantaneous recovery, but a network-attached storage (NAS) device or even backups can bring your disaster recovery plan's recovery time objective (RTO) down to minutes instead of days, Gordon said.
In fact, disaster recovery is one of the common reasons customers cite for adopting server virtualization, said Ty Schwab, founder and senior consultant of Blackhawk Technology Consulting LLC, a Eugene, Ore., IT consultancy. He worked with the Alaska Railroad to set up a virtualized server with an alternate site at a building across the tracks. The system could fail over within an hour, he said.
Performance is an important factor in server virtualization, especially for applications such as databases that require a lot of disk activity. The prevailing wisdom three years ago was that databases should still run on dedicated physical servers, but that line of thinking is changing rapidly, and it’s becoming much easier for VARs to virtualize databases without performance concerns.
The “big three” virtualization products
The three major server virtualization technologies are VMware vSphere, Microsoft Hyper-V and Citrix XenServer. Each offering has a free version and live migration, but it’s up to you to help customers determine which fits best in their environment. Hyper-V R2 tools, for example, may be a better match (and price) than vSphere or XenServer tools. Other virtualization options include the open source Red Hat Enterprise Virtualization (RHEV), which uses KVM virtualization.
VMware is still the server virtualization market leader, but the gap isn’t nearly what it was three years ago because Microsoft and Citrix are putting out free, competitive offerings. VMware vSphere 5, which offers new Auto-Deploy, Profile-Driven Storage and Storage DRS features, is relatively expensive and likely to entice your larger customers. VMware also offers the free vSphere hypervisor based on the ESXi architecture.
Microsoft was late to the virtualization game with Hyper-V, which debuted in Windows Server 2008, and its Quick Migration technology was not a big hit with customers. But Hyper-V R2’s Live Migration technology, features such as Cluster Shared Volumes, and its (technically) free pricing make it an enticing option for some of your customers.
Citrix XenServer 6, now out in beta, will be geared toward public cloud service customers and offers the distributed virtual switch (DVS) as the default option for networking within XenServer. XenServer is based on the Xen open source project, which was owned by XenSource until that company was bought out by Citrix in 2007.