By Stephen J. Bigelow, Senior Technology Writer
Endpoint computers can be difficult to manage, and desktop virtualization can be the solution. But first you need to know the pros and cons of virtual desktop software.
Every user's desktop is different, and system administrators often have little control over the variety of operating systems and applications running throughout the company. Troubleshooting typically requires significant time to verify drivers and patch levels, and admins have to perform even the most routine software updates on every PC. When client endpoints number in the hundreds, thousands or even tens of thousands, support and maintenance projects can quickly tax solutions providers -- and become costly and time-consuming for customers.
Virtual desktop software is emerging as a means to combat this complexity by shifting the desktop environment from each endpoint to a virtualized desktop hosted on a server in the data center. The server then supplies the desktop environment -- the operating system, applications, settings and more -- to each endpoint PC across the LAN.
Desktop virtualization pros
Desktop virtualization, also called virtual desktop infrastructure (VDI), is based on a thin-client computing model. In effect, the virtual desktop machine on the server side handles all tasks related to the OS, application processing (including visual rendering) and storage; the endpoint PC acts as little more than a "dumb" I/O device.
Thin-client endpoints typically use some local software -- often a basic OS to boot the endpoint and connect it to the desktop server. But some endpoints use a zero-client approach, which requires no local software at all; these endpoints boot from firmware, have no local storage and connect directly to the desktop server.
The implementation of virtual desktop software offers a wealth of benefits for customers. At the hardware level, desktop virtualization saves money by easing endpoint hardware requirements. Existing PCs can serve as thin clients with no real modification, which extends their normal lifecycles. For example, a business may normally replace its PCs every two to three years, but a PC as a thin client may last five years or more.
Similarly, businesses can purchase new thin-client PCs far more economically than conventional machines, because thin-client endpoints don't need the same disk, memory or processing capacity as thick-client PCs. Lower hardware requirements also mean lower energy usage, which fosters power conservation and a greener environment, not to mention a quieter office.
When a conventional PC fails, repairs take time and expertise. Replacing a conventional PC requires the user or technician to reinstall applications, reset preferences and recover data (if possible). In a virtual desktop environment, an administrator can swap out the faulty thin-client PC, and the user can continue working immediately -- significantly reducing the time and expense normally associated with desktop support. In addition, a user can log into a virtual desktop from other endpoints, including remote PCs, without installing software or copying data to multiple systems.
Another notable benefit of virtual desktop software is superior administration. Virtual desktop management happens almost entirely at the server level. When a new desktop is needed, an administrator can provision a pre-established virtual image and deploy a thin-client PC with a minimum of time and effort. The admin can also allocate additional server resources to virtual desktops that require more processing power or scale them back for incidental and non-essential users. Deployment of new software, patches and upgrades for all virtual desktops can happen at the server level, rather than on each individual PC.
Virtual desktop software also lends itself to enterprise security. Conventional PCs store applications and data locally, exposing the computers to viruses and spyware and risking data loss if they are lost or stolen. Virtual desktop machines use storage in the data center (true thin clients don't even have local storage), so there is no data to lose on the endpoint device. This design also centralizes antivirus, antispyware and other security products. Virtual desktops can allow some software installation flexibility for the user, but admins can also lock them down to prevent the installation of new or unauthorized applications.
Even disaster recovery planning is simplified with virtual desktop software. Conventional environments rely on PC users to back up their own data, but end users are notoriously inconsistent or infrequent with their backups, which usually leads to data loss in a disaster. Servers, on the other hand, are backed up routinely and consistently. A business struck by disaster can potentially recover and restore its virtual desktops to servers in a secondary data center and then allow users to access their desktops with little (if any) disruption.
Desktop virtualization cons
Virtual desktop software can solve a multitude of problems for customers, but solutions providers need to consider some of the potential pitfalls before embarking on a desktop virtualization project. One of the first considerations should be the network bandwidth needed to support the traffic at each endpoint.
Ideally, bandwidth requirements are limited to carrying keystrokes and mouse movements to the server and returning screen refreshes to the endpoint. When you multiply this bandwidth by hundreds or thousands of virtual desktops, it's easy to see how crucial network bandwidth is -- even when no application data is being exchanged. Solutions providers should analyze their customers' available bandwidth, determine the requirements of a desktop virtualization project and, if necessary, propose network upgrades in advance.
Graphics is probably the single biggest issue affecting network bandwidth in a virtual desktop environment.
"One of the biggest barriers to widespread adoption of the thin-computing model in the past was … that model did not support a positive user experience if it involved graphics," said Barb Goldworm, president and chief analyst at Focus Consulting, a research and analysis firm in Boulder, Colo.
For example, consider that smooth desktop emulation requires a display refresh rate of 30 frames per second (fps), and a typical desktop may run at 1280 x 1024 resolution with 32 bits per pixel (bpp). That works out to about 1.26 Gbps of uncompressed video data per desktop. Today, protocols such as Microsoft's Remote Desktop Protocol (RDP) can enhance remote graphics behavior while limiting bandwidth use through caching, compression and other techniques.
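The arithmetic behind that figure is straightforward -- a quick back-of-the-envelope sketch using the resolution, color depth and frame rate cited above (real remoting protocols transmit far less, thanks to compression and caching):

```python
# Raw (uncompressed) display bandwidth for one virtual desktop,
# using the example figures: 1280 x 1024, 32 bpp, 30 fps.
width, height = 1280, 1024
bits_per_pixel = 32
frames_per_second = 30

bits_per_frame = width * height * bits_per_pixel       # 41,943,040 bits per frame
bits_per_second = bits_per_frame * frames_per_second   # 1,258,291,200 bps

print(f"{bits_per_second / 1e9:.2f} Gbps per desktop, uncompressed")  # 1.26 Gbps
```

Multiply that raw figure by the number of concurrent desktops and it becomes obvious why compression and caching in the remoting protocol are essential, not optional.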
"If you have very high-end graphics capabilities, you have to have some combination of technologies that address graphics over the network," Goldworm said.
Support for external devices such as scanners, printers and smart cards has also limited virtual desktop adoption, as has support for multiple monitors, bidirectional audio and video, streaming video and USB devices. Today the development of virtual desktop software has overcome most of those limitations, but solutions providers still must consider the performance needs of their customers and ensure that the prospective desktop virtualization platform will behave as expected. Most pre-assessments will include some lab work and evaluation to check compatibility with services or devices.
Solutions providers should also consider the server resources available to support virtual desktops. Servers handle all of the processing and visual rendering for the virtual desktops, so they need significant processing and memory resources. In most cases, multiple servers (or even a blade server chassis) will be appropriate to handle desktop virtualization.
But it's important to note that the total processing requirement is not the simple sum of all desktops. Rather, it reflects the average concurrent utilization across all of the desktops. For example, if you're replacing 100 desktop PCs with 2 GHz processors, you don't need a server with 100 2 GHz processors. In reality, PCs are idle much of the time, and it takes far less processing power to handle most basic operations. Virtual desktop software vendors can typically help solutions providers determine an adequate level of server processing and memory, based on the number of endpoints.
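A rough sizing sketch makes the point. The utilization and headroom figures below are illustrative assumptions, not vendor guidance -- real sizing should be based on measured workloads:

```python
# Illustrative server CPU sizing: scale by average utilization,
# not by the sum of every PC's peak capacity.
num_desktops = 100
desktop_ghz = 2.0        # per-PC processor speed from the example
avg_utilization = 0.05   # assumed: desktops busy only ~5% of the time
headroom = 2.0           # assumed safety factor for usage spikes

naive_ghz = num_desktops * desktop_ghz                              # 200 GHz
required_ghz = num_desktops * desktop_ghz * avg_utilization * headroom  # 20 GHz

print(f"Naive sum: {naive_ghz:.0f} GHz; realistic estimate: {required_ghz:.0f} GHz")
```

Even with a generous safety factor, the consolidated requirement is an order of magnitude below the naive sum -- which is exactly why desktop consolidation pays off.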
Solutions providers and customers should never attempt desktop virtualization projects without considering server redundancy.
"There's the issue of putting all your eggs in one basket," said Brien Posey, an independent technology consultant in Rock Hill, S.C. "If you've got all your desktops hosted on one machine, and that machine goes belly-up, then you're dead in the water."
Server clustering, redundant connectivity and other tactics are essential for any type of virtualization -- but especially so for desktop virtualization, where a server outage can affect the entire user base.
Finally, storage is an issue that often goes overlooked -- until a customer realizes it's out of space. All of the processing is handled centrally, so all of the application data is stored centrally. Each virtual desktop needs the same amount of storage that would otherwise be present on the traditional desktop. For example, if the average PC stores 100 GB -- including the OS, applications and data -- there should be about that much storage available for each virtual desktop; if you're virtualizing 100 desktops, you'll need another 10,000 GB (10 TB) in the data center.
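Unlike processing, storage does scale as a simple sum, as the example's arithmetic shows:

```python
# Central storage scales linearly with the number of desktops:
# each virtual desktop needs roughly what its physical PC held.
num_desktops = 100
gb_per_desktop = 100  # OS, applications and data, per the example

total_gb = num_desktops * gb_per_desktop
print(f"{total_gb} GB (~{total_gb // 1000} TB) of central storage")  # 10000 GB (~10 TB)
```

And that is before accounting for backups, snapshots or growth -- so the data center storage plan deserves the same scrutiny as the server plan.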
This was first published in February 2009