By Yuval Shavit, Features Writer
One of the main advantages of blade servers, their space-saving density, is also their biggest disadvantage. Blade servers can be difficult and expensive to keep cool, and if your client needs more than one or two chassis, it may have to make significant investments in cooling equipment. In the final installment of our Hot Spot Tutorial on blades, we'll look at the problems around cooling blade servers and how to address them.
In general, it takes about half as much energy to cool a server as it does to power it, said Dan Olds, principal at Gabriel Consulting Group Inc. in Beaverton, Ore. That's true of blades as well as standard servers, but cooling blade servers often requires a more sophisticated approach to delivering cool air. Many companies, especially small and medium-sized businesses (SMBs), simply pump cold air into nonspecialized server rooms and rely on the law of averages to keep equipment cool. While this approach may work with standard equipment, blades are denser and produce more heat than standard servers. This means your client would either need to pump an exorbitant amount of cool air into the room or come up with more sophisticated ways of directing cool air to the server components that need it, so less cold air is wasted on cooling the room's ambient air.
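That rule of thumb makes it easy to sketch a back-of-the-envelope energy budget. The short Python snippet below applies the article's 50% cooling ratio; the chassis wattage and electricity rate are illustrative assumptions, not figures from the article.

```python
# Rough annual energy-cost estimate for a blade chassis, using the
# rule of thumb above: cooling takes about half as much energy as
# powering the servers themselves.

def annual_energy_cost(it_load_kw, cooling_ratio=0.5, rate_per_kwh=0.10):
    """Estimate yearly electricity cost for the IT load plus cooling.

    it_load_kw    -- power drawn by the servers themselves (kW)
    cooling_ratio -- cooling energy as a fraction of IT energy
    rate_per_kwh  -- electricity price (assumed, varies by region)
    """
    total_kw = it_load_kw * (1 + cooling_ratio)
    hours_per_year = 24 * 365
    return total_kw * hours_per_year * rate_per_kwh

# Example: a hypothetical fully loaded chassis drawing ~5 kW
print(round(annual_energy_cost(5.0)))  # -> 6570
```

Doubling the density of a room roughly doubles both terms of that bill, which is why the cooling line item catches SMBs off guard.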
Of course, more advanced cooling techniques are expensive, and they often require specialized skills as well as technology. Cooling blade servers can be cost-prohibitive for many SMBs, said Peter Sacco, president of PTS Data Center Solutions in Franklin, N.J. Although more than 85% of his clients have bought blades, most of them soon gave up and switched back to standard servers due to the cooling concerns. Extra cooling equipment also takes up space, thus defeating the space savings afforded by the density of the blade servers; at SMBs, the net savings on floor space is often negligible, Sacco said.
If your client does need blades, it may need to invest in cooling equipment and infrastructure. Unfortunately, the most common server room cooling feature, a raised floor, is also one of the most expensive and difficult to install. Depending on the room, a raised floor may be altogether impossible. Instead, you may want to consider setting up hot and cold aisles, in which air is pumped horizontally through servers, front to back, rather than vertically from the floor. In this configuration, server racks are arranged in rows to create aisles of air between them. These aisles are alternately designated for either hot or cold air. A/C units pump cold air into the cold aisles, where it flows through the racks and into the hot aisles before being picked up and pumped outside or back to the A/C units.
Whether you use hot and cold aisles or a raised floor, you may need to look into modeling to figure out how air will flow in the server room and how to optimize it. These calculations, known as computational fluid dynamics (CFD), require specialized mechanical engineering skills, but are often useful for intense cooling situations -- such as if a client is cooling blade servers in several racks. Even if you don't have the staff to provide that in-depth analysis, make sure the server room is at least tidy. Server room clutter inhibits air flow, so organizing a room is a quick and easy way to make cooling blade servers easier.
Although water cooling remains a taboo in many IT departments, it's a powerful option that you should consider for blades. Water is a better conductor of heat than air, making it a better coolant. Better yet, water lets you easily direct the cooling exactly where it is needed, in contrast to air, which you need to pump into a general area and try to direct as well as you can using CFD calculations. One of the main downsides of water cooling is that it requires equipment upgrades, such as water blocks and piping. Some server rooms may not be equipped with water pipes, and installing them may be too expensive.
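The advantage of water comes down to basic physics: per unit volume, water absorbs vastly more heat than air for the same temperature rise. The quick comparison below uses standard physical constants (not figures from the article) to show the gap.

```python
# Compare water and air as coolants by volumetric heat capacity:
# joules absorbed per cubic meter per 1 K temperature rise.
# Density and specific-heat values are standard physical constants
# at roughly room temperature.

WATER = {"density_kg_m3": 1000.0, "specific_heat_j_kgK": 4186.0}
AIR = {"density_kg_m3": 1.2, "specific_heat_j_kgK": 1005.0}

def volumetric_heat_capacity(fluid):
    """Heat absorbed (J) per m^3 of fluid per 1 K temperature rise."""
    return fluid["density_kg_m3"] * fluid["specific_heat_j_kgK"]

ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
print(f"Water absorbs roughly {ratio:.0f}x more heat per unit volume than air")
```

A ratio in the thousands is why a thin water loop at the rack can do the work of a room full of moving air.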
For many clients, the biggest inhibitor to water cooling is simply the fear of letting water into the data center at all, let alone that close to expensive equipment. Water cooling is an established and safe technology, but worst-case "what if" scenarios can turn managers off from it. If your client is hesitant to install full water cooling, you may want to suggest water-cooled server rack doors instead. These doors are essentially radiators that fit onto the fronts or backs of racks and can substantially cool air as it moves through them, making them a great complement to hot and cold aisles.
Of course, the easiest way to get around the difficulties of cooling blade servers is not to install them in the first place. Although virtualization is often coupled with blades, you can also deploy it on standard servers. This will let you reduce the number of physical servers your client needs, which could help slow the same server room growth that blade servers are meant to address. If you provide hosting or managed services, consider offering them as an alternative to on-site servers. This can be especially attractive in a down economy, when companies look to outsource functionality in order to stay flexible and reduce long-term capital expenses.