Best techniques for cooling blade servers

Blade servers present a much more complex cooling dilemma to data center engineers than standard servers do. The typical approach to cooling racks is exactly the wrong method for cooling blade servers. Find out what works and how your customers can best plan for future growth.

By Yuval Shavit, Features Writer

Increasing energy costs are forcing companies to cool their data centers more efficiently, and blade servers exacerbate the problem. A single blade rack can draw more than 25 kW, up from just a couple of kilowatts for a standard server rack. Virtually all of that energy is converted to heat, so cooling blade servers often requires taking a fresh look, from the ground up, at how the server room or data center is cooled.

With standard rack servers, cooling was often seen as a game of averages, according to Peter Sacco, president of PTS Data Center Solutions in Franklin, N.J. If you knew how much electricity your client's hardware used, you could calculate how much heat it would produce and how much cool air you would need. You could then pump that air into the room with minimal thought about the specific path it took; hot air from the servers would mix with cool air from the air conditioners, and the room's average temperature would stay within the hardware's operating limits.
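
To make that averages-style calculation concrete, here is a back-of-the-envelope sketch of the airflow needed to carry away a given heat load. The 12 K supply-to-return temperature rise is an assumed design value for illustration, not a figure from the article:

```python
# Back-of-the-envelope airflow estimate for a given heat load.
# The 12 K supply/return temperature difference is an assumption;
# air density and specific heat are standard sea-level figures.

RHO_AIR = 1.2       # air density, kg/m^3
CP_AIR = 1005.0     # specific heat of air, J/(kg*K)

def required_airflow_m3s(heat_load_w, delta_t_k):
    """Volumetric airflow needed to absorb heat_load_w watts with a
    delta_t_k rise from supply air to return air."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

blade_rack_w = 25_000   # a heavily loaded blade rack, W
flow = required_airflow_m3s(blade_rack_w, delta_t_k=12.0)
print(f"{flow:.2f} m^3/s  (~{flow * 2119:.0f} CFM)")  # ~1.73 m^3/s, ~3700 CFM
```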

But mixing hot and cold air is exactly the wrong approach to cooling blade servers, Sacco said. Instead, your goal should be to get a specific amount of cold air to the blade rack as directly as possible and then ventilate the now-heated air out of the room as quickly as possible.

Because blade racks require more precise ventilation, PTS uses computational fluid dynamics (CFD) to model how air will move through the data center given the area's physical properties and cooling capabilities. CFD lets you predict how air will flow around the data center and how different temperatures of air will mix. These models let you determine more accurately how much cold air the data center will need, as well as the best way to deliver it to the servers.
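
Real CFD studies run in heavy-duty 3-D solvers, but the underlying idea -- solving for how airflow, diffusion and heat sources shape a temperature field -- can be illustrated with a toy one-dimensional model. Every number below (aisle length, air velocity, rack heat source) is an illustrative assumption, not a design value:

```python
# Toy 1-D advection-diffusion model of air temperature along an aisle,
# marched to an approximate steady state. A real CFD study solves the
# full 3-D flow; this only illustrates the kind of temperature field
# such a model produces.

import numpy as np

n, dx = 50, 0.2        # 50 cells over a 10 m aisle (assumed)
u = 1.0                # bulk air velocity, m/s (assumed)
alpha = 0.5            # effective turbulent diffusivity, m^2/s (assumed)
dt = 0.01              # time step, s (stable for these parameters)
t_supply = 18.0        # supply-air temperature, deg C

temp = np.full(n, t_supply)
source = np.zeros(n)
source[20:30] = 10.0   # heat injected by a blade rack, K/s per cell (assumed)

for _ in range(5000):
    grad = (temp - np.roll(temp, 1)) / dx                            # upwind advection
    lap = (np.roll(temp, 1) - 2 * temp + np.roll(temp, -1)) / dx**2  # diffusion
    temp += dt * (-u * grad + alpha * lap + source)
    temp[0] = t_supply      # cold air enters at the near end
    temp[-1] = temp[-2]     # simple outflow at the far end

print(f"Peak aisle temperature: {temp.max():.1f} deg C")
```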

Finding qualified mechanical engineers to do CFD calculations is one of the hardest parts of running a server room consultancy, Sacco said. Experienced workers are generally invaluable in training younger employees, but they tend to take the old, mixed-air approach that doesn't work for cooling blade servers. The good news is that the data center designer is becoming the "rock star of the industry," Sacco said, so more mechanical engineers may be drawn to the work.

New problems, new configurations

Because blade servers require more directed cooling, savvy data center engineers are starting to rethink some traditional rules. For instance, some are reducing their reliance on raised floors -- or doing away with those floors altogether -- in favor of in-row cooling. With this approach, engineers set up alternating columns of cold and hot air; instead of cold air moving up through a server, as it does with perforated floor tiles, it travels horizontally, from the back of the rack to the front.

In-row cooling saves money on building a raised floor, and advocates say that concentrating the cold and hot air lets you more easily move cool air through servers. But Sacco said that while in-row cooling is the best approach in some situations, it isn't always. A lot of the talk around in-row cooling comes from APC, which produces in-row coolers, he said.

Another way of cooling blade servers that's getting more and more attention is water cooling, said Don Beaty, president of DLB Associates, an engineering consultancy in Ocean, N.J. With water cooling, cold water is piped to a special water-cooling heat sink, called a water block, on the processor. While a standard heat sink has metal fins to increase the surface area exposed to the air around the server, a water block is a conductive metal block with a pipe running through it. The processor heats the block; cold water flowing through the pipe cools the block back down and carries the heat out to a radiator, where the water is cooled and recirculated. Water carries heat far better than air does, so the cooling process is more efficient. And because the water goes straight to the server, you won't need to worry about hot-cold mixing or CFD.
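
To see why water is so attractive as a heat-transport medium, the sketch below estimates the water flow needed to carry away the same 25 kW rack load discussed earlier. The 10 K coolant temperature rise is an assumed design value for illustration:

```python
# Rough comparison of water vs. air as a heat-transport medium for a
# 25 kW rack. The 10 K coolant temperature rise is an assumed value.

CP_WATER = 4186.0    # specific heat of water, J/(kg*K)
RHO_WATER = 1000.0   # water density, kg/m^3

heat_load_w = 25_000
delta_t_k = 10.0

mass_flow = heat_load_w / (CP_WATER * delta_t_k)        # kg/s
vol_flow_lpm = mass_flow / RHO_WATER * 1000 * 60        # litres per minute
print(f"Water flow needed: {vol_flow_lpm:.0f} L/min")   # roughly 36 L/min

# The same load carried by air (12 K rise) needs about 1.7 m^3/s --
# thousands of times the volume -- which is why water can pull heat
# off a dense rack so much more compactly.
```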


One of the biggest fears with water cooling is that a broken pipe will ruin a server, but liquid cooling is a proven technology, and people are starting to give it a second look. It does carry a higher capital cost, though: many server rooms already have the air ducts and electric sockets that cooling units need, but water cooling requires piping that you'll often have to install. Your client will also face a capital expense for water blocks and for failsafe systems in case of a leak or other malfunction.

Don't expect to have much more free space after upgrading your client's data center, Sacco said. Many people buy equipment like blade servers expecting to free up floor space, but the space those servers free up is mostly taken by the extra cooling units they need, he said.

Cooling blade servers into the future

If your plans for your client's data center call for significant capital investments, you should design the infrastructure to be scalable. Buying cooling units that are big enough to handle the cooling capacity your client expects to need in a few years is a simple approach, but it's expensive and inefficient, Beaty said. Not only will your client have to buy larger, more expensive equipment, but the units will be working below their peak efficiency band, he said.

Instead, plan on using more but smaller units, Beaty said. When you're designing the server room, build in the infrastructure -- such as ducts, water pipes and electric sockets -- for more cooling units than you'll need at first. As the hardware in the server room grows and needs more cooling, whether for blade servers or other types of hardware, your client can simply drop a new cooling unit into one of the unused spots. The process is less straightforward for large, complex data centers, but the principle still applies, Beaty said.
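
The sketch below is a toy model of that trade-off: one large unit sized for the final load versus small units added as the load grows. The part-load efficiency curve is invented purely for illustration, not vendor data:

```python
# Staged-capacity sketch: provision infrastructure for several small
# cooling units and add them as load grows, instead of buying one big
# unit on day one. The part-load efficiency curve is an assumption.

def cop(load_fraction):
    """Toy coefficient of performance: best near full load, poor when a
    unit idles at a small fraction of its rating (assumed curve)."""
    return 2.0 + 2.0 * load_fraction   # COP 2.0 at idle .. 4.0 at full load

def power_draw(load_kw, unit_capacity_kw, units_installed):
    """Electrical power needed to reject load_kw of heat, with the load
    shared evenly across the installed units."""
    fraction = load_kw / (unit_capacity_kw * units_installed)
    return load_kw / cop(fraction)

year_loads = [40, 80, 120, 160]   # projected heat load by year, kW (assumed)

# Option A: one 160 kW unit bought up front.
# Option B: 40 kW units added one per year as the load grows.
for year, load in enumerate(year_loads, start=1):
    big = power_draw(load, 160, 1)
    small = power_draw(load, 40, year)
    print(f"year {year}: load {load:3d} kW  "
          f"one big unit: {big:5.1f} kW  staged units: {small:5.1f} kW")
```

In this toy model the staged units always run near their peak efficiency band, while the oversized unit spends its early years at poor part-load efficiency -- the effect Beaty describes.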

Of course, not all projects are large. A small business will have a small data center, but it's still vital to the business that the servers run well and cheaply. In the final installment of our Hot Spot Tutorial on data center cooling, we'll look at how to approach smaller server rooms.
