5 Airflow Management Fixes To Save Money In The Data Center
Keeping servers cool means cold air is constantly being blown around data center facilities. But much of that air never fulfills its purpose, leaking out of server aisles through various openings or circulating around the server room in ways that do nothing to eliminate hot spots. Whether through higher cooling costs, higher power costs from pulling in outside air or higher server failure rates, poor airflow design isn’t just leaking air – it’s leaking money. Facility managers can cut data center costs with a handful of straightforward fixes that control the leakage of cold air and direct airflow around server rooms more effectively.
Around 10 years ago, an Uptime Institute and Upsite Technologies study found that an average of 60 percent of computer room cooling capacity was going to waste, escaping through cable openings and misplaced perforated tiles. A follow-up study by Upsite Technologies last year found little improvement, estimating that an average of 48 percent of cold air still escapes through these routes. The study also noted that airflow management efforts were often piecemeal, carried out in ways that undermined their effectiveness. Perhaps more importantly, real cost reductions require broader approaches than simply sealing leaks. On average, data center facilities have four times the cooling capacity they actually need, which means that many of the more granular fixes have only limited effects without a broader, room-level view. Nonetheless, each step is an important component of saving money through better airflow management.
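The scale of that oversizing can be expressed as a ratio of running cooling capacity to the heat load it actually serves, in the spirit of Upsite’s Cooling Capacity Factor. The sketch below is an illustration with made-up numbers, not Upsite’s exact methodology; the 10 percent overhead allowance for lights and other room loads is an assumption.

```python
def cooling_capacity_factor(running_cooling_kw: float, it_load_kw: float,
                            overhead: float = 0.10) -> float:
    """Ratio of running cooling capacity to IT load plus a small allowance
    (assumed ~10%) for lights, people and other room heat sources.
    A value near 1.0 means cooling matches the load; much higher values
    suggest oversized or poorly utilized cooling."""
    return running_cooling_kw / (it_load_kw * (1 + overhead))

# Hypothetical room: 400 kW of running cooling serving a 100 kW IT load.
ccf = cooling_capacity_factor(400, 100)
print(f"CCF = {ccf:.2f}")
```

A result well above 1 does not mean the extra capacity can simply be switched off, but it is a signal that airflow fixes, rather than more cooling, are the likelier savings lever.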
“Many airflow deficiencies can be expected in legacy data centers that have gone through numerous modifications, rearrangements and refreshes of server equipment,” a 2010 report from the Department of Energy’s Federal Energy Management Program noted. “Air leaks, obstructions, perforated tile locations, cable penetrations and missing blanking plates can cause poor air distribution that can be remedied with low-cost solutions.”
Hot And Cold Aisle Containment
Aisle containment has become a fundamental part of data center design over the last decade, but it bears repeating that arranging server racks to avoid mixing hot and cold air improves cooling performance. If all server rack rows face the same direction, the hot air pushed out of one rack becomes part of the intake for the row behind it, making it harder to keep server temperatures uniform and raising the overall temperature of many of the servers.
“Orient rows in a hot/cold aisle layout to prevent the mixing of hot and cold air,” a recent Schneider Electric report stated. “This way, you’ll have more uniform IT inlet temperatures, reduce hot spots, and significantly lower your electricity consumption.”
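The effect of that recirculation can be shown with a toy mixing model (a sketch with assumed numbers, not a CFD result): if a fraction of each row’s hot exhaust is drawn back into the inlets of the row behind it, inlet temperatures creep upward row by row.

```python
def inlet_temp(supply_c: float, exhaust_c: float, recirc_fraction: float) -> float:
    """Inlet air modeled as a simple mix of cold supply air and
    recirculated hot exhaust (toy model, not CFD)."""
    return (1 - recirc_fraction) * supply_c + recirc_fraction * exhaust_c

supply, delta_t = 18.0, 12.0   # assumed: 18 C supply, 12 C rise across servers
t_in = supply
for row in range(1, 4):        # three rows all facing the same direction
    t_in = inlet_temp(supply, t_in + delta_t, recirc_fraction=0.3)
    print(f"row {row}: inlet {t_in:.1f} C")
# With hot/cold aisle containment, recirc_fraction approaches zero and
# every row sees the same 18 C supply air.
```

Even in this crude model, a 30 percent recirculation fraction pushes the third row’s inlet roughly 5 C above the supply temperature, which is exactly the kind of hot spot containment is meant to prevent.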
Installing Floor Tiles And Blanking Panels
As the Upsite Technologies study suggested, misplaced floor tiles are a major source of air leakage, and facilities with raised floors need to replace missing tiles and reposition misplaced ones. Additionally, server rooms can reduce air mixing by replacing perforated floor tiles in hot aisles with solid tiles, the Schneider Electric report noted. Similarly, blanking panels – thin sheets of metal or plastic used to block off open server rack spaces – are essential for sealing empty rack space, keeping airflow predictable and maintaining the isolation of hot and cold aisles.
Close Cable Openings
Cable cutouts are the other common source of leaks, as the Upsite study noted, and cables can let air escape in myriad ways. Floor cutouts for cables can leak up to 35 percent of cold air, according to Schneider Electric. Routing cables overhead is therefore a good way to eliminate floor cutouts and clear space under the floor. If overhead cabling is not an option, data center managers can still eliminate much of the waste by sealing the cutouts.
“[An] air isolation best practice is installing brush grommets on your raised floor to conceal the holes created by power and network data cables,” Focus on Energy executive Geoff Overland wrote in a column for WTN News. “The grommets look like a narrow box with a bristle brush opening. Your cables will be able to pass through your raised floor while the grommet will prevent air from escaping.”
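The scale of leakage through an unsealed cutout can be estimated with the common HVAC orifice approximation, in which air velocity through an opening scales with the square root of the pressure difference across it (roughly 4005 fpm per square root of one inch of water column, for standard-density air). The dimensions, plenum pressure and discharge coefficient below are assumptions for illustration.

```python
import math

def cutout_leakage_cfm(area_sqft: float, underfloor_dp_inwc: float,
                       discharge_coeff: float = 0.65) -> float:
    """Estimated airflow through an open floor cutout using the standard
    HVAC orifice relation: velocity (fpm) ~= 4005 * sqrt(dP in in. w.c.),
    scaled by an assumed discharge coefficient for a ragged opening."""
    return 4005 * discharge_coeff * area_sqft * math.sqrt(underfloor_dp_inwc)

# Hypothetical 6 in x 12 in cutout (0.5 sq ft) at 0.05 in. w.c. plenum pressure.
leak = cutout_leakage_cfm(0.5, 0.05)
print(f"~{leak:.0f} CFM escaping per unsealed cutout")
```

At these assumed conditions, a single open cutout bleeds off a few hundred CFM of conditioned air, which is why brush grommets pay for themselves quickly in rooms with dozens of cable penetrations.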
Clean Up Cables And Obstructions
Poorly organized cables and other haphazardly placed infrastructure can also disrupt airflow and make cooling systems less effective. Removing obstructions, clearing pathways and keeping aisles as organized as possible minimizes the work required for cold air to reach servers. One possible fix is to modernize the server racks along with the servers themselves.
“Many of the messes in data centers are because deeper equipment with far more cable connections just doesn’t fit in the cabinets anymore,” Robert McFarlane, a principal at consulting and technology design firm Shen Milsom and Wilke LLC, told TechTarget. “So the doors are open, cables are hanging out and airflow is blocked.”
Raising Data Center Temperature Setpoints
The traditional assumption in data center cooling is that the air needs to be heavily chilled to keep servers cool. In practice, much of this cooling is excess capacity needed to compensate for leakage and poor airflow design. Very low temperatures can themselves cause new problems, in the form of energy usage spikes and humidity inconsistencies, Schneider Electric noted. Overheating problems are generally isolated to specific areas and can be resolved through some of the airflow management fixes covered above. With each successive improvement, companies will likely be able to set the temperature in their data center a little higher, reducing the energy cost of cooling without reducing the amount of cold air that actually reaches servers.
In general, many data centers can safely be kept slightly warmer than they currently are, as ASHRAE guidelines now allow considerably higher temperatures for IT equipment than was once assumed, and newer hardware is increasingly resilient to heat. As a result, companies can benefit from experimenting with temperature settings, according to Schneider Electric. While getting every dimension of airflow and cooling right is an ongoing process, starting with these fixes is a great way for companies to begin cutting costs and improving the way cold air is handled in their data center facilities.
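A commonly cited rule of thumb, not taken from this article’s sources and best treated as a rough planning number, is that chiller energy drops by roughly 2–4 percent for each degree Fahrenheit the setpoint is raised. A sketch of the arithmetic, assuming a compounding 3 percent per degree:

```python
def cooling_energy_after_raise(baseline_kwh: float, degrees_f: float,
                               savings_per_deg: float = 0.03) -> float:
    """Estimated annual cooling energy after raising the setpoint,
    assuming a compounding ~3% savings per deg F (rule-of-thumb
    assumption, not a measured figure)."""
    return baseline_kwh * (1 - savings_per_deg) ** degrees_f

# Hypothetical: 1,000,000 kWh/yr of cooling energy, setpoint raised 5 deg F.
after = cooling_energy_after_raise(1_000_000, 5)
print(f"estimated cooling energy: {after:,.0f} kWh/yr")
```

Under these assumptions a modest 5-degree raise trims cooling energy by roughly 14 percent, which is why setpoint changes are usually the payoff step after the leakage fixes above make them safe.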