3 Commonly Overlooked ‘Holes’ in Data Center Airflow Management

by Lars Strong | Nov 30, 2016 | Blog

Rising computer room densities and growing business demands are pressuring companies, now more than ever, to reduce operating costs and increase cooling capacity in their data centers. Given that cooling infrastructure often consumes roughly half of total data center power, improving airflow management is the proverbial low-hanging fruit of cooling improvements. Yet while many facilities have implemented common measures such as raised-floor grommets or containment, a few commonly overlooked ‘holes’ in the data center may be limiting your potential cost savings.

Before we jump into these commonly overlooked holes, it’s important to point out that the goal of all airflow management initiatives is to improve the intake air temperatures of IT equipment; more specifically, to reduce the highest intake temperatures so that all intake temperatures are as low and even as possible. This, in turn, enables changing the control of the cooling infrastructure to improve efficiency and increase capacity. Achieving these goals requires many steps to optimize airflow and thermal management, and it helps to remember that optimization is a process, not an event. Our 4 R’s Methodology provides a protocol to follow through this process. For reference, the 4 R’s are: 1. Raised Floor, 2. Rack, 3. Row, and 4. Room.

Initially, a lot of focus was placed on optimizing the raised-floor plenum to help deliver conditioned air to IT equipment. Today, it’s common practice to seal openings in the raised floor, such as those resulting from cable cutouts.

More recently, data center operators have been turning their attention to aisle/row-level airflow management strategies, and both hot- and cold-aisle containment are becoming more common. With all the focus on these aisle-level solutions, however, significant unsealed openings at the rack level are often overlooked. As with the holes in the raised floor, sealing these ‘holes’ in the rack is foundational to other airflow management initiatives, including containment solutions. Two of these overlooked ‘holes’ are inside the rack and one is under the rack.

Open U Spaces

Even though installing blanking panels in the open spaces between IT equipment in racks is a well-known best practice, many data centers have yet to install them, and most have not completed the job. The distance between mounting rails in a rack is approximately 17.7”, and a single U space is 1.75” high, so each open U represents about 31 sq. in. of open area. Assuming a conservative average of 10 open U per rack, an average rack has 310 sq. in., or about 2.2 sq. ft., of open space. For every 100 racks, that is 31,000 sq. in., or 215 sq. ft. This is equivalent to the open area of 215 standard perforated tiles with 25% open area, or about 107 grates with 50% open area.
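As a quick sanity check, here is a minimal Python sketch of that arithmetic. The dimensions and the 10-open-U average are the nominal figures quoted above, not measurements from any particular site:

```python
# Open area from unfilled U spaces (nominal figures from the text above).
RAIL_SPACING_IN = 17.7     # width between mounting rails
U_HEIGHT_IN = 1.75         # height of one rack unit (1U)
OPEN_U_PER_RACK = 10       # conservative average assumed above

per_u_sqin = RAIL_SPACING_IN * U_HEIGHT_IN        # ~31 sq. in. per open U
per_rack_sqin = per_u_sqin * OPEN_U_PER_RACK      # ~310 sq. in. per rack
per_100_racks_sqft = per_rack_sqin * 100 / 144    # ~215 sq. ft. per 100 racks

# A standard 2' x 2' perforated tile at 25% open area passes ~1 sq. ft. of air;
# a 50% open-area grate passes ~2 sq. ft.
tiles_equiv = per_100_racks_sqft / (4 * 0.25)     # ~215 tiles
grates_equiv = per_100_racks_sqft / (4 * 0.50)    # ~107 grates
print(per_100_racks_sqft, tiles_equiv, grates_equiv)
```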

Between Mounting Rails and Edges of Cabinets

The open space between the server mounting rails and the sides, top, and bottom of the enclosure is particularly important to seal. Because these gaps sit so close to IT equipment intakes, server exhaust air often recirculates through them and is ingested by the equipment. In a typical 42U rack, the gaps beside the rails total roughly 74” x 2”, which equates to 148 sq. in., or approximately 1 sq. ft., per rack. That translates to about 100 sq. ft. of open space per 100 racks.
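The same back-of-the-envelope check applies here, again using the nominal 42U dimensions above:

```python
# Open area beside the mounting rails (nominal 42U figures from the text).
side_gap_sqin = 74 * 2                  # ~74" tall, ~2" combined width
per_rack_sqft = side_gap_sqin / 144     # ~1 sq. ft. per rack
print(per_rack_sqft * 100)              # roughly 100 sq. ft. per 100 racks
```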

Under Cabinets

Lastly, the space under a rack is also important to seal. A typical cabinet sits approximately 2” above the raised floor, so at 24” wide, the open area under each rack is 48 sq. in. This equates to 4,800 sq. in., or about 33 sq. ft., per 100 racks.
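And the under-cabinet gap, with the same nominal numbers:

```python
# Open area under each cabinet (nominal figures from the text).
under_sqin = 24 * 2              # 24" wide x ~2" floor clearance
print(under_sqin)                # 48 sq. in. per rack
print(under_sqin * 100 / 144)    # ~33 sq. ft. per 100 racks
```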

Combined, these often-overlooked openings represent 348 sq. ft. of open area per 100 racks, the equivalent of 348 standard 25% open-area perforated tiles. No site would deliberately add that many tiles per 100 racks, and leaving the same amount of uncontrolled open area has a similarly significant, adverse effect on airflow management at both the cabinet and room level. Addressing these overlooked holes in the data center is essential to maximizing the benefits of airflow management initiatives, especially reducing operating costs and increasing cooling capacity.
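Tallying the three per-100-rack figures from the sections above:

```python
# Total overlooked open area per 100 racks (sq. ft.), from the sections above.
open_u, side_gaps, under_cabinets = 215, 100, 33
total_sqft = open_u + side_gaps + under_cabinets
print(total_sqft)                # 348
print(total_sqft / (4 * 0.25))   # equivalent 25% open-area perf tiles: 348
```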

Lars Strong

Senior Engineer
