Blanking Panels: How Sealing Small Gaps Can Lead to Big Savings

by | Apr 29, 2015 | Blog

Imagine walking into your computer room—equipment is humming, everything seems to be running normally, but…your overall room temperature is noticeably higher than before!

Previously, this scenario might have caused you concern and prompted an investigation. But now, you happen to know that everything is normal. In fact, it’s not only normal, but your environment is better than when you kept your computer room at a lower operating temperature. How can this be?

With optimal airflow management, this picture is definitely achievable, and without being cost-prohibitive for already tight capital budgets. So, what would it take to achieve optimal airflow management and improve your data center’s Power Usage Effectiveness (PUE), which would then allow you to raise your computer room temperatures?

One important thing to point out is that the goal of all airflow management initiatives is to improve the intake air temperatures to IT equipment. More specifically, to reduce the highest intake air temperatures so that all intake temperatures are as low and even as possible. This, in turn, enables changing the control of the cooling infrastructure to improve efficiency and increase capacity. Achieving these goals requires many steps to optimize airflow and thermal management, and it's worth keeping in mind that optimization is a process, not an event. Our 4 R's Methodology provides a protocol for working through that process. For reference, the 4 R's are: 1. Raised Floor, 2. Rack, 3. Row, and 4. Room.

The topic at hand, implementing blanking panels, falls under the 2nd R (Rack level) and should be executed after airflow inefficiencies in the raised floor have been assessed. By installing blanking panels and properly sealing your IT equipment cabinets, you can realize significant annual cost savings in power demands for your infrastructure cooling. And this airflow management 'best practice' could pay for itself in just a few months. At the same time, you'll be increasing and maximizing the cooling capacity of your existing data center space—all while reducing operating expenses and preserving (or increasing) availability. This could also enable you to defer additional cooling infrastructure capital costs.

While there are many types of blanking panels available on the market today, most of them do not effectively seal the vertical plane along the face of IT equipment intakes. The majority of blanking panels have a 1/16" or larger gap between panels, which, if fully installed in a standard 42U rack, translates to 2.625" of open space (1.5U). This open space, although seemingly minimal, can result in exhaust air circulation, which can reduce the reliability of equipment and unnecessarily reduce the efficiency and capacity of cooling units, ultimately resulting in higher operating costs. Our HotLok® Blanking Panels were specifically designed to completely and effectively seal IT equipment cabinets by creating a complete seal between panels. This unique design eliminates bypass airflow at the rack level and prevents exhaust air circulation, increasing the cooling capacity of your data center infrastructure.
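The gap arithmetic above is easy to verify. A minimal sketch (the EIA-310 opening width of roughly 17.72" is an assumption for the area estimate, not a figure from this post):

```python
# Cumulative open space from small gaps between blanking panels.
# Assumptions: a fully blanked 42U rack with one 1/16" gap per 1U panel,
# the standard 1U height of 1.75", and a nominal EIA-310 opening width
# of ~17.72" (the width is an assumption used only for the area figure).

GAP_IN = 1.0 / 16.0       # gap per panel, inches
PANELS = 42               # 1U panels filling a 42U rack
U_HEIGHT_IN = 1.75        # height of one rack unit, inches
OPENING_WIDTH_IN = 17.72  # assumed panel opening width, inches

total_gap_in = GAP_IN * PANELS            # 2.625" of open vertical space
total_gap_u = total_gap_in / U_HEIGHT_IN  # equivalent rack units (1.5U)
open_area_sq_in = total_gap_in * OPENING_WIDTH_IN

print(f"Total gap height: {total_gap_in:.3f} in ({total_gap_u:.1f}U)")
print(f"Approximate open area: {open_area_sq_in:.1f} sq in")
```

So a rack full of gapped panels leaks through an opening equivalent to a fully missing 1.5U panel.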

We recently commissioned a third-party, two-dimensional Computational Fluid Dynamics (CFD) analysis to study the effect of our blanking panels on airflow patterns and IT equipment intake-air temperatures within server cabinets. Two financial impact case studies were conducted to demonstrate how installing the HotLok® Blanking Panel solution yields cost savings: one for a high-density facility (400 cabinets @ 3.4 MW total critical load) and one for a lower-density facility (175 cabinets @ 600 kW total critical load).

The results? HotLok® Blanking Panels were proven to prevent the circulation of hot exhaust air to IT equipment intakes. By contrast, competitor products that leave small gaps between panels and equipment allowed 19 percent hot exhaust air circulation. This can reduce equipment reliability and unnecessarily reduce the efficiency and capacity of cooling units, which ultimately results in higher operating costs.

The CFD analysis also revealed that HotLok® Blanking Panels (compared to panels that leave gaps) reduced average intake-air temperatures by 7°F (3.9°C). This means that once HotLok® Blanking Panels are installed, you could increase the temperature set points in the computer rooms by 7°F (3.9°C) without affecting the maximum intake-air temperature of IT equipment. Moreover, the cooler a computer room's operating temperature, the greater the likelihood of latent cooling (condensation on the coils), which wastes cooling capacity. For example, you could reduce your energy costs by about 28% simply by setting your computer room temperature at 72°F (22.2°C) instead of 65°F (18.3°C).
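The 28% figure for a 7°F raise works out to roughly 4% of cooling energy per degree Fahrenheit. A rough sketch of that rule of thumb (the 4%/°F factor is an assumed linearization consistent with the example above, not a result from the CFD study; actual savings depend on the cooling plant):

```python
# Rough cooling-energy savings estimate from raising the room set point.
# Assumption: ~4% of cooling energy saved per 1 degree F of set-point
# increase (a common rule of thumb, consistent with the 7F -> 28% example).

SAVINGS_PER_DEG_F = 0.04  # assumed fraction saved per degree F raised

def estimated_savings(old_temp_f: float, new_temp_f: float) -> float:
    """Return the estimated fraction of cooling energy saved."""
    return max(new_temp_f - old_temp_f, 0.0) * SAVINGS_PER_DEG_F

# The example from the post: raising from 65F to 72F.
print(f"{estimated_savings(65, 72):.0%}")  # -> 28%
```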

The bottom line for these two facilities…

  • The high-density facility realized a monthly savings of $11,450 ($137,395 annually)
  • The low-density facility realized a monthly savings of $2,550 ($30,594 annually)
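Putting the savings figures above against the retrofit cost gives the payback period mentioned earlier. A minimal sketch (the panel count per cabinet and the per-panel cost are hypothetical illustrations, not Upsite pricing):

```python
# Simple payback sketch for a blanking-panel retrofit.
# The per-panel cost ($10) and panels per cabinet (20) are hypothetical;
# the $11,450/month savings comes from the high-density case study above.

def payback_months(cabinets: int, panels_per_cabinet: int,
                   panel_cost: float, monthly_savings: float) -> float:
    """Months for cooling-energy savings to repay the panel investment."""
    total_cost = cabinets * panels_per_cabinet * panel_cost
    return total_cost / monthly_savings

# High-density case: 400 cabinets at the assumed panel count and price.
months = payback_months(400, 20, 10.0, 11_450)
print(f"Payback: {months:.1f} months")  # roughly 7 months here
```

Under these illustrative assumptions the investment repays itself in well under a year, in line with the "few months" claim above.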

3 Comments

  1. Barry Barlow

    Do not forget the gap between the 19″ equipment and the side panels, as that is a much larger potential leakage area than between panels.
    Also the gap between the rack and the floor: I have seen racks on adjustable feet with a 50mm gap between rack base and floor. This needs sealing, as it is where the air pressure in a cold aisle is the highest.
    Just a point on the front panels: well-made panels should be designed to IEC 297-1, which stipulates the size of a front panel as being 0.4mm under, meaning the panel should have a clearance of 0.4mm less than the height of the aperture. So 2U-high panels would be 44.45mm x 2 less 0.4mm, and the gaps would be far less than you suggest.

    Reply
  2. Howard Blevins - Upsite

    @ Dean – thanks much for your comments, and feedback on pallet pricing. That is an excellent idea to keep costs lower for large-scale installations. You touched on an important point: the Upsite HotLok is a "premier" product, as noted in this post, and does not allow the gaps that can add up to significant bypass airflow volumes with associated inefficiencies and costs. Of course every data center is different, but a systematic approach to airflow management using the "4 R's" of Raised floor, Rack, Row, and Room, implemented using soundly-engineered products (e.g. KoldLok and HotLok), will result in the ability to realize savings through management of device inlet temperatures, airflow volumes, set points, etc. Again, thanks for your comments!

    Reply
  3. Dean Stoneburner

    I agree that blanking panels are one of the most ignored efficiency steps available, when aisle containment is not compromised by other room and configuration characteristics, and your product is a premier offering. Cost is the only rational sales objection, when you add up the number of these things needed for medium to large data centers (pallet pricing might be useful).

    Reply
