Liquid Cooling vs. Air Cooling: What to Consider

by Ian Seaton | Nov 9, 2016 | Blog

Discussions on the preferred fluid for removing heat from data center equipment have historically been short enough to be conducted through the rolled-down windows of cars passing each other from opposite directions. The consensus has basically boiled down to a couple of decades of “liquid is the only choice,” followed by a couple of decades of “no water in the data center.” While such simplicity of consensus has not yet devolved into roadside fisticuffs, many of us have pulled over into the parking lot for an extended howdy-do. Today’s blog is intended to contribute to those parking lot conversations and cut through some of the sales and marketing rhetoric from the competing air-cooling and liquid-cooling camps. Spoiler alert: air cooling and liquid cooling are both more effective than the proponents of the competing technology would have you believe.

When data center managers are asked what one thing they would most like to change about their data centers, it probably surprises no one that the top two responses are better energy efficiency and better cooling/HVAC systems. However, when these same respondents are asked to identify the most promising new technology for data centers, it may be a little more surprising that liquid cooling tied with DCIM as the top choice.1 Interestingly, only a painfully small minority of data centers today come even close to exploiting the full capability of air cooling, leaving the door obviously ajar for liquid cooling proponents to launch claims of tenfold increases in density and 75-90% reductions in cooling operational costs. While such advantages for liquid cooling definitely exist in comparison to average air-cooling deployments, and especially in comparison to most legacy air-cooled data center spaces, those efficiency and density gaps are much narrower when the comparison is to air-cooled data centers that fully exploit industry best practices. Nevertheless, there are other benefits derived from liquid cooling, in combination with its density and efficiency capabilities, that can make it particularly attractive for some applications.

The Benefits of Liquid Cooling

The ability to support extreme high density has been a differentiator claimed by liquid cooling proponents for some 10-12 years, dating back to when 8kW was pegged as the generally accepted maximum rack threshold for air-cooled data centers, prior to the maturation of airflow management as a science. At that time, the liquid cooling banner was carried by solutions that were not really liquid cooling at all, but which today are more accurately classified as close-coupled cooling: row-based cooling and top-of-rack cooling. As more complete airflow management techniques provided a path to 20kW and even close to 30kW per rack, the density advantage for liquid cooling subsided a bit. More recently, the folks at Intel have been running an air-cooled high-density data center at 1,100 watts per square foot for a couple of years, with rack densities up to 43kW. Does that mean there is no density advantage for liquid cooling? Not necessarily. For starters, the Intel folks needed to come up with special servers and racks that allowed them to pack 43kW of compute into a 60U-high reduced footprint, and they built the data center in an old chip fab with a ceiling high enough to accommodate the massive volume of supply and exhaust air.2 Second, direct-touch liquid cooling solutions can now effectively cool upwards of 60kW per rack footprint, and 80-100kW solutions are available, basically waiting for chip sets that will actually stack to that density. Pat McGinn, Vice President of Product Marketing at CoolIT Systems, has advised me these configurations are in the works.

But liquid cooling solutions should not be pigeonholed into that high-density niche somewhere above the 30-40kW that can be air cooled and below the 80-100kW that can feasibly be configured and deployed. For example, if space is at a premium for both the white space and the mechanical plant, then 100kW of IT could be packed into two or three racks instead of four to fifteen, and the supply water could be cooled through a heat exchanger coupled to the building return side of the chilled water loop. This use of liquid cooling can provide a path to an on-site enterprise data center as an alternative to colocation where there is, in fact, no space or mechanical facility for a data center. In these cases, density becomes the solution rather than the problem.
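To frame that footprint argument with rough numbers, here is a back-of-the-envelope sketch in Python. The per-rack densities are illustrative round numbers consistent with the ranges discussed above, not figures from any specific vendor or facility.

```python
import math

# Illustrative per-rack density assumptions (kW per rack); round numbers
# for discussion only, not vendor specifications.
DENSITIES_KW = {
    "legacy air cooling": 7,            # pre-airflow-management era
    "air cooling, best practices": 25,  # tight containment + economization
    "direct-touch liquid cooling": 50,  # cold-plate class solutions
}

def racks_required(it_load_kw: float, per_rack_kw: float) -> int:
    """Number of racks needed to house a given IT load at a given density."""
    return math.ceil(it_load_kw / per_rack_kw)

if __name__ == "__main__":
    it_load_kw = 100.0  # the 100kW example from the text
    for label, density in DENSITIES_KW.items():
        print(f"{label:30s}: {racks_required(it_load_kw, density)} racks "
              f"at {density} kW/rack")
```

With these assumptions, the same 100kW load lands at 15 racks, 4 racks, or 2 racks, which is the consolidation the paragraph above describes.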

Similarly, the latest Intel production data center experiment involves cooling coils built into the roof of hot aisle containment.3 Those coils are coupled to a highly efficient adiabatic cooling mechanism, resulting in annual savings of over 100 million gallons of water versus tower cooling, with the 1,100 watts per square foot density achieved entirely with commodity IT equipment. While this adiabatic cooling system consumes around 3X the real estate of a tower and chiller plant, the concept does include a path for smaller data centers. Since Intel is meeting their server inlet temperature targets while allowing the “chilled” water to the coil to get up to 80˚F, this same 330kW coil capacity could be plumbed to the return side of a building chilled water loop, resulting in a moderately sized data center with an air-cooled PUE even lower than the 1.06 that Intel is seeing on their 3MW row.

Additional Considerations

Scalability and flexibility are also considerations when evaluating the relative merits of liquid cooling versus air cooling. Phil Hughes, CEO of Clustered Systems Company, cites not needing a custom building and not needing to “re-characterize airflow prior to moves, adds or changes” as some of the less obvious benefits of liquid cooling that can still have an impact on total cost of ownership.

Another consideration is the relative homogeneity or heterogeneity of the IT equipment being deployed in a space. As a general rule of thumb, economies of scale will favor liquid cooling solutions for more homogeneous IT equipment, especially if a customization project is likely required to configure the IT equipment with individual cold plates on each major heat source, or with conductors between every heat source and a single large cold plate. An extreme example of this difference might be a colocation data center with rack-level customers versus a research lab data center where all the IT is being used to run simulations on some model, whether that is a new chip design, an intergalactic weather system, or a cardiovascular system. The heterogeneity of the colocation space would normally be better served by air cooling with all the disciplines of good airflow management, whereas the research data center might be a more reasonable candidate for liquid cooling.

Liquid, Air, or Hybrid Solution?

Finally, there are subsets of liquid cooling and air cooling that should be considered as part of an overall assessment of cooling alternatives. For air cooling, there is the deployment with very tight airflow management, some form of economization, and allowances for operating within a wider band of the ASHRAE TC9.9 allowable server inlet envelope, and then there is bad air cooling. There really is no defensible reason for bad air cooling. On the liquid cooling side, there are systems that remove all the heat by liquid and there are hybrid systems that remove most of the heat by liquid and require some air cooling for whatever remains. In theory, it would seem that a single full liquid cooling solution makes more sense than absorbing the capital expense of both a liquid cooling system and an air cooling system, but that economic analysis needs to be part of basic project due diligence.
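As a sketch of what that due diligence might look like, the hypothetical comparison below totals capital cost plus several years of cooling energy cost for a full liquid solution versus a hybrid (mostly liquid, residual air) solution. Every figure is a placeholder to show the structure of the comparison, not a quote from any vendor or study.

```python
# Hypothetical capital + operating cost comparison: full liquid cooling
# versus hybrid (liquid plus residual air). All figures are illustrative
# placeholders for a due-diligence template.

ENERGY_COST_PER_KWH = 0.10   # assumed utility rate, $/kWh
HOURS_PER_YEAR = 8760
STUDY_PERIOD_YEARS = 5

def total_cost(capex: float, cooling_kw: float) -> float:
    """Capex plus cooling-plant energy cost over the study period."""
    annual_energy_cost = cooling_kw * HOURS_PER_YEAR * ENERGY_COST_PER_KWH
    return capex + annual_energy_cost * STUDY_PERIOD_YEARS

# Full liquid: one cooling system, higher capex, low cooling power draw.
full_liquid = total_cost(capex=900_000, cooling_kw=45)

# Hybrid: liquid system plus a small air system for the residual heat.
hybrid = total_cost(capex=750_000 + 150_000, cooling_kw=45 + 20)

print(f"Full liquid, {STUDY_PERIOD_YEARS}-year cost: ${full_liquid:,.0f}")
print(f"Hybrid,      {STUDY_PERIOD_YEARS}-year cost: ${hybrid:,.0f}")
```

Swapping in real quotes, rates, and duty cycles is the actual due diligence; the point of the template is simply that both capital and operating sides of both systems need to be on the same ledger.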

In addition, there are situations where hybrid solutions might make sense. For example, if an existing data center is woefully inadequate for planned future IT requirements, then rather than building an entirely new data center to accommodate business growth, a space at 100 watts per square foot could be converted to 500 watts per square foot in the same footprint without adding any mechanical facility. Another potentially cost-effective hybrid solution might be a fully integrated product, such as cold plates supplied by return water from an integrated rear door heat exchanger. If that rear door heat exchanger required its own mechanical plant, it might suffer on capital investment versus a full liquid cooling solution; however, if it operated off a building return water loop, this hybrid approach could make a lot more sense. A final element worth serious consideration is access to pre-engineered IT equipment configurations, or your own comfort zone for working with engineered-to-order IT.

PUE Implications

There is one final caveat to evaluating the appropriateness of air cooling versus liquid cooling for a specific application. As noted above, all of the real economic benefits require, to one degree or another, exploring wider thresholds of the ASHRAE allowable temperature envelope or the true server OEM operating temperature specifications. Under those conditions, PUE will not be the final arbiter of the difference between liquid cooling and air cooling. When server inlet air exceeds 80˚F, in most cases the server fans will ramp up and consume a nonlinear increase in energy. That fan energy goes into the divisor of the PUE equation, resulting in higher total energy consumption but a lower PUE; whereas liquid cooling will either eliminate or greatly reduce the fan energy element, thereby shrinking the PUE divisor and potentially producing a higher PUE while total energy consumption is lower. This caveat does not necessarily mean one technology is superior to the other; it is merely another factor to be weighed alongside all the other variables.
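To make the arithmetic concrete, the sketch below works through PUE = total facility energy ÷ IT energy with hypothetical load numbers (not measurements from any cited facility). It shows how an air-cooled room with hard-working server fans can report a lower PUE than a liquid-cooled room even while drawing more total power.

```python
def pue(compute_kw: float, server_fan_kw: float, facility_cooling_kw: float):
    """Return (PUE, total facility kW). Server fan energy counts as IT load,
    so it lands in the divisor of the PUE equation."""
    it_kw = compute_kw + server_fan_kw
    total_kw = it_kw + facility_cooling_kw
    return total_kw / it_kw, total_kw

# Hypothetical air-cooled case: warm inlet air, server fans ramped up.
air_pue, air_total = pue(compute_kw=900, server_fan_kw=90, facility_cooling_kw=60)

# Hypothetical liquid-cooled case: fan energy mostly eliminated,
# modest pump/CDU overhead on the facility side.
liq_pue, liq_total = pue(compute_kw=900, server_fan_kw=10, facility_cooling_kw=65)

print(f"Air-cooled:    PUE = {air_pue:.3f}, total = {air_total} kW")
print(f"Liquid-cooled: PUE = {liq_pue:.3f}, total = {liq_total} kW")
# The air-cooled room shows the *lower* PUE (about 1.061 vs. 1.071)
# even though it draws more total power (1050 kW vs. 975 kW).
```

In other words, when fan energy moves in and out of the IT load, PUE alone can point in the opposite direction from the utility bill, which is exactly why it should not be the sole basis for the comparison.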

Conclusion

In conclusion, anything is better than air cooling with bad or absent airflow management, period. Liquid cooling at this time appears necessary for rack densities of 50kW and up. However, liquid cooling should not be restricted to high-density applications, as it can help overcome a variety of site constraints on air cooling. Rumors of the imminent demise of air cooling are a bit premature, as illustrated by Intel’s successful deployment of 43kW racks with airside economization. When an air-cooled data center can achieve a PUE of less than 1.1, straight economics will not always be a significant differentiator between liquid cooling and air cooling. Nevertheless, a full assessment of the mission of the data center, the applications and variety of IT equipment planned, building constraints, and the stage of life of the data center should reveal a preferred path.

1 Mortensen Construction, “Trends in Data Centers,” Spring 2014, p. 12
2 “Intel IT Redefines the High-Density Data Center: 1100 watts/Sq Ft,” Intel IT Brief, John Musilli and Paul Vaccaro, July 2014
3 “Intel IT: Extremely Energy-Efficient, High-Density Data Centers,” IT@Intel White Paper, Krishnapura, Musilli and Budhai, December 2015
Ian Seaton

Data Center Consultant
