What Stands in the Way of Hybrid Cooling Success?

by Drew Robb | Apr 24, 2024 | Blog

Liquid cooling gets all the attention, yet air cooling remains by far the biggest contributor to data center cooling, and it will remain so for some time to come. The Uptime Institute’s 2023 Cooling Systems Survey shows a notable shift in perceptions about the growth of liquid cooling. In 2021, 13% of data center managers believed liquid cooling would become the primary cooling method within 1-3 years, and 35% felt it would take 4-6 years. Both figures have since risen, to 16% and 41% respectively, with the larger jump in the longer 4-6 year window. In other words, while more managers now see liquid cooling as eventually dominant, most expect air cooling to persist for many years to come.

What is emerging is a hybrid cooling approach wherein most data centers will include some liquid cooling in conjunction with traditional air-cooling systems. Some will use only a little, others a whole lot. But no one is going to eradicate air cooling entirely.

“In reality, there is no such thing as 100% liquid cooling; there will always be some air cooling,” said Lars Strong, Senior Engineer and Company Science Officer, Upsite Technologies. “A hybrid approach could be only within the computing enclosure, within the cabinet or within the computer room as a whole.”

But hybrid cooling won’t be easy. Data center managers have decades of familiarity with air cooling techniques. They face many challenges in smoothly integrating liquid cooling into their operations.

IT vs Facilities

IT and facilities personnel have distinctly different skillsets. Sometimes people on one side lack appreciation of the needs of the other.

“IT and facilities have traditionally not collaborated well,” said Strong.

This divide can hurt power usage effectiveness (PUE). One example: a data center that struggled to get perforated floor tiles placed correctly, resulting in inefficient cooling and unnecessary hotspots.
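For readers less familiar with the metric: PUE is simply total facility power divided by IT equipment power, so cooling inefficiencies like misplaced floor tiles show up directly as a higher ratio. A quick illustration (the kilowatt figures below are made up for the example, not taken from the article):

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A perfect (and unattainable) score is 1.0; the overhead above 1.0 is
    mostly cooling and power distribution losses.
    """
    if it_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_kw

# A hypothetical site drawing 1,500 kW overall while IT gear uses 1,000 kW:
print(pue(1500.0, 1000.0))  # 1.5 -- lower (closer to 1.0) is better
```

If wasted cooling pushes total draw to 1,600 kW for the same IT load, PUE rises to 1.6, which is how tile placement problems like the one above become visible on the utility bill.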

On the other side of the coin, IT can occasionally demand server and processing densities that the power and cooling infrastructure can’t support. Or someone adds switches with fans that blow hot air into the cold aisle.

This IT/facilities divide has narrowed over the last decade. But liquid cooling changes the equation: it demands far more coordination, cooperation, and joint action to succeed.

Plumbing needs, for example, escalate. The more liquid cooling involved, the more plumbing lines will sit above and around computer equipment. Both sides need to work together to find the best ways to route those pipes. If they run above the aisles, they must be introduced sensibly, without disrupting the existing cable trays and pathways. The same holds true for underfloor plumbing. All of this requires advance planning and mutual agreement on a strategy for plumbing additions.

Safety is another concern. IT needs reassurance that its valuable hardware won’t be exposed to leaks. And facilities must develop protocols for fast action in a leakage emergency.

“It’s absolutely critical that the teams work together to support data centers and other HPC applications which are more and more the heart of the business,” said Simon Brady, Liquid Cooling Product Manager, Vertiv. “That means understanding and recognizing the opportunities and challenges on both sides.”

In addition, these HPC and AI applications draw more power, which means strengthening and updating the power train throughout. The power and cooling infrastructure also affect building design: floors and ceilings must bear the added weight of water delivery lines, and much larger cables are needed to deliver the required power.

“The C-suite can help to support this relationship, with leadership highlighting the importance and the urgency of the critical data and the timelines required to make the company competitive,” said Brady. “Experienced suppliers can also be a source for smoother transitions in planning and working across multiple teams.”

Education is Essential

Both sides need to be educated in their own roles regarding liquid cooling systems. But they also need basic knowledge of their counterparts’ skills. IT staff need not become skilled plumbers, but they should at least know how to shut off the water to a rack, a row, or the entire data center. Similarly, facilities should learn the best ways to shut off IT systems in an emergency without causing data loss or impacting customers.

“This industry has been hindered by the silos that grew out of budget division, time, and other necessities,” said Carrie Goetz, Principal/CTO at StrategITcom and author of books such as Jumpstart Your Career in Data Centers. “The best organizations work across budget silos with all parties involved. Education and knowledge are the perfect sunlight in this scenario.”

Colocation Divide

Unfortunately, the tendency for businesses to colocate some of their IT assets has widened the IT/facilities gap in some organizations. IT personnel can grow used to leaving most of the heavy lifting and facilities work to colos. When that happens, they become somewhat divorced from the realities of power distribution, cooling mechanics, and space planning. If they then decide to add liquid cooling to an existing data center, they may be naïve about what is involved and how they need to interact with facilities personnel.

The solution here is to get more involved with your colo providers and ask them to show you the facilities setup as well as the servers and racks. That at least provides some education in how things work in the real world.

Conclusion

Some may decide that they can ignore liquid cooling for now and continue as before. After all, Uptime Institute figures show that liquid cooling is implemented in only 11% of data centers: 6.8% use water-cooled cold plates, 4.2% full immersion, 1.4% dielectric-cooled cold plates, and 0.3% partial immersion. But those figures are shifting rapidly. Hybrid cooling is coming. Strong’s advice is for IT and facilities to make an effort to get to know each other better.

“Liquid cooling demands tighter cooperation,” he said. “Begin by introducing water cooling carefully and gradually to gain experience with it.” 

Strong suggested that data center managers closely follow what large colos and hyperscalers are doing to learn from their mistakes and successes as they iron out the kinks of deployment and maintenance. And take the opportunity to tour their facilities if possible.

The industry's first and only tool-less containment solution!

AisleLok® is the industry’s first modular containment solution,
proven to provide the core benefits of containment with greater flexibility and value.


Drew Robb

Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
