In the Absence of a Liquid Cooling Supply Chain, Air Cooling Becomes All the More Vital

by Drew Robb | Feb 5, 2026 | Blog

Back in 2012, I splurged on the latest and greatest Sony Vaio laptop: the largest available solid state drives when they were new and exciting, a carbon fiber casing, and the best processors and memory. I had never spent so much on a laptop. And I loved it, until a hinge broke about a month after the initial warranty ran out.

As it was a bleeding-edge kind of product, the supply chain was not in place for the part I needed. My computer repair technician finally found a replacement hinge in Denmark at high cost. A few months later, the other hinge broke. That was the end of my prized laptop.

The point of this story is to demonstrate:

  • How fragile supply chains can be when you are dealing with next-generation products like GPUs, AI servers, liquid cooling, Direct-to-Chip (DtC) cold plates, and related equipment.
  • The problems inherent in bringing a new technology to market. There may be aspects of it that are not quite ready for prime time.

After all, the entire liquid cooling landscape is quite new. Yes, liquid in one form or another has been around for decades. But the current wave of excitement, driven by GPUs and new AI factories, remains at the early adopter stage and is just now trying to ramp up for a mass audience. The supply chains for massive scale are not there yet.

Supply scarcity starts with the Graphics Processing Unit (GPU). Nvidia's order backlog for AI chips adds up to $300 billion or more. Due to the huge orders already placed by hyperscalers and large-scale data center developers, you may have to wait a year or two to get the GPUs you urgently need. To make matters worse, Nvidia expects its backlog to grow in the coming year.

What about DtC, which is composed of several elements? Coolant distribution units (CDUs) distribute water or another fluid coolant to computer systems and chips. Some units sit on top of or beside the rack. Larger-scale systems have CDUs that run liquid cooling lines to entire rows of IT gear. Large manifolds link the CDU to a network of pipes and cooling loops that carry cold water or other fluids to the chips and return hot liquid to the CDU to be cooled. Then there are the cold plates, typically made of aluminum or copper due to their excellent thermal conductivity. They act as the heat exchange interface between the hot GPU and the cooling liquid.
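The loop described above is, at its core, an energy-balance problem: the CDU must move enough coolant to carry away the rack's heat. A back-of-envelope sketch of that sizing follows, using only the standard relation Q = ṁ·cp·ΔT; the rack wattage and temperature rise in the example are illustrative values I chose, not figures from the article.

```python
# Back-of-envelope cooling loop sizing: how much water flow a CDU must
# deliver to absorb a given heat load at a given supply-to-return
# temperature rise. Physics only (Q = m_dot * c_p * dT); the 100 kW rack
# and 10 K rise below are hypothetical.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def flow_lpm(heat_watts: float, delta_t_k: float) -> float:
    """Liters per minute of water needed to carry away heat_watts
    with a delta_t_k rise between supply and return."""
    kg_per_s = heat_watts / (CP_WATER * delta_t_k)
    return kg_per_s * 60.0  # ~1 kg of water per liter

# Hypothetical 100 kW AI rack with a 10 K supply-to-return rise:
print(f"{flow_lpm(100_000, 10):.0f} L/min")  # roughly 143 L/min
```

Halving the allowed temperature rise doubles the required flow, which is one reason CDU and pump capacity scale up so quickly as rack densities climb.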

According to analyst firm Omdia, CDU manufacturing is expected to increase more than sixfold between 2023 and 2027. That is a leap from about $150 million in revenue to $1 billion by 2027. During that same short interval, cold plate revenue is forecast to rise from less than $200 million to about $1.4 billion. Revenue from manifolds and related equipment, including fittings, piping, and valves, is projected to increase from about $250 million to almost $1.8 billion. That is an incredible rate of expansion for any new technology, and one that is certain to uncover teething troubles, quality challenges, and order backlogs.

Omdia says cold plate unit volume has been ramping up from a little over a million units a year and is expected to reach 4 million by the end of 2025. If the industry meets that target, it will be a monumental achievement. But within three years, demand from manufacturers is expected to exceed nine million units. The big cooling OEMs are certainly ramping up. But how likely are they to keep up with AI demand? Time will tell.
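To put those projections in perspective, the compound annual growth rates they imply can be worked out directly. The revenue figures below are the Omdia numbers quoted above (using $200 million as the cold plate starting point, which the article gives only as an upper bound); the CAGR formula and segment labels are mine.

```python
# Rough growth-rate check on the Omdia revenue projections quoted above.
# 2023 -> 2027 spans four growth years.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by start and end revenue."""
    return (end / start) ** (1 / years) - 1

segments = {
    "CDUs":        (150e6, 1.0e9),   # ~$150M -> ~$1B
    "cold plates": (200e6, 1.4e9),   # <$200M -> ~$1.4B
    "manifolds":   (250e6, 1.8e9),   # ~$250M -> ~$1.8B
}

for name, (start, end) in segments.items():
    print(f"{name}: ~{cagr(start, end, 4):.0%} per year")
```

Every segment lands above 60% annual growth for four straight years, which is the arithmetic behind the article's warning about teething troubles and backlogs.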

The Strategic Role of Air Cooling

DtC is being relied upon to keep chips cool. But the supply chain may not be able to deliver at the expected volume. What is likely to happen, therefore, is a reassessment of cooling strategies. DtC will be deployed broadly. But a hybrid cooling arrangement could evolve, where developers and data center designers configure facilities to optimize their cooling needs: certain parts of the data center packed with DtC, while other areas are heavily augmented by the latest in efficient air-based cooling.

Omdia placed the threshold between air cooling and DtC at roughly 80 to 100 watts per square centimeter. In the event of a shortage or long lead times for liquid cooling components, expect data centers to maximize the cooling potential of their computer room air conditioner (CRAC) and computer room air handler (CRAH) units, using airflow management best practices to cope with the excessive heat generated by AI workloads and GPUs.

Affordability comes into the equation, too. Economics will dictate which workloads justify heavy liquid cooling investment and which can be supported by air cooling or a hybrid arrangement of air and liquid.

The industry's easiest to install containment!

AisleLok® solutions are designed to enhance airflow management,
improve cooling efficiency and reduce energy costs.


Drew Robb


Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
