When to Retrofit the Data Center to Accommodate AI, and When Not to

by Drew Robb | Mar 6, 2024 | Blog

Legacy facilities are often ill-equipped to support widespread implementation of the high-density computing required for artificial intelligence (AI). Many lack the power and cooling infrastructure needed for liquid cooling and higher rack densities. Where it makes sense, new construction is the ideal solution. Otherwise, those wishing to take advantage of this new market will be forced to retrofit some AI capabilities into their existing footprint. That could turn out to be a smart strategy for some and a bad idea for others.

Let’s take a look at who should forge ahead with AI-based retrofits and who should think twice. Here are some factors to consider:

Budget

Denser racks, liquid cooling, more power, planning, and renovations or new construction – it all costs money. Many won’t be able to justify it economically. After all, much of the AI opportunity is still potential rather than committed revenue. How many data centers have existing AI demand, or signed contracts for AI workloads, sufficient to cover the cost of a retrofit?
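
To make that question concrete, a rough payback estimate helps. The sketch below is a hypothetical back-of-envelope calculation – the cost, revenue, and margin figures are illustrative assumptions, not figures from Omdia or this article – that simply divides the retrofit investment by the expected annual margin from contracted AI workloads.

```python
# Hypothetical back-of-envelope payback estimate for an AI retrofit.
# All figures are illustrative assumptions, not industry benchmarks.

retrofit_cost = 2_500_000          # power, cooling, and rack upgrades (USD, assumed)
contracted_ai_revenue = 1_200_000  # annual revenue from signed AI workloads (USD, assumed)
operating_margin = 0.35            # fraction of that revenue kept as margin (assumed)

annual_margin = contracted_ai_revenue * operating_margin
payback_years = retrofit_cost / annual_margin

print(f"Annual margin from contracted AI work: ${annual_margin:,.0f}")
print(f"Simple payback period: {payback_years:.1f} years")
```

If the payback period stretches well beyond the length of the contracts actually signed, the retrofit is being justified on potential rather than demand.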

The good news is that budgets may be freeing up a little. According to Omdia, 74% of enterprises raised their IT budgets over the course of 2023. Less than 5% decreased their IT spending. This is being fueled by increased spending on cybersecurity, as well as AI. Omdia noted that many enterprises are moving GenAI solutions from evaluation and lab stages into production environments. That means more cash will be made available for data center investment in some cases.

Risk

There is always risk in introducing new capabilities. Those considering retrofitting data centers to accommodate AI should weigh how much risk the move poses within their specific industry. In industries such as finance, risk management, and pharmaceuticals, where advanced capabilities are essential, AI tools are likely to become crucial in the years ahead.

“The cost and effort involved in retrofitting might outweigh the benefits for businesses that do not have the scale or resources to fully leverage AI capabilities,” said David Beach, Datacom Market Segment Manager, Anderson Power. 

Management Backing

C-suite executives are getting hit by a barrage of hype about AI. Some data centers may be able to capitalize on this to obtain funding for much-needed upgrades. Whether it is more power and cooling, denser racks, or more GPUs, purchase orders that cite an AI use case may receive a friendlier reception. Management interest in data center infrastructure is likely to remain high due to the ongoing AI-hype storm.

Size Matters

Smaller data centers are likely to be at a disadvantage unless they have a business model already tailored to high-performance computing (HPC) and demanding workloads. Omdia expects colocation businesses, including both multi-tenant and single-tenant data center providers, to be the ones riding this wave of new AI growth. Some have already adapted their data center designs to enable higher rack power density.

“The data center market has a heightened awareness of practical applications for AI that promise to improve productivity and lower costs,” said Alan Howard, Principal Analyst at Omdia. “The colocation providers able to provide the highest rack densities and access to liquid cooling will now have the upper hand in the market for data center space.”

As a consequence, the colocation industry is expected to be worth $65.2 billion by 2027, a five-year annual growth rate of 9.4%, according to Omdia’s Colocation Services Tracker. Howard added that not all data centers can handle AI or HPC equipment, whereas hyperscalers and large colos have been anticipating this emerging AI-fueled growth trend.

Cooling Overhaul

Data center cooling overhauls are generally complex and expensive. Those wishing to beef up their racks to accommodate AI workloads may be forced to consider direct-to-chip liquid cooling or even immersion cooling in some cases. This requires special data center plumbing designs and even complete reconfiguration of aisles.
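
To give a sense of the plumbing involved, the sketch below estimates the coolant flow a direct-to-chip loop would need to carry the heat of a single dense rack. The rack load and temperature rise are assumed example values; the underlying relationship (flow = heat load / (specific heat × temperature rise)) is standard thermodynamics.

```python
# Rough sizing sketch for a direct-to-chip liquid cooling loop.
# Rack load and coolant temperature rise are assumed example values.

rack_load_kw = 50.0   # heat to remove from one dense AI rack (assumed)
delta_t_c = 10.0      # coolant temperature rise across the rack (assumed)
cp_water = 4.18       # specific heat of water, kJ/(kg*K)

# Required mass flow: Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)
mass_flow_kg_s = rack_load_kw / (cp_water * delta_t_c)
volume_flow_l_min = mass_flow_kg_s * 60   # roughly 1 kg of water per litre

print(f"Mass flow: {mass_flow_kg_s:.2f} kg/s")
print(f"Volumetric flow: {volume_flow_l_min:.0f} L/min per rack")
```

Multiply that flow by dozens of racks and the scale of new piping, manifolds, and pump capacity becomes clear – and with it, why aisles may need to be reconfigured.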

“Achieving these advanced data center operating characteristics are not for the faint of heart, or those companies with an aversion to high capital expenditures,” said Howard.

Angela Taylor, Director of Global Strategy at LiquidStack, added that high demand for generative AI justifies investment in liquid cooling for those facilities in a position to add more power and boost data center density.

“Nearly every data center owner and operator is under pressure to make the necessary changes to enable AI,” said Taylor. “Those who are not planning to incorporate generative AI and liquid cooling into their offerings will be unable to compete with those who are taking those steps to upgrade their existing and future infrastructure.”  

Start Small

For many data centers, the expense required to get into the AI business may be daunting. Unless the business case is clearly there, the best approach is to start small. One option is to take the densest rack currently in the data center and upgrade it with all the processing power and memory it can hold. Additional cooling for that one rack may be manageable.

The original design of the data center, though, could well be the deciding factor. Most data centers are designed around an average IT power load per rack, say 5 to 10 kW. This means that everything upstream of the IT rack, including PDUs, power cabling, breakers, low- and medium-voltage switchgear, UPSs, and even transformers, is sized to accommodate that amount of power. Because AI workloads can easily drive IT rack densities to 50 kW per rack and higher, data centers with inadequate power infrastructure should refrain from retrofitting.
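
As a quick sanity check on that upstream sizing, the sketch below compares the power a row was designed for with the load it would carry after a handful of AI racks are added. The rack counts and densities are hypothetical placeholders chosen to match the ranges above, not figures from the article.

```python
# Hypothetical check of whether existing upstream power infrastructure
# (PDUs, breakers, switchgear, UPS) can absorb a few AI racks.
# All rack counts and densities below are assumed example values.

designed_racks = 20        # racks the row was originally designed for (assumed)
design_density_kw = 8.0    # average design load per rack (within the 5-10 kW range)

ai_racks = 4               # dense AI racks to be added (assumed)
ai_density_kw = 50.0       # per-rack load typical of AI training gear

design_capacity_kw = designed_racks * design_density_kw
proposed_load_kw = (designed_racks - ai_racks) * design_density_kw + ai_racks * ai_density_kw

print(f"Upstream capacity (as designed): {design_capacity_kw:.0f} kW")
print(f"Load with {ai_racks} AI racks: {proposed_load_kw:.0f} kW")
if proposed_load_kw > design_capacity_kw:
    print("Over design capacity -- upstream power needs upgrading before retrofitting.")
else:
    print("Within design capacity -- a small retrofit may be feasible.")
```

In this illustrative case, even four 50 kW racks roughly double the row’s designed load – exactly the situation in which a retrofit should be reconsidered.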

Those that are space or power constrained can get around this to some degree through modularization. Modular units housed in shipping containers can be deployed in the parking lot beside the current facility and used for AI and HPC workloads to test the market or satisfy one or two initial clients. If that proves lucrative, more modules can be added, or another crop of dense racks installed, with greater confidence.

“Data centers can deploy modular containers with specialized AI cooling and infrastructure, and place them adjacent to their building, such as in their parking lot or on their rooftop,” said Taylor. “This allows for the expansion of AI capabilities without encroaching on the limited internal space or disrupting current business.”


Drew Robb

Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
