AI Enablement Versus Data Center Energy Management

by Drew Robb | Feb 7, 2024 | Blog

Demand to satisfy artificial intelligence (AI) and high-performance computing workloads can often be in conflict with pressure to reduce energy consumption, costs and greenhouse gas emissions (GHG) in the data center. AI, especially generative AI, exerts intense downstream impacts on data center densities and power demands. 

“AI and its downstream impact on data center densities and power demands have become the dominant storylines in our industry,” said Vertiv CEO Giordano Albertazzi. “Finding ways to support the demand for AI and reduce energy consumption and GHGs is a significant challenge requiring collaboration between data centers, chip and server manufacturers, and infrastructure providers.”

According to the Pure Storage report, “Drivers of Change: Meeting the Energy and Data Challenges of AI Adoption,” 88% of those who adopted AI saw a dramatic surge in the need for computing power, and 74% confirmed that AI required or will require significant upgrades, or a complete overhaul, of their data center infrastructure. Further, 73% of those who have implemented AI were not prepared for its energy requirements.

“Planning for change and ensuring flexibility are key to navigating AI adoption,” said Rob Lee, Chief Technology Officer, Pure Storage. “As power and data demands increase exponentially in the age of AI, investing in and deploying the right AI-ready data infrastructure is not only essential to effective deployment and energy efficiency, but to driving the most value out of AI projects.”

Conflict of Interest

The enthusiasm to accommodate AI workloads in the data center, however, often lies on a collision course with environmental, social and governance (ESG) goals to decarbonize and become more energy efficient. According to the Pure Storage survey, 89% of IT and data center managers expressed concern about AI power requirements impacting their ability to meet their ESG goals. 88% agreed that meeting ESG goals would be impossible without properly preparing their data center infrastructure to support AI initiatives. And among those whose data center infrastructure needed or will need a complete overhaul to support AI initiatives, 62% are under a great deal of pressure to reduce their company’s carbon footprint. 

Watch out for Shadow IT

How can these competing trends be reconciled? The first point to grasp is that ignoring AI, or failing to adopt a policy for generative AI, is a doomed strategy. In fact, it is likely to give rise to a new wave of shadow IT. Why? Traditional concerns around shadow IT revolved primarily around cost control. With unsanctioned use of generative AI services growing rapidly within the enterprise, that risk now extends to intellectual property and customer data being exposed outside the organization.

“Generative AI will become the largest proliferator of shadow IT,” said Heath Thompson, President & GM, Quest Software.

AI Dictates New Builds and Retrofits 

If ignoring AI isn’t an option, then data centers are urged to embrace it – whether through a concerted drive to host AI workloads in the data center, or simply because users will use generative AI apps whether they are encouraged to do so or not.

“Surging demand for artificial intelligence across applications is pressuring organizations to make significant changes to their operations,” said Albertazzi.

However, many older data centers are ill-equipped to support widespread implementation of the high-density computing required for AI. Many lack the required infrastructure, space, or budget for liquid cooling. Hence, Albertazzi predicts that more organizations are going to opt for new construction or bring in prefabricated modular units that shorten deployment timelines. Others will schedule large-scale retrofits that fundamentally alter the power and cooling infrastructure of the data center. No matter how the data center aims to address AI, planning should embrace a combination of more powerful compute capabilities and fundamental changes to energy usage and environmental impact. The right approach can achieve both outcomes.

“Such changes present opportunities to implement more eco-friendly technologies and practices, including liquid cooling for AI servers, applied in concert with air cooled thermal management to support the entire data center space,” said Albertazzi.

Prioritize Environmental Impact

It is possible to reconcile environmental impact and compute power upgrades. It requires that those in charge of data center purchasing are willing to cast a friendly eye on purchase orders for equipment and applications that help measure and reduce emissions or energy usage. Cisco’s Nexus Dashboard, for example, offers ways to measure and analyze the carbon emissions of the data center.

“The Nexus Dashboard provides real-time and historical insights into the energy consumption, energy cost, and GHG emissions of Cisco Nexus switches and other IT equipment in the data center,” said Aruna Ravichandran, SVP of Webex by Cisco. “This helps in forecasting and optimizing emissions for the networking environment.”
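The arithmetic behind such emissions dashboards is straightforward: metered energy consumption is multiplied by a grid carbon-intensity factor for the region. A minimal sketch of that calculation (the intensity figure below is an illustrative assumption, not a value from any vendor):

```python
# Estimate GHG emissions from metered energy use.
# Grid carbon intensity varies widely by region and hour;
# 0.4 kg CO2e/kWh is an illustrative placeholder only.
GRID_INTENSITY_KG_CO2_PER_KWH = 0.4

def emissions_kg(power_draw_kw: float, hours: float,
                 intensity: float = GRID_INTENSITY_KG_CO2_PER_KWH) -> float:
    """CO2-equivalent emissions for a device at a steady power draw."""
    return power_draw_kw * hours * intensity

# A 1.2 kW switch running continuously for a 30-day month:
monthly_kg = emissions_kg(1.2, 24 * 30)  # ~346 kg CO2e at 0.4 kg/kWh
```

Real tools refine this with time-varying grid intensity and per-port telemetry, but the kWh-times-intensity core is the same.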

Explore New Power Schemes

Many data centers will be forced to generate enough compute power to run AI workloads within the same fixed space. Bill Estes, GM, Anderson Power, said one solution might be to reevaluate how power is being used in the data center.

“Supporting a higher current is helpful if you’re limited to a central office power line that tends to have lower voltages,” said Estes. “It’s a way to bring in more power, but there are limits, including size. You can only run so much copper through an opening before you overheat the connector.”
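The copper limit Estes describes follows from basic circuit physics: for a fixed conductor, delivered power is V × I, while heating in the conductor grows with the square of the current (I²R). So raising current into a low-voltage feed buys more power only until the connector overheats, whereas raising voltage delivers more power at the same conductor temperature. A small sketch with illustrative, assumed values:

```python
# For a fixed conductor of resistance R:
#   delivered power  P      = V * I
#   conductor losses P_loss = I^2 * R   (this is what overheats connectors)
# Values below are illustrative assumptions, not real plant figures.

def delivered_power_w(voltage_v: float, current_a: float) -> float:
    return voltage_v * current_a

def conductor_loss_w(current_a: float, resistance_ohm: float) -> float:
    return current_a ** 2 * resistance_ohm

R = 0.01  # assumed round-trip conductor resistance, ohms

# The same 3 kW load served at two distribution voltages:
i_low = 3000 / 120    # 25.0 A at 120 V
i_high = 3000 / 400   #  7.5 A at 400 V

loss_low = conductor_loss_w(i_low, R)    # 6.25 W dissipated in the copper
loss_high = conductor_loss_w(i_high, R)  # 0.5625 W -- ~11x less heating
```

This is why higher-voltage distribution, rather than ever-thicker copper, is the usual path to more power in a fixed footprint.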

Change Power Connectors

Another way to increase power in a limited footprint is to switch to the latest power connectors. Traditional connectors such as the C13/C14 are typically rated at 250 V and 10–15 A. The latest generation, however, offers up to 7X the power in the same footprint, delivering up to 400 V AC or DC at up to 30 A and supporting far higher power density, according to Estes (Anderson Power’s Saf-D-Grid connector is one example).
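The capacity gain here is simple V × I arithmetic; the exact multiplier depends on which baseline rating you assume, so treat the figures below as nominal illustrations and check manufacturer datasheets for real values:

```python
# Rough power-capacity comparison of a traditional C13/C14 connector
# versus a high-capacity connector such as Saf-D-Grid.
# Ratings are nominal illustrations; consult datasheets for actuals.

def connector_power_w(voltage_v: float, current_a: float) -> float:
    return voltage_v * current_a

c14_w = connector_power_w(250, 10)         # 2,500 W (common IEC rating)
saf_d_grid_w = connector_power_w(400, 30)  # 12,000 W

ratio = saf_d_grid_w / c14_w  # ~4.8x against this baseline;
                              # a lower baseline rating yields a larger multiple
```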

Head North

The intersection of the demand for artificial intelligence (AI) and the pressure to reduce energy consumption, costs, and GHGs poses a significant challenge. To reconcile these conflicting areas, a long-term solution involves developing a more efficient model for powering data centers.

“Strategic placement of data centers in naturally cold environments in the northern hemispheres or near water sources has been a recent trend until data center cooling efficiency can be improved,” said Estes.  

 

Real-time monitoring, data-driven optimization.

Immersive software, innovative sensors and expert thermal services to monitor,
manage, and maximize the power and cooling infrastructure for critical
data center environments.

 


Drew Robb

Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
