Why Has PUE Remained Flat for So Long After Years of Progress?

by Drew Robb | Sep 13, 2023 | Blog

Global averages from Statista and the Uptime Institute put power usage effectiveness (PUE) at 1.55 in 2022. The Uptime Institute's latest survey puts the figure for 2023 at 1.58.
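For readers new to the metric: PUE is the ratio of total facility energy to the energy consumed by the IT equipment alone, so an ideal facility would score 1.0:

\[
\mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}}, \qquad \text{e.g. } \frac{1.55\ \text{MWh}}{1.00\ \text{MWh}} = 1.55
\]

In other words, a facility at the 2022 average draws 0.55 units of power for cooling, power distribution and other overhead for every unit delivered to servers, storage and networking.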

“PUE has been relatively flat for the last six years after falling dramatically between 2007 and 2018,” said Andy Lawrence, Executive Director of Research, Uptime Institute.

He said that industrywide PUE was as high as 2.5 in 2007. The next few years saw a steady decline all the way to 1.65 in 2014. Between 2014 and 2018 there were incremental gains, bringing PUE averages down below 1.6. Since then, it has hung stubbornly between 1.55 and 1.58.

Chris Brown, CTO of the Uptime Institute, said that the big drop from the 2007 high was largely due to manufacturers working hard to increase the efficiency of their equipment. This was an industrywide push from vendors and data centers to be better global citizens by addressing what appeared to be an insatiable desire for more power.

“Data center power consumption increased as much as 90% in the 2000-2005 period, slowing to 24% in the 2005-2010 period,” said Brad Johns, an analyst for Brad Johns Consulting. “Further, data centers consumed 1.8% of all power in the United States.”

Upon learning of such shocking statistics, data center managers and the vendor community rallied. Power requirements only grew about 5% during the 2010-2020 decade despite the rapid expansion of cloud computing, analytics, IoT, video and streaming. Factors contributing to improving data center energy efficiency included more cores per server, server virtualization and higher capacity hard disk drives (HDDs).

“Static UPS efficiency is now near that of rotary UPS and operations personnel are doing a better job of matching spinning capacity to the demand on the data floor,” said Brown.

Why the Stall?

Yet PUE has stalled. Uptime Institute research indicates that the early PUE gains represented the low-hanging fruit. Vendors and operators were able to make simple yet effective changes: better fans, better capacity management, basic monitoring of temperature and cooling, and refreshing aging gear with newer equipment that met more stringent efficiency standards. All of this brought down electricity consumption sharply and improved the economics of the data center – in addition to the obvious sustainability and environmental responsibility gains.

To get PUE much lower will require far more investment for potentially lower rewards.

“Owners and operators are not managing based on a single variable like PUE; they are managing the entire data center and take into account many metrics,” said Brown. “To achieve a further drop in PUE means active systems that actually modulate equipment based on rack inlet temps, hot spots, cold spots, weather and other factors.”
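To illustrate the kind of modulation Brown describes, here is a minimal sketch of a proportional fan-speed loop driven by rack inlet temperatures. Everything here is illustrative: the function, setpoint and gain are hypothetical, not the API of any real DCIM or cooling-control product.

```python
# Illustrative sketch of an active cooling control loop that modulates
# fan output from rack inlet temperatures. All names and constants are
# hypothetical; real systems expose their own vendor-specific interfaces.

TARGET_INLET_C = 25.0        # illustrative inlet-temperature setpoint
MIN_SPEED, MAX_SPEED = 0.3, 1.0  # fan speed as a fraction of full output

def modulate_cooling(inlet_temps_c: list[float], current_speed: float) -> float:
    """Nudge fan speed up when the hottest rack inlet drifts above the
    setpoint, and back down when every inlet is below it (saving fan energy)."""
    hottest = max(inlet_temps_c)
    error = hottest - TARGET_INLET_C
    # Simple proportional step: 2% of full speed per degree of error.
    new_speed = current_speed + 0.02 * error
    return min(MAX_SPEED, max(MIN_SPEED, new_speed))

# e.g. modulate_cooling([24.1, 26.8, 25.5], current_speed=0.5) -> 0.536
```

A production system would fold in the other inputs Brown lists – hot spots, cold spots, weather – which is exactly why such systems are costlier and riskier than a fixed-speed setup.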

He explained why there may be reluctance to make big capital investments to take PUE to the next level. Beyond the upfront costs, even more efficient gear will probably raise operating costs – and it introduces risk. Active systems for power management and cooling could become faulty and shut down key equipment. Expensive liquid cooling arrays that leak or flood could lead to catastrophic losses.

Basic economics, then, stands in the way of moving the PUE average down to 1.4. Data centers must balance the cost of getting there against the potential savings on the electricity bill and the impact on operating and capital costs.
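A rough back-of-envelope calculation makes the trade-off concrete. The IT load and electricity tariff below are assumptions for illustration, not figures from the article:

```python
# Back-of-envelope: annual savings from cutting PUE 1.55 -> 1.40
# for an assumed 1 MW IT load at an assumed $0.10/kWh tariff.

IT_LOAD_MW = 1.0          # assumed IT load
PRICE_PER_KWH = 0.10      # assumed electricity price, $/kWh
HOURS_PER_YEAR = 8760

saved_mw = IT_LOAD_MW * (1.55 - 1.40)            # 0.15 MW less overhead
saved_kwh = saved_mw * 1000 * HOURS_PER_YEAR     # 1,314,000 kWh per year
print(f"~${saved_kwh * PRICE_PER_KWH:,.0f} per year")  # ~$131,400
```

Against savings on that order, a multi-million-dollar retrofit that also adds operational risk can be hard to justify – which is the economics Brown is pointing at.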

“To date, more expensive technologies haven’t won out,” said Brown. “We expect PUE to stay flat until active AI technologies for cooling mature and there are more installations available that people can see.”

AI, liquid cooling, and automation systems for power consumption are all available. The hyperscalers, for example, tend to harness more of such gear. According to the Uptime Institute survey, 16% of data centers now report PUEs under 1.3. Most are in Europe and North America and largely belong to hyperscalers and large-scale colos. The rest of the industry may stand back and admire them for their innovation, but most operators are waiting for lower prices and lower risk before they forge ahead.

Government Intervention

The wild card in all this is government. Should government intervention ramp up to achieve climate or energy efficiency goals, data centers would be under pressure to institute PUE-reducing measures regardless of cost. Meanwhile, some cities and regions are stiffening permitting requirements for new data center builds and data center expansion. A super-low PUE may become the only way to gain approval in such areas.

Brown wonders, too, if PUE could actually worsen over the next couple of years due to the popularity of high-density racks. And if higher temperatures become commonplace in previously cool regions, PUE could also be impacted.

“Rack density could drive PUE up for a while, which would speed the transition from air to liquid cooling, which would eventually bring PUE back down,” said Brown. “Further gains in the near-term may depend on new investment and the retirement of legacy facilities.”



Drew Robb

Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
