Does DeepSeek Solve the Problem of How to Power the AI Boom?
There has been a lot of talk over the past year about how much power will be needed to run AI data centers and serve generative AI (GenAI) applications like ChatGPT. Most estimates said the grid would fall scores of gigawatts (GW) short by 2030, and panic ensued over how new data centers could possibly find the energy they needed to operate.
And then along came DeepSeek. This Chinese AI app promised to cut the power GenAI consumes by more than 90%.
“DeepSeek represents a significant advancement in natural language processing, with enhanced contextual understanding, faster model training capabilities, and improved multilingual proficiency that differentiate it from existing AI models,” said Dylan Kane, Managing Director at real estate firm Colliers. “Notably, these improvements were achieved with greater efficiency and lower costs, challenging previous assumptions about the resource intensity required to develop cutting-edge AI models.”
Does this mean the projected power shortfall for data centers is now solved? Do the predictions for data center power growth through the rest of the decade no longer hold – or at least need to be severely downgraded?
There is certainly some short-term fallout from DeepSeek. Companies like OpenAI, Meta, Google, Microsoft, and Amazon are scrambling to catch up fast. But the consensus is that the Chinese announcement is a good thing. It forces other AI developers to find ways to be more energy efficient. That may push back the dire predictions about data centers running out of power by a year or two, but most experts don’t think the big picture will change much.
“Even if the energy consumption rates of DeepSeek prove out broadly, it might impact the timeline for data center capacity predictions but not the amount,” said Dan Brouillette, 15th US Energy Secretary during President Trump’s first term.
AI Adoption Won’t Slow
If anything, developments like DeepSeek and subsequent energy-efficiency breakthroughs from others won't diminish AI traffic on the web; traffic is likely to increase markedly. Consider what happens when you build new roads and freeways: after an initial improvement, congestion eventually gets worse. Why? A theory called induced demand holds that increasing the supply of something makes people consume more of it. Two economists, Matthew Turner of the University of Toronto and Gilles Duranton of the University of Pennsylvania, studied this phenomenon across the U.S. between 1980 and 2000 and concluded that more and better roads make traffic congestion worse.
According to the AFCOM State of the Data Center Report 2025, AI adoption is already soaring. The report reveals that 80% of respondents anticipate significant increases in capacity requirements due to AI workloads, and 64% are actively deploying AI-capable solutions in their facilities. Those numbers are up almost 10% on the previous year and are set to keep climbing over the next couple of years.
“80% of respondents (a dramatic rise from 53% last year) believe that new AI workloads, such as generative AI, will drive increased capacity requirements for the colocation industry, with 40% anticipating a significant increase,” said Bill Kleyman, author of the report and CEO of Apolo.
Faster AI engines, less energy-intensive AI applications, more powerful AI chips – we can expect developments at breakneck speed on all fronts. But thanks to induced demand, they will add up to more traffic and the need for a lot more power for new and existing data centers.
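To make the induced demand argument concrete, here is a back-of-envelope sketch in Python. Every figure in it – the baseline query volume, the energy per query, the size of the demand response – is an illustrative assumption, not data from the report. The point is only that a large enough jump in usage can swallow even a 90% efficiency gain.

```python
# Back-of-envelope model: does a 90% per-query efficiency gain reduce
# total power draw if usage responds to the lower cost? Every number
# here is a hypothetical assumption chosen to illustrate the dynamic.

baseline_queries_per_day = 1_000_000_000  # assumed GenAI query volume
baseline_wh_per_query = 3.0               # assumed energy per query (Wh)

efficiency_gain = 0.90                    # the DeepSeek-style 90% reduction claim
new_wh_per_query = baseline_wh_per_query * (1 - efficiency_gain)

# Induced demand: cheaper queries invite more usage. A 15x jump in
# volume (hypothetical) more than erases the 10x efficiency gain.
demand_multiplier = 15
new_queries_per_day = baseline_queries_per_day * demand_multiplier

baseline_mwh = baseline_queries_per_day * baseline_wh_per_query / 1e6
new_mwh = new_queries_per_day * new_wh_per_query / 1e6

print(f"Baseline energy use: {baseline_mwh:,.0f} MWh/day")                # 3,000
print(f"After efficiency gain + induced demand: {new_mwh:,.0f} MWh/day")  # 4,500
```

Under these assumed numbers, total consumption rises by 50% even though each query uses a tenth of the energy. The real-world multipliers are unknown, but the direction of the effect is what the experts above are pointing to.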
Latency Now, Latency Later?
There is one factor that few are considering. Jeffrey Tapley, Chief Operating Officer of Digital Realty, explained that the large language models (LLMs) currently being trained to power AI applications and improve their accuracy are power hogs. Some crunch through hundreds of billions of data points and deliver a conclusion in near real time. If the big data center providers build facilities dedicated to LLMs and AI applications, they will have to put them very close to the bulk of users to minimize latency, and those facilities must be supported by massive amounts of power.
Once the LLMs are trained and the learning is done, power demand will be much lower – down to about a seventh, according to some estimates. Tapley gave the example of a 1 GW site Digital Realty is building near Dulles Airport to serve the Washington, D.C. area. The site needs lots of power, but the company is struggling to obtain all of it. One solution is natural gas generation, but the nearest gas line is more than six miles away, and running a pipe would require many property easements. If the company spends heavily on that pipe and AI demand then evaporates, it incurs a big loss.
“If you colocate a data center for AI right now, it might not be needed in seven years,” said Tapley. “Instead of AI, which may or may not take off as expected, we continue to focus on metro areas that are already strong or getting stronger.”
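Taking the "about a seventh" estimate at face value, a quick sketch shows why Tapley's caution is understandable. The 1 GW figure comes from the Dulles example above; the rest is simple arithmetic on the quoted estimate, not Digital Realty's own projection.

```python
# Rough arithmetic on the training-vs-inference gap described above.
# The ~1 GW site size comes from the Dulles example; the one-seventh
# figure is the estimate quoted in the article.

training_site_mw = 1_000     # ~1 GW campus sized for LLM training
inference_fraction = 1 / 7   # "down to about a seventh" once models are trained

steady_state_mw = training_site_mw * inference_fraction
stranded_mw = training_site_mw - steady_state_mw

print(f"Steady-state inference load: ~{steady_state_mw:.0f} MW")     # ~143 MW
print(f"Capacity at risk of being stranded: ~{stranded_mw:.0f} MW")  # ~857 MW
```

If those assumptions hold, roughly 850 MW of a training-scale site could sit idle once the models mature – which is exactly the stranded-asset risk that makes providers hesitant to build pipelines and substations around AI demand alone.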
Drew Robb
Writing and Editing Consultant and Contractor