While the debut of the DeepSeek AI model earlier this week sparked a sharp sell-off in U.S. tech stocks, the model’s gains in processing efficiency could have big implications for data centers.
Shares of market darling Nvidia fell more than 12% and the Nasdaq fell 2.7%, with analysts saying the reaction reflected concerns about whether huge investments in AI and the infrastructure behind it are justified.
Meanwhile, U.S. power and utility stocks fell sharply on reports that DeepSeek’s model undercuts expectations of an AI-driven surge in data center power demand.
Any shift toward cheaper, more powerful and more energy-efficient algorithms has the potential to significantly expand the scope of AI applications, which could ultimately drive demand for large-scale and distributed data center infrastructure.
“If the reports about DeepSeek are true, this will only drive AI innovation forward,” said Mitch Lenzi, vice president of sales and operations at Baxtel, an online platform offering directories and reviews of managed data centers worldwide.
The new model and its reduced deployment costs will enable competitors to optimize their own AI strategies, driving further demand and adoption, he said.
Lenzi said he believes AI advances like DeepSeek will ultimately accelerate, not slow, data center growth.
“Innovation in AI doesn’t reduce demand, it drives it,” he said. “As AI becomes more pervasive and cost-effective, the industry will continue to expand, maintaining demand for high-performance data center infrastructure.”
Sean Farney, vice president of data center strategy at JLL, agreed that the introduction of more efficient AI models like DeepSeek could reshape the data center market.
“That’s great news for the industry,” Farney said. “If someone finds a cheaper, more efficient way to do AI, it lowers the barrier to entry and makes AI accessible to a wider audience.”
Over time, he said, that will drive increased usage and create new opportunities for data center growth.
Farney noted that AI GPU-focused data centers are already the fastest-growing segment of the market, with a compound annual growth rate (CAGR) of 39%, nearly double the overall data center growth rate of about 20%.
“AI-focused facilities are growing much faster than traditional data centers,” Farney said. “With innovations like DeepSeek, we may see an acceleration in this space.”
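For a sense of how quickly that gap compounds, here is a minimal sketch; the five-year horizon and the baseline index of 100 are illustrative assumptions, not figures from JLL:

```python
# Illustrative only: how a 39% vs. 20% CAGR compounds over five years
# from the same baseline (both the horizon and baseline are assumed).
def compound(base: float, cagr: float, years: int) -> float:
    """Return `base` grown at annual rate `cagr` for `years` years."""
    return base * (1 + cagr) ** years

baseline = 100.0  # arbitrary index value
for label, rate in [("AI GPU-focused (39% CAGR)", 0.39),
                    ("Overall market (20% CAGR)", 0.20)]:
    print(f"{label}: {compound(baseline, rate, 5):.0f}")

# AI GPU-focused (39% CAGR): 519  -> roughly 5x in five years
# Overall market (20% CAGR): 249  -> roughly 2.5x in five years
```

At those rates, the AI-focused segment would roughly double in size relative to the broader market within five years.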
The financial implications of this growth are huge: According to Farney, annual spending on infrastructure by major hyperscale data center operators has soared from $200 billion to $300 billion.
“The industry is booming,” he said. “If technologies like DeepSeek make AI applications faster and easier to deploy, we will need more data centers to support this adoption.”
John Dinsdale, chief analyst and research director at Synergy Research Group, noted that generative AI (GenAI) is what has prompted much of the recent rethinking and re-architecting of data centers.
“If technology emerges that can significantly reduce the required power density, this may mean a return to pre-GenAI designs with more traditional cooling and power distribution,” he said.
Dinsdale explained that there is currently considerable investment in GenAI technology and products across the IT ecosystem, and that this will not change in the short term.
“Will some technology emerge that can reduce the power consumption and cost of training and running AI models? Absolutely,” he said. “It’s the nature of technology development and lifecycles.”
When costs go down and capabilities go up, he added, that tends to spur big increases in adoption and usage.
“Take the growth of cloud computing services over the past 15 years,” Dinsdale said.
The role of modular and edge data centers
Farney also highlighted the growing importance of small, modular, and edge data centers in this evolving environment.
While training large AI models will still require large centralized facilities, the growing focus on AI inference (using trained models to provide real-time insights) is likely to drive demand for distributed, latency-sensitive edge data centers.
“As we move into the inference phase of AI, there is a growing need for localized compute power,” Farney said.
Inference typically requires low latency and proximity to users, which makes smaller edge facilities more practical.
“We may end up covering the globe with small 1- or 2-megawatt data centers dedicated to AI tasks,” he said.
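The proximity argument comes down to simple physics: signals travel through optical fiber at roughly two-thirds the speed of light, about 200 km per millisecond, so distance sets a hard floor on round-trip latency. A back-of-the-envelope sketch, with illustrative distances assumed:

```python
# Back-of-the-envelope minimum round-trip latency over fiber.
# Assumes ~200 km/ms signal speed (about 2/3 of c) and ignores
# routing, queuing, and processing delays; distances are illustrative.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time for a request."""
    return 2 * distance_km / FIBER_KM_PER_MS

for site, km in [("metro edge site", 50),
                 ("regional hub", 500),
                 ("distant hyperscale campus", 2000)]:
    print(f"{site} ({km} km away): >= {round_trip_ms(km):.1f} ms round trip")

# metro edge site (50 km away): >= 0.5 ms round trip
# regional hub (500 km away): >= 5.0 ms round trip
# distant hyperscale campus (2000 km away): >= 20.0 ms round trip
```

Real-world latencies run several times these physical floors once routing and processing are included, which is why interactive inference workloads favor nearby facilities.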
Farney envisions a hybrid future where giant hub data centers and distributed edge facilities coexist to meet the diverse needs of AI workloads.
“This is not a zero-sum game,” he explained. “We will see continued growth in large facilities for batch AI training, and a surge in small data centers for inference and real-time applications.”
The case for data decentralization
Phil Mataras, founder and CEO of AR.IO, a decentralized permanent cloud network provider, said that the current centralized data center approach to storing data has shortcomings that decentralized alternatives could address.