Over the past two years, American tech giants have been racing to build data centers, the “backbone” of AI, but signs of a slowdown are emerging.
Amazon Web Services (AWS), a subsidiary of Amazon, has suspended negotiations on some new data center leases, especially overseas, according to a report by US bank Wells Fargo on April 21. Investment bank TD Cowen said in late March that Microsoft also canceled data center projects expected to use 2 gigawatts of electricity in the US and Europe due to oversupply.
Both companies stressed that they are honoring their existing agreements and described the move as routine capacity management. Microsoft still plans to spend $80 billion in its fiscal year ending in June. Last week, Amazon’s vice president of global data centers, Kevin Miller, wrote on LinkedIn that “there have been no major changes to our recent expansion plans.” Nvidia representatives also said that the data center market remains strong.
But while AI development continues, the pace may be changing, according to Quartz. Recent slowdowns suggest that the AI boom may not be as rapid and continuous as Amazon and Microsoft once predicted.

Taking on too much
A report from Swiss bank UBS last week suggested that a sudden drop in demand is unlikely to be the cause. Instead, Microsoft’s move may stem from its “over-commitment” during the initial AI craze.
In 2022-2024, Microsoft leased as much data center capacity as it could. Microsoft’s leased capital expenditures increased 6.7 times in about two years, with the total value of the leases reaching about $175 billion. Now that it has a better understanding of how AI is used and its energy needs, Microsoft is starting to cut back on early-stage projects that don’t provide immediate value.
Cost pressures are mounting in the AI ecosystem. A query for OpenAI’s most advanced models can cost as much as $1,000 in computing power. OpenAI CEO Sam Altman said in January that despite charging $200 a month for premium access to ChatGPT, the service is not yet profitable.
Microsoft CEO Satya Nadella recently admitted that AI has yet to create much measurable value. His comments reflect skepticism about whether generative AI can deliver sustainable returns or whether infrastructure spending is outpacing real needs.
Huge electricity consumption
As CNBC points out, data centers require large amounts of electricity for computing and cooling. “New data centers are getting so big that the grid can’t keep up,” said Allan Schurr, chief commercial officer at microgrid developer Enchanted Rock. Three years ago, a large data center had a capacity of 60 megawatts, enough to power 20,000 homes. Now, new centers built to support AI workloads require 500 megawatts or more.
This poses a significant challenge for utilities, which must ensure that all customers have enough power, even during peak demand periods. “This is why some utilities impose long connection times for data centers. They need to invest in new substations and may also need to increase transmission and generation capacity, all of which takes time,” Schurr explained.
According to Datacenters.com, data centers use 3% of the world’s electricity. Many areas face grid constraints that make it difficult to build new data centers. Large-scale facilities also face increasing resistance from local communities because they require more power, land, and water.
Research from Georgetown University, Epoch AI, and the RAND Corporation suggests that if current trends continue, each top AI data center could cost $200 billion, house two million AI chips, and require the equivalent of nine nuclear reactors of electricity by 2030.
Impact of tariffs
US President Donald Trump’s tariff changes are expected to create new cost pressures across the AI and data center supply chain.
“This change will increase hardware costs, impact sourcing strategies and force businesses to reconsider long-term purchasing models,” said John Archer, head of supply chain transformation at US consulting firm Slalom Consulting.
In the short term, cloud and AI providers will need to find ways to reduce costs, such as renegotiating supplier contracts and optimizing inventory. “In the long term, it may be possible to increase geographic diversification, co-manufacturing in tariff-friendly locations, and integrate deeper AI supply chain analytics to adapt to changing trade policies,” Archer said.
One thing that hasn’t changed is that computing is expensive, and AI software and hardware are demanding ever more computing power, said Suresh Venkatesan, CEO of POET Technologies, a Canadian company that develops energy solutions for data centers. “While one data center project is stuck, others may emerge, as there is no sign of connectivity demand slowing down,” he said.
Pankaj Sachdeva, a senior partner at consulting firm McKinsey & Company, said it is important not to mistake the recent pauses by a few major tech companies for a broader market slowdown. Setting aside the impact of tariffs, the data center market is expected to grow 20-25% over the next five to seven years, though the growth rate will vary from year to year. “It’s not going to be linear,” Sachdeva said.
Source: vnexpress.net
