In an era of rapid digitalization, artificial intelligence (AI) is becoming a foundation of technological innovation and global economic growth. From large language models such as ChatGPT, Claude, and Gemini to real-time AI applications in healthcare, transportation, and smart manufacturing, AI plays an increasingly essential role in daily life and production. Behind that development is surging demand for computing infrastructure, centered on data centers built for AI. However, the rapid expansion of AI data centers also poses serious energy challenges: forecasts suggest that by 2030 their electricity consumption may be four times higher than today.
Data centers worldwide currently consume about 460 terawatt-hours (TWh) of electricity per year, equivalent to roughly 1.5–2% of total global electricity consumption. AI-dedicated data centers account for an increasingly significant share of this, because of the enormous computational loads they handle. Deep learning models, especially large language models, require millions of GPU-hours of training and consume gigawatt-hours (GWh) of electricity. The deployment of complex AI systems is no longer confined to large technology hubs; it is spreading to factories, manufacturing facilities, and smart infrastructure networks, driving energy consumption that is difficult to control.
According to forecasts from the International Energy Agency (IEA) and many independent research institutes, by 2030 the electricity consumption of global data centers could exceed 2,000 TWh per year, about 4% of total global electricity use, with AI data centers identified as a major contributor to this growth. Part of the reason is that AI models are growing at an accelerating pace: model sizes increase exponentially, from GPT-2 with only a few hundred million parameters to GPT-3 with 175 billion, and on to newer models that far exceed this threshold. Each full retraining of such a model can emit hundreds of tonnes of CO₂ if the electricity comes from fossil fuels.
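The chain from GPU-hours to energy to emissions can be made concrete with a back-of-envelope calculation. All figures in the sketch below (GPU-hours per run, per-GPU power, PUE, grid carbon intensity) are illustrative assumptions, not measurements from any specific model:

```python
# Back-of-envelope estimate of training energy and emissions.
# Every figure below is an illustrative assumption, not a measured value.

GPU_HOURS = 1_000_000        # assumed total GPU-hours for one training run
GPU_POWER_KW = 0.4           # assumed average draw per GPU (400 W)
PUE = 1.2                    # assumed power usage effectiveness (cooling overhead)
GRID_CO2_KG_PER_KWH = 0.5    # assumed carbon intensity of a fossil-heavy grid

energy_kwh = GPU_HOURS * GPU_POWER_KW * PUE           # facility-level energy
energy_gwh = energy_kwh / 1e6
co2_tonnes = energy_kwh * GRID_CO2_KG_PER_KWH / 1000  # kg -> tonnes

print(f"Energy: {energy_gwh:.2f} GWh")    # → Energy: 0.48 GWh
print(f"CO2:    {co2_tonnes:.0f} tonnes") # → CO2:    240 tonnes
```

Even with these conservative inputs, a single large training run lands in the GWh range and, on a fossil-heavy grid, in the hundreds of tonnes of CO₂, consistent with the orders of magnitude cited above.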

The spread of real-time AI at the edge further complicates the energy picture. Applications such as smart surveillance cameras, autonomous vehicles, industrial robots, and AI-powered IoT devices all require fast, on-site computing. The result is a distributed system of millions of small nodes operating simultaneously, adding to total electricity consumption on a global scale.
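How millions of small nodes add up can be shown with one multiplication. The node count and per-node draw below are hypothetical round numbers chosen only to illustrate the scale:

```python
# Illustrative aggregate draw of distributed edge-AI nodes.
# Node count and per-node power are assumptions for the sketch.

NODES = 10_000_000       # assumed number of always-on edge AI devices
AVG_POWER_W = 15         # assumed average continuous draw per node
HOURS_PER_YEAR = 8760

total_twh = NODES * AVG_POWER_W * HOURS_PER_YEAR / 1e12  # W·h -> TWh
print(f"{total_twh:.2f} TWh/year")  # → 1.31 TWh/year
```

Ten million devices drawing only 15 W each already consume over a TWh per year, on top of the data centers themselves.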
The impact of this surge in electricity demand is not limited to statistics. National power systems will come under greater pressure to ensure a stable supply for industry, households, and data centers alike. Regions such as Texas (USA), Frankfurt (Germany), and Guangdong (China) have already recorded local power shortages when large data centers came online at the same time. Moreover, if the majority of electricity still comes from fossil fuels, the AI boom could derail global commitments to reduce carbon emissions, undermining climate goals.
From an economic perspective, electricity accounts for a significant share of a data center's total operating costs. As demand rises rapidly, electricity prices can be pushed up, affecting not only the technology industry but also putting pressure on residents and other sectors. This raises hard questions about fair access to energy resources and about the direction of technology infrastructure development.
In this context, solutions to reduce power consumption are being strongly promoted. One important direction is developing more energy-efficient AI hardware, such as specialized TPUs and ASICs, or GPU lines that deliver more processing per watt. In addition, new data centers are being designed with advanced cooling systems such as liquid cooling and immersion cooling, or are sited in naturally cold climates to save cooling power.
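Why performance per watt, rather than raw power draw, is the figure that matters can be sketched by comparing the energy two hypothetical accelerators need to finish the same fixed workload. Both chips and all their numbers below are invented for illustration:

```python
# Sketch: performance-per-watt decides job energy, not peak wattage.
# Both accelerators and their specs are hypothetical.

def job_energy_kwh(tflops: float, watts: float, workload_tflop: float) -> float:
    """Energy (kWh) to finish a fixed workload on one accelerator."""
    seconds = workload_tflop / tflops   # runtime for the whole job
    return watts * seconds / 3.6e6      # W·s -> kWh

WORKLOAD = 1e9  # assumed job size, in TFLOP

for name, tflops, watts in [("older GPU", 100, 400), ("newer ASIC", 200, 350)]:
    eff = tflops / watts
    kwh = job_energy_kwh(tflops, watts, WORKLOAD)
    print(f"{name}: {eff:.2f} TFLOPS/W, {kwh:.0f} kWh for the job")
```

The "newer ASIC" draws nearly as much power, but because it finishes the job in half the time it uses well under half the energy, which is exactly the per-watt improvement the hardware direction above targets.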
Many tech companies are also moving their data centers to areas with abundant renewable energy, such as Northern Europe and Canada, or using solar power in sunny regions such as the Middle East. Some are even developing AI to save electricity for the AI infrastructure itself, automatically monitoring systems, distributing processing loads, and optimizing computing schedules for maximum energy efficiency.
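One simple form of such schedule optimization is shifting deferrable batch jobs (like training runs) into the hours when the grid is cleanest. A minimal greedy sketch, with an invented hourly carbon-intensity profile (solar dip at midday):

```python
# Minimal carbon-aware scheduling sketch: place deferrable AI jobs into the
# hours with the lowest grid carbon intensity. The hourly profile is invented.

def schedule_jobs(hours_needed: int, intensity_by_hour: list[float]) -> list[int]:
    """Return the chosen hour indices, greedily picking the cleanest hours."""
    ranked = sorted(range(len(intensity_by_hour)),
                    key=lambda h: intensity_by_hour[h])
    return sorted(ranked[:hours_needed])

# Hypothetical gCO2/kWh for each hour of one day (midday solar lowers it)
profile = [520, 510, 500, 495, 480, 460, 430, 380,
           320, 260, 210, 180, 170, 175, 200, 250,
           330, 400, 460, 500, 520, 530, 530, 525]

print(schedule_jobs(4, profile))  # → [11, 12, 13, 14]
```

Real schedulers also weigh deadlines, electricity prices, and hardware availability, but the core idea is the same: the same computation done at a different hour can carry a much smaller carbon footprint.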

In Vietnam, the issue is also becoming urgent as technology companies begin investing in large-scale AI data centers. A typical example is the Digital HUB Data Center (DCH) project in Ba Ria – Vung Tau, expected to start construction at the end of 2025. Without parallel planning for power infrastructure and renewable-energy policy, Vietnam may face significant operational, cost, and environmental risks as it enters the global AI race.
Clearly, AI data centers are a symbol of technological progress, but they are also “power-hungry giants” that need to be kept tightly in check. The energy problem is no longer a side issue; it has become a central factor in national AI strategies. Synchronizing technology investment, power planning, renewable-energy development, and supporting policies is a prerequisite for sustainable AI development that does not put excessive pressure on the global energy ecosystem.
