How AI is draining the world of its energy sources

AI’s insatiable thirst for resources forces some hard decisions

The technology powering our current industrial revolution needs so much energy and water that data centres are now competing with cities, with the environment and net-zero targets losing out

The promise of Generative Artificial Intelligence (GenAI) is so revolutionary, and the dangers of its misuse so catastrophic, that the environmental impact of its infrastructure development has largely been overlooked. That is quickly changing, however, because the technology's growth has highlighted its fast-increasing thirst for power—a concern that now weighs on some of the biggest investment decisions.

It is well-known that AI relies on advanced microchips and semiconductors, which are only produced in a handful of places, but a bigger issue is the vast amounts of electricity it needs, specifically for the continuous operation of AI-driven data centres.

A reliable and sustainable supply of electricity is therefore critical, whether from renewable or traditional sources such as oil and gas. So hungry are these new data centres that they now compete with cities for resources, including water and energy. This demand will only increase as AI integrates into a range of industries and products.

The strain on energy supplies is now serious enough to warrant regulatory action from lawmakers worldwide, not least to promote climate accountability, given the other major new high-tech drain on resources: cryptocurrency mining.

Insatiable appetite

In January, the International Energy Agency (IEA) published its forecast for global energy consumption over the next two years. It made for stark reading. The IEA estimated that data centres, cryptocurrencies, and AI together accounted for around 2% of global electricity demand in 2022, and that this consumption could double by 2026.

For context, that doubled demand would be roughly equivalent to Japan's total electricity consumption. To put it mildly, this does not bode well.

The proliferation of AI and the demand for the cloud computing platforms it runs on are increasing the already vast amount of information to be stored and transferred. The combined effect is a new type of risk. Research from investment bank Goldman Sachs estimates that data centre energy demand will grow by 160% by 2030, doubling carbon dioxide emissions, exacerbating the climate crisis, and undermining progress towards net-zero goals.

Trampling the planet

If AI tools were used in 2030 as intensively as Google search is used today, energy demand in the United States alone would grow by 7% a year, up from 0.2% a year between 2010 and 2022, according to a scenario prepared by the research firm Bernstein and reported by The Economist.

In their bid to win the AI race, the biggest tech companies, such as Google, Microsoft, and Amazon, recognise that they are largely responsible for creating this energy crisis and its environmental impact.

Google's 2024 environmental report showed an almost 50% increase in its carbon emissions compared with 2019, due to higher energy consumption in its data centres and supply chains, driven by the rapid advance of, and demand for, AI. Yet the firm claims its facilities are almost twice as energy-efficient as a typical data centre.

Google is not alone. Last May, Microsoft said its carbon emissions had increased by about 30% since 2020, again driven by data centres. These increases represent a major setback for the tech giants' pledges to reach net-zero emissions by 2030.

Machine learning

One of the fastest-growing areas of energy demand is the machine learning process that underpins GenAI. This process requires huge amounts of energy both to train its algorithms and to generate answers.

Training a large language model (LLM) like OpenAI's GPT-3 consumed about 1,300 megawatt-hours of electricity, enough to power 130 US homes for a year, and produced nearly 500 tonnes of carbon dioxide.

As technology grows smarter, it grows thirstier. The IEA says a typical search engine query on Google uses 0.3 watt-hours of electricity, while a single query with ChatGPT uses about 2.9 watt-hours of electricity—around ten times more.

Other research suggests that moving to GenAI means consuming up to 40 times more energy for the same task. The IEA adds that if an LLM were integrated into the nine billion searches made daily, electricity demand would increase by ten terawatt-hours per year—roughly the energy consumption of 1.5 million people in Europe.
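
As a rough back-of-the-envelope check, using only the per-query figures quoted above (0.3 watt-hours for a conventional search, 2.9 watt-hours for an LLM query) and the nine billion daily searches, the arithmetic lands in the same ballpark. The short Python sketch below is purely illustrative and is not the IEA's own model:

    # Illustrative check of the IEA comparison quoted above; inputs are the figures cited in the text
    search_wh = 0.3          # watt-hours per conventional Google search
    llm_wh = 2.9             # watt-hours per ChatGPT-style query
    queries_per_day = 9e9    # searches made daily

    extra_wh_per_day = queries_per_day * (llm_wh - search_wh)
    extra_twh_per_year = extra_wh_per_day * 365 / 1e12
    print(f"Additional demand: ~{extra_twh_per_year:.1f} TWh per year")
    # Prints roughly 8.5 TWh, in line with the IEA's figure of around ten terawatt-hours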

Data centre growth

Since 2020, energy-efficiency gains in data centres appear to have slowed, pushing up the total energy they consume. Some AI innovations may improve computing efficiency faster than electricity consumption grows, but doubts remain.

There are around 11,000 data centres worldwide, with many more under construction, so energy providers are having to invest to meet demand. US utilities will need to invest around $50bn in new generation capacity to support data centres alone.

Northern Virginia is the world's leading data centre location, with 51 million square feet of data centre space. The electricity these facilities need could power 800,000 homes, putting significant strain on the power grid and energy infrastructure. Europe, home to about 15% of the world's data centres, fares a little better, though its power demand is estimated to grow by up to 50% by 2033, driven in large part by data centre expansion.

By 2030, the energy demand of these data centres is expected to be equivalent to the combined energy consumption of Portugal, Greece, and the Netherlands. According to estimates by the research group Dgtl Infra, global capital expenditure on data centres will top $225bn this year. Nvidia CEO Jensen Huang thinks $1tn worth of data centres will need to be built to support GenAI.

Feeding the beast

Europe, with the oldest electricity grids in the world, needs $861bn in transmission and distribution investment over the next decade to supply power to new data centres. Goldman Sachs says a similar amount is needed for investments in renewable energy. The bank's analysts believe that as energy demand rises to support data centres driven by AI, so too will carbon dioxide emissions, which countries have been trying to reduce to avert the threat of global warming. These emissions could impose a "social cost" of $125-140bn on countries.

Water consumption is another serious problem. Data centre operations generate enormous heat, so clean, fresh water is needed to cool them. This can be a big ask in drought-prone areas, especially if it taps into a population's drinking water. Google and Microsoft alone are estimated to have used 32 billion litres of water in their data centres in 2022. In its 2024 environmental report, Google acknowledged that water use in its data centres had increased by 17% in 2023.

Foot on the gas

The tech sector's thirst for power is so great that renewable energy alone is unlikely to meet the extra demand. Renewables take time to develop and connect to national grids, their cost can make them unfeasible, and their dependence on the weather can leave supply intermittent and subject to disruption.

Given the costs and uncertainty, investors and developers are believed to favour hydrocarbons, notably gas. According to Wells Fargo, US natural gas demand is expected to increase by about 10 billion cubic feet per day by 2030.

This represents a 28% increase over the current 35 billion cubic feet per day used to generate electricity in the US and will require the construction of new pipelines. Goldman's analysis expects natural gas to supply 60% of the power demand growth from AI in the US, with renewables providing the remaining 40%.
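
The percentage above follows directly from the quoted figures; a minimal sketch, assuming only the numbers in the text:

    # Check of the increase quoted above, using the figures in the text
    current_bcfd = 35.0   # billion cubic feet per day currently used for US power generation
    added_bcfd = 10.0     # projected additional daily demand by 2030 (Wells Fargo figure)
    print(f"Increase: {added_bcfd / current_bcfd:.1%}")   # 28.6%, consistent with the ~28% stated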

Global energy security

The Financial Times reported that producers believe an AI revolution "will usher in a golden era for natural gas" because it is the only cost-efficient energy source capable of providing the round-the-clock power the AI boom requires.

Policymakers hear the arguments. At the Houston Energy Conference in March, there were calls to end the gradual phase-out of oil and gas. This seemed to reflect an acknowledgement of AI's massive additional energy needs. With a growing world population and a global economy set to double by 2045, some predict a 23% rise in energy demand, equivalent to 120 million barrels of oil per day.

While AI's technological leaps are immense, so too are the implications of its energy and water needs and the environmental impact that it leaves behind. In an ideal world, technological advancement and environmental stewardship would go hand in hand, but with AI, it seems they do not.

Hard decisions lie ahead.
