Artificial intelligence is developing faster than renewable energy. So while individuals are asked to turn down the heating, the GAFAM companies are reviving nuclear power. The International Energy Agency is convening a global conference on artificial intelligence and energy on December 4 and 5, 2024.
The US nuclear power plant Three Mile Island, infamous as the site of one of the worst nuclear accidents in history in 1979, will soon be back in service to power Microsoft’s artificial intelligence (AI) systems. The announcement, made in September 2024, concerns a reactor separate from the one involved in the 1979 accident and is part of a broader trend of massive investment by the digital giants in nuclear energy.
Google has also announced an agreement with the start-up Kairos Power, which specializes in building small modular nuclear reactors (known as “SMRs”), to finance its development and reserve part of its production capacity by 2030. Amazon, following suit, has entered into a similar partnership with the start-up X-energy.
The reason for these investments is simple: the exponential development of generative AI requires significant computing capacity, housed in particularly energy-intensive data centers.
The latest studies show that AI accounts for between 10 and 20% of the electricity consumed by data centers worldwide, consumption that is growing by 20 to 40% each year according to the International Energy Agency (IEA). In some countries, such as Ireland, data centers already consume more electricity than households.
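To get a feel for what such growth rates imply, the short sketch below simply compounds the 20% and 40% annual growth figures quoted above over a few time horizons. The normalization of today’s consumption to 1 and the choice of horizons are assumptions made purely for illustration.

```python
# Back-of-envelope sketch: how fast data center electricity demand compounds
# at the 20-40% annual growth rates cited by the IEA.
# Today's consumption is normalized to 1.0; the time horizons are illustrative.

def projected_consumption(annual_growth: float, years: int) -> float:
    """Multiplier on today's consumption after `years` of compound growth."""
    return (1 + annual_growth) ** years

for growth in (0.20, 0.40):          # lower and upper bounds of the quoted range
    for horizon in (5, 10):          # illustrative horizons, in years
        factor = projected_consumption(growth, horizon)
        print(f"{growth:.0%} per year for {horizon} years -> "
              f"x{factor:.1f} today's consumption")
```

At the lower bound, consumption roughly doubles every four years; at the upper bound, it is multiplied by almost 30 in a decade.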
The scale of these figures raises questions, especially at a time when the climate emergency is on everyone’s mind and citizens are asked to limit their heating to 19 °C. Is this race for computing capacity really sustainable and desirable? Should we really seek, by any means necessary, to build new electricity generation capacity just to keep pace with the development of data centers?
The solutions to this crisis are not obvious, given the number of divergent interests and factors to consider. However, avenues for limiting the energy consumption of AI and curbing the explosion in the number of data centers, such as taxation or regulation, are beginning to emerge in international discussions.
Why does AI need so much energy?
Every time we ask our favorite generative AI system a question, the request is sent over the Internet to be processed in a data center that may be located anywhere in the world. The data center consumes electricity to power the computing hardware it houses and its cooling system, not to mention the energy required to build the center and to manufacture the electronic components themselves.
In recent years, the main AI models have become more complex and require ever-increasing computing power to operate, 4 to 5 times more each year since 2010 according to the most recent studies. At the same time, the number of users keeps growing, with more than 200 million weekly users of ChatGPT alone.
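As a rough illustration of what a 4-to-5-fold annual increase means when compounded, the sketch below applies that growth rate over the period since 2010. The 2024 end year is an assumption chosen for the example, and the output is an order-of-magnitude illustration rather than a measurement.

```python
# Rough illustration of compound growth in the computing power required by
# the main AI models, using the 4x-5x per year figure cited above.
# The 2010-2024 window is an assumption made for this illustration.

START_YEAR, END_YEAR = 2010, 2024
years = END_YEAR - START_YEAR

for annual_factor in (4, 5):
    total_factor = annual_factor ** years
    print(f"x{annual_factor} per year over {years} years -> "
          f"roughly {total_factor:.1e} times more compute than in {START_YEAR}")
```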
These trends explain why AI providers’ energy needs keep growing, why they are investing heavily in renewables to power their systems, and why they are planning to build new infrastructure around the world.
Why is the proliferation of data centers a problem for the planet?
The accelerating demand for computing capacity driven by the generative AI boom comes with significant negative effects on the environment.
First, generating the electricity consumed by data centers produces greenhouse gas emissions, in amounts that depend on the energy source used. These emissions already represent 1 to 3% of global emissions according to the IEA and are likely to grow as more centers are built.
Second, since data centers are particularly energy-intensive, they can affect the stability of the grid at the local level. In an electrical grid, the amount of electricity produced must at all times equal the amount consumed; otherwise, the grid risks a blackout. Adding highly power-hungry infrastructure in areas where the production-consumption balance is already fragile therefore increases the risk of blackouts, especially where the energy mix relies heavily on renewables, which are intermittent by nature.
Finally, the pace of AI development is completely outstripping the capacity to produce electricity from renewable sources such as photovoltaic panels or wind turbines. To meet their needs, the digital giants are therefore likely to turn to carbon-intensive energy sources such as coal or gas, which can be brought online more quickly. This is driving them dramatically off course from their carbon neutrality targets: Microsoft has reported a 29% increase in its emissions compared with 2020, and Google a 48% increase compared with 2019. At the same time, they are communicating intensively about their investments in renewable energy to draw attention away from their poor environmental performance.
What solutions are there to address the AI energy crisis?
The solution is not necessarily to ban the construction of new data centers, for three reasons.
First, the new data centers built by the digital giants are generally more efficient than older infrastructure. Second, building new centers also serves other purposes: they contribute to the economic development of the regions that host them (by creating jobs and activity locally) and to establishing sovereign computing capacity (in Europe, for example) that is less exposed to the potential fallout of international geopolitical disputes.
Third, unless there is a global moratorium on the construction of new infrastructure, banning projects locally will only lead to their relocation, potentially to countries where the energy mix is even more carbon-intensive, which is not desirable from an ecological point of view.
The urgent need for international thinking on data center regulation
Following the example of the European Energy Efficiency Directive and the European Code of Conduct for Data Centers, it is essential to ensure that every new project uses the best available technologies in terms of energy efficiency, is powered by low-carbon electricity, and does not see its consumption grow through the rebound effect. The more such standards are harmonized globally, the lower the risk of projects relocating to countries with more flexible, but potentially less environmentally virtuous, standards.
Regulating the number of data centers on a global scale could also be considered, via an international organization modeled on the International Telecommunication Union, which manages the allocation of radio frequencies.
Thought should also be given to the taxation of data center operators, to determine whether it could be used to promote the supply of green energy and the adoption of more sustainable practices, for example through tax deductions or a specific levy on the least virtuous operators. This option was raised in the French Senate’s 2020 fact-finding mission on the environmental footprint of digital technology, which led, in France only, to a reduced tax rate being made conditional on data centers meeting energy performance criteria.
Finally, it is also possible to act on how AI is used. Raising public awareness of AI’s environmental impact would help steer people towards more virtuous uses of the technology, for example by limiting recreational uses.
Debates on the environmental footprint of AI very often invoke the need to balance the negative externalities of its development, such as those discussed in this article, against the potential positive effects AI can bring in various sectors, whether economic (wealth creation) or environmental (cutting emissions by optimizing the energy efficiency of other activities).
While the argument is attractive and appears rational, hypothetical long-term benefits cannot justify the unchecked short-term development of AI, which causes irreversible damage to the environment and risks compromising our ability to leave a healthy environment to future generations.
Author Bio: Thomas Le Goff is a Lecturer in digital law and regulation at Télécom Paris – Institut Mines-Télécom