Sustainable AI, a utopia?

While we ask artificial intelligence (AI) to help us solve climate change, its own carbon footprint is skyrocketing. And while it helps us design drugs, optimize power grids, and predict natural disasters, this technology comes at a hidden and exorbitant cost.

The problem is its energy appetite. Training a model like GPT-3, now outdated, required around 1,300 MWh, the equivalent of the annual energy consumption of more than 120 homes. And that’s just the training: its daily use is even more demanding. It’s estimated that queries to ChatGPT can require ten times more energy than a simple Google search, amounting to 1,000 MWh every day worldwide.
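A quick back-of-the-envelope check makes those figures concrete. The sketch below uses the estimates quoted above plus an assumed average household consumption of roughly 10.7 MWh per year (an illustrative value; the exact figure varies by country):

```python
# Back-of-the-envelope check of the figures above (inputs are the quoted
# estimates or assumptions, not measured data).

TRAINING_ENERGY_MWH = 1_300        # quoted estimate for training GPT-3
HOUSEHOLD_MWH_PER_YEAR = 10.7      # assumed average annual household consumption

homes_equivalent = TRAINING_ENERGY_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"Training ~ annual use of {homes_equivalent:.0f} homes")   # ~121 homes

DAILY_QUERY_ENERGY_MWH = 1_000     # quoted estimate for worldwide queries per day
days_to_equal_training = TRAINING_ENERGY_MWH / DAILY_QUERY_ENERGY_MWH
print(f"Daily use equals the whole training run every {days_to_equal_training:.1f} days")
```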

This energy consumption is so colossal that tech giants are taking drastic measures. Microsoft, Alphabet (Google), and Amazon have signed agreements to purchase power from nuclear plants, securing the flow of watts for their data centers. AI is energy-hungry, and this is just the beginning.

A traffic jam in modern computing

Why does AI consume so much power? The answer lies in the architecture of our computers, designed decades ago. The problem is known as the von Neumann bottleneck: the constant flow of data between memory (where data waits) and the processor (where it is processed). It’s as if a cook could only grab one ingredient from the refrigerator at a time: they would spend more time going back and forth than actually cooking. This bottleneck generates latency and, above all, immense heat from energy dissipation.
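The imbalance can be made tangible with a rough sketch. The per-operation energies below are assumed, order-of-magnitude figures (real values depend on the chip and the type of memory), but the conclusion survives the imprecision:

```python
# Illustrative sketch of the von Neumann bottleneck: data movement, not arithmetic,
# dominates the energy bill. The per-operation energies are assumed figures.

PJ_PER_OPERATION = 1.0        # energy of one arithmetic operation on-chip (picojoules)
PJ_PER_DRAM_WORD = 1_000.0    # energy of fetching one word from off-chip memory (picojoules)

def total_energy_pj(operations: int, words_fetched: int) -> float:
    """Energy of a workload: the computing itself plus the trips to memory."""
    return operations * PJ_PER_OPERATION + words_fetched * PJ_PER_DRAM_WORD

# A typical neural-network layer: one multiply-accumulate per weight fetched from memory.
ops = words = 10**9
energy = total_energy_pj(ops, words)
print(f"Share of energy spent on actual computing: {ops * PJ_PER_OPERATION / energy:.1%}")
# -> about 0.1%; the other 99.9% is the cook walking back and forth to the refrigerator.
```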

For decades, Moore’s Law saved us: technology was able to double the number of transistors on a chip every two years, bringing the components—the ingredients in that kitchen—closer together. But we’re reaching the physical limit. Stacking more and more components on a 3D chip reduces the surface area available for cooling.

Electronic computing is “cooking” in its own success. The contrast with the human brain is overwhelming: with an energy consumption equivalent to that of a small light bulb, it can outperform powerful computers that waste enormous amounts of power.

More water and more waste

And it’s not just energy. Cooling these data centers consumes millions of liters of water. An average data center can use 9 liters of clean water for every kWh of energy.
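Combining the figures quoted in this article gives a sense of scale (an illustration only: actual water use varies enormously between facilities and cooling methods):

```python
# Water footprint implied by the article's own numbers (illustrative, not measured).
LITERS_PER_KWH = 9
DAILY_QUERY_ENERGY_KWH = 1_000 * 1_000   # the ~1,000 MWh per day mentioned above, in kWh

daily_water_liters = LITERS_PER_KWH * DAILY_QUERY_ENERGY_KWH
print(f"{daily_water_liters:,} liters of water per day")   # 9,000,000 liters
```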

Added to this is the hardware lifecycle: the rapid obsolescence of chips and servers generates a growing mountain of electronic waste. To make matters worse, their manufacture depends on the extraction of scarce minerals, often under conditions of questionable respect for human rights.

Between the end and the means

So, is AI the problem or the solution? Therein lies the great paradox. Despite its environmental impact, AI is also a crucial tool for sustainability.

The same technology that voraciously consumes energy is what we use to optimize electrical grids, manage the intermittency of renewable energy sources, and create “precision agriculture” that drastically reduces water and fertilizer use. Furthermore, it allows us to predict natural disasters further in advance and design sustainable medicines and new materials, while also optimizing transportation routes to reduce emissions.

AI can facilitate 134 of the 169 targets of the UN Sustainable Development Goals, although it can hinder 59. The question is not whether to use it or not, but how we can make it sustainable.

The “green” future: photons and nanotechnology

Although progress is being made in “lighter” models (through techniques like “pruning” or “distillation”), the solution won’t come solely from optimizing the software. The real revolution must happen in the hardware.
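As an illustration of those “lightening” techniques, the sketch below uses PyTorch; the toy model and the 30% pruning ratio are arbitrary choices for demonstration, not a recipe:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy model standing in for a much larger network.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Pruning: zero out the 30% smallest-magnitude weights of the first layer.
prune.l1_unstructured(model[0], name="weight", amount=0.3)
sparsity = (model[0].weight == 0).float().mean().item()
print(f"First-layer sparsity after pruning: {sparsity:.0%}")

# Distillation, in outline: a small "student" network is trained to reproduce the
# softened outputs of a large "teacher" (for example via a KL-divergence loss between
# their logits), keeping most of the accuracy at a fraction of the size and energy cost.
```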

This is where nanotechnology and new computing paradigms come into play, such as in-memory computing, which seeks to overcome the von Neumann bottleneck by designing chips that combine processing and memory in the same device; “memristors” are an example of this technology.
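A very simplified picture of what a memristor crossbar does, sketched here in NumPy (the conductance and voltage values are invented for illustration; a real device also has noise, drift, and limited precision):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# The "weights" live physically in the chip as memristor conductances G (in siemens):
# each coefficient is a device sitting at a row-column crossing, so no weight ever
# travels between a separate memory and a processor.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductance matrix (the stored weights)
V = rng.uniform(0.0, 0.5, size=3)          # input vector, applied as voltages

# Ohm's law gives the current through each device (I = G * V), and Kirchhoff's current
# law sums those currents along each output line: the circuit performs a whole
# matrix-vector product in a single analog step, with no back-and-forth to memory.
I = G @ V
print("Output currents (amperes):", I)
```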

An even more radical idea is the photonic revolution, which proposes abandoning electrons and using photons (particles of light) instead. Since photons have no mass and do not generate heat through resistance the way electrons do, a photonic processor could be thousands of times more efficient.

Finally, analog computing is being explored: unlike digital chips (which operate with 0s and 1s), these systems draw on the physics of natural systems to process information more fluidly, much as our brain does.

Challenges to overcome

The path to sustainable AI is not just a technical challenge: it’s a matter of governance. Initiatives like the National Green Algorithms Program in Spain are a first step. We need a holistic vision that combines technological innovation, proactive regulation, and a deep social and political awareness.

We are talking about a technology that has the potential to transform our world, but only if it does so without consuming the planet in the process.

Author Bio: Cefe López Fernández is a Research Professor (photonic materials) at the Spanish National Research Council (CSIC).