The gigantic and as yet unimagined energy consumption of artificial intelligence

The rapid development of artificial intelligence in recent months would not have been possible without the regular integration of new data. This process is necessary for proper functioning and enables ever finer and more relevant analyses by artificial intelligence. These operations, however, are anything but energy neutral.

In a recent study published in the journal Joule (and relayed by Géo), doctoral candidate Alex de Vries of the Vrije Universiteit Amsterdam attempted to estimate the electricity consumption of the servers that run artificial intelligence.

Previous research on models such as ChatGPT, cited by the author, has already pointed to the significant energy consumption of these systems. BLOOM, a multilingual text-generation model and one of the examples cited by the researcher, “used approximately 433 megawatt-hours (MWh) during its training phase, equivalent to the annual energy consumption of 40 average American homes,” Géo reports.
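The comparison above can be checked with back-of-the-envelope arithmetic: dividing the reported training energy by the number of homes gives the implied per-home annual consumption, which should match the commonly cited US household average of roughly 10,000 to 11,000 kWh per year. A minimal sketch:

```python
# Sanity check of the comparison reported above: 433 MWh of training
# energy, said to equal the annual consumption of 40 average US homes,
# implies a per-home figure we can compute directly.

training_energy_mwh = 433  # BLOOM training energy, from the article
homes = 40                 # number of average American homes cited

per_home_mwh = training_energy_mwh / homes
print(f"Implied annual consumption per home: {per_home_mwh:.1f} MWh")
# → about 10.8 MWh per home per year, consistent with the US average
```

The result falls squarely in the usual range for an average American household, so the comparison is internally consistent.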

Given the increasingly widespread use of artificial intelligence, Alex de Vries expects consumption to reach between 85 and 134 TWh per year by 2027. To better illustrate the risk, the author chose a particularly energy-intensive scenario using Google as an example. If the search engine were, starting tomorrow, to process its billions of daily search queries with artificial intelligence, this alone would require 29.2 terawatt-hours (TWh) of electricity per year. The figure may seem abstract, but it roughly corresponds to Ireland's annual electricity consumption.
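The article gives the yearly total for this scenario but not the per-query figure, so we can work backwards to see what energy cost per search it implies. The daily query volume below is an assumption for illustration (the article only says “billions”; about 9 billion searches per day is a commonly quoted order of magnitude, not a number from the study as reported here):

```python
# Rough sanity check of the Google scenario above, working backwards
# from the reported yearly total to an implied per-query energy cost.

total_twh_per_year = 29.2  # from the article
queries_per_day = 9e9      # assumption: order-of-magnitude estimate only

wh_per_year = total_twh_per_year * 1e12  # 1 TWh = 1e12 Wh
wh_per_query = wh_per_year / (queries_per_day * 365)
print(f"Implied energy per AI-assisted query: {wh_per_query:.1f} Wh")
# → roughly 9 Wh per query under these assumptions
```

Under these assumptions, each AI-assisted query would cost on the order of ten times the energy of a conventional web search, which is what makes the aggregate figure so large.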

Currently, Google is only in the testing phase of integrating AI into its search engine, and the doctoral candidate acknowledges that his hypothesis is “extreme” and “unlikely in the short term.” Even so, he sounds the alarm: “This potential growth shows that we have to be very careful about how we use AI. It is energy intensive, and we shouldn't be using it for all sorts of things we don't really need.”