AI singled out for its astronomical environmental cost
If artificial intelligence is progressing at high speed, its energy impact is growing at an exponential rate. ChatGPT launched last November, shaking up the industry and sending tech companies rushing to develop their own AI software. The problem: AI already looks like a bottomless energy sink.
Digital technology already has a major environmental impact (device production, electricity, data storage), but artificial intelligence is even more energy-intensive: training a single AI model can consume the equivalent of what 100 American households use in a year, according to Bloomberg estimates. And each model must be retrained regularly to stay current with new information.
Not to mention the data storage needs: each new piece of information "learned" by the AI is recorded in the cloud, physically stored on thousands of microchips in huge servers that must be cooled.
The better AI systems get, the more they consume. Researchers estimate that developing an AI's algorithm accounts for only 40% of the tool's real consumption: most of the ecological impact comes downstream, from regular retraining and the logging of user queries.
Artificial intelligence is built on the idea of continuous learning, and over time the amount of data the algorithms can analyze keeps expanding. For example, an earlier version of ChatGPT's underlying model used 1.5 billion parameters. GPT-3, the latest version, uses some 175 billion. And OpenAI, the company behind the software, is already working on a fourth installment.
The case of Google is also telling: researchers have estimated that AI accounts for 10 to 15% of the company's energy needs, or 2.3 terawatt-hours in 2021. That is roughly the annual consumption of all the homes in the city of Atlanta.
But these figures are only estimates. That is where the second problem comes in: tech companies are particularly tight-lipped on the subject. Developing an AI sells well to investors and the general public alike; polluting massively does not. The sector's acceleration is so recent that there is currently no overall estimate of its energy impact.
Carbon neutrality objectives
“We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to best use our computing power,” OpenAI explained in a recent press release. The problem is mounting, and companies will have to meet environmental expectations sooner or later. Several tech companies, including Google, have already committed to carbon neutrality by 2030.
This situation resembles the one the cryptocurrency sector went through: mining new crypto units, bitcoins in particular, is extremely energy-intensive. In response, China banned mining outright on its territory, while the United States wants to set up a permit system to regulate the sector.
In both sectors, solutions exist. AI companies could, for example, commit to powering their computers and data centers with carbon-free energy. Or run training workloads overnight, during off-peak hours, when energy is more plentiful and also cheaper.