MONTREAL: If you care about the environment, think twice about using AI. Generative artificial intelligence uses 30 times more energy than a traditional search engine, warns researcher Sasha Luccioni, on a mission to raise awareness about the environmental impact of the hot new technology.
Recognised as one of the 100 most influential people in the world of AI by the American magazine Time in 2024, the Canadian computer scientist of Russian origin has sought for several years to quantify the emissions of programmes like ChatGPT or Midjourney.
“I find it particularly disappointing that generative AI is used to search the Internet,” laments the researcher, who spoke on the sidelines of the ALL IN artificial intelligence conference, in Montreal.
The language models on which the programmes are based require enormous computing capacities to train on billions of data points, necessitating powerful servers.
Then there’s the energy used to respond to each individual user’s requests. Instead of simply extracting information, “like a search engine would do to find the capital of a country, for example,” AI programmes “generate new information,” making the whole thing “much more energy-intensive,” she explains.
According to the International Energy Agency, the AI and cryptocurrency sectors combined consumed nearly 460 terawatt hours of electricity in 2022 — two per cent of total global production.
Energy efficiency
A leading researcher on the impact of AI on climate, Luccioni in 2020 helped create a tool that lets developers quantify the carbon footprint of running a piece of code. “CodeCarbon” has since been downloaded more than a million times.
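CodeCarbon is distributed as a Python package. As a rough illustration only, a minimal sketch of the kind of instrumentation it provides, based on its documented EmissionsTracker interface, might look like this (the matrix workload is an arbitrary placeholder, not anything from Luccioni’s studies):

```python
# Minimal sketch of CodeCarbon-style instrumentation (assumption: the
# documented EmissionsTracker start()/stop() interface); the workload
# below is a placeholder standing in for model training or inference.
import numpy as np
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
try:
    # Placeholder compute-heavy task.
    a = np.random.rand(2000, 2000)
    b = np.random.rand(2000, 2000)
    _ = a @ b
finally:
    # stop() returns the estimated emissions in kg of CO2-equivalent.
    emissions_kg = tracker.stop()

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```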
Head of climate strategy at the startup Hugging Face, a platform for sharing open-access AI models, she is now working on a certification system for algorithms.
Similar to the programme from the US Environmental Protection Agency that awards scores based on the energy consumption of electronic devices and appliances, it would make it possible to know an AI product’s energy consumption and so encourage users and developers to “make better decisions.” “We don’t take into account water or rare materials,” she acknowledges, “but at least we know that for a specific task, we can measure energy efficiency and say that this model has an A+, and that model has a D.”
Transparency
To develop her tool, Luccioni is testing it on generative AI models that are accessible to everyone, or open source, but she would also like to apply it to the commercial models of Google or ChatGPT-creator OpenAI, which have been reluctant to agree.
Although Microsoft and Google have committed to achieving carbon neutrality by the end of the decade, the US tech giants saw their greenhouse gas emissions soar in 2023 because of AI: up 48pc for Google compared to 2019 and 29pc for Microsoft compared to 2020.
“We are accelerating the climate crisis,” says Luccioni, calling for more transparency from tech companies. The solution, she says, could come from governments that, for the moment, are “flying blindly,” without knowing what is “in the data sets or how the algorithms are trained.” “Once we have transparency, we can start legislating.”
‘Energy sobriety’
It is also necessary to “explain to people what generative AI can and cannot do, and at what cost,” according to Luccioni. In her latest study, the researcher demonstrated that producing a high-definition image using artificial intelligence consumes as much energy as fully recharging the battery of your cell phone.
At a time when more and more companies want to integrate the technology further into our lives — with conversational bots and connected devices, or in online searches — Luccioni advocates “energy sobriety.” The idea here is not to oppose AI, she emphasizes, but rather to choose the right tools — and use them judiciously.
Published in Dawn, September 16th, 2024