Generative AI’s environmental costs are soaring and mostly secret – Nature.com

Posted: February 20, 2024 at 6:55 pm

Last month, OpenAI chief executive Sam Altman finally admitted what researchers have been saying for years: that the artificial intelligence (AI) industry is heading for an energy crisis. It's an unusual admission. At the World Economic Forum's annual meeting in Davos, Switzerland, Altman warned that the next wave of generative AI systems will consume vastly more power than expected, and that energy systems will struggle to cope. "There's no way to get there without a breakthrough," he said.

I'm glad he said it. I've seen consistent downplaying and denial about the AI industry's environmental costs since I started publishing about them in 2018. Altman's admission has got researchers, regulators and industry titans talking about the environmental impact of generative AI.

So what energy breakthrough is Altman banking on? Not the design and deployment of more sustainable AI systems but nuclear fusion. He has skin in that game, too: in 2021, Altman started investing in fusion company Helion Energy in Everett, Washington.


Most experts agree that nuclear fusion won't contribute significantly to the crucial goal of decarbonizing by mid-century to combat the climate crisis. Helion's most optimistic estimate is that by 2029 it will produce enough energy to power 40,000 average US households; one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It's estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

And it's not just energy. Generative AI systems need enormous amounts of fresh water to cool their processors and generate electricity. In West Des Moines, Iowa, a giant data-centre cluster serves OpenAI's most advanced model, GPT-4. A lawsuit by local residents revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district's water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use: increases of 20% and 34%, respectively, in one year, according to the companies' environmental reports. One preprint1 suggests that, globally, the demand for water for AI could be half that of the United Kingdom by 2027. In another2, Facebook AI researchers called the environmental effects of the industry's pursuit of scale the "elephant in the room".

Rather than pipe-dream technologies, we need pragmatic actions to limit AI's ecological impacts now.

There's no reason this can't be done. The industry could prioritize using less energy, build more efficient models and rethink how it designs and uses data centres. As the BigScience project in France demonstrated with its BLOOM model3, it is possible to build a model of a similar size to OpenAI's GPT-3 with a much lower carbon footprint. But that's not what's happening in the industry at large.

It remains very hard to get accurate and complete data on environmental impacts. The full planetary costs of generative AI are closely guarded corporate secrets. Figures rely on lab-based studies by researchers such as Emma Strubell4 and Sasha Luccioni3; limited company reports; and data released by local governments. At present, there's little incentive for companies to change.


But at last, legislators are taking notice. On 1 February, US Democrats led by Senator Ed Markey of Massachusetts introduced the Artificial Intelligence Environmental Impacts Act of 2024. The bill directs the National Institute of Standards and Technology to collaborate with academia, industry and civil society to establish standards for assessing AI's environmental impact, and to create a voluntary reporting framework for AI developers and operators. Whether the legislation will pass remains uncertain.

Voluntary measures rarely produce a lasting culture of accountability and consistent adoption, because they rely on goodwill. Given the urgency, more needs to be done.

Truly addressing the environmental impacts of AI requires a multifaceted approach involving the AI industry, researchers and legislators. In industry, sustainable practices should be imperative, and should include measuring and publicly reporting energy and water use; prioritizing the development of energy-efficient hardware, algorithms and data centres; and using only renewable energy. Regular environmental audits by independent bodies would support transparency and adherence to standards.

Researchers could optimize neural network architectures for sustainability and collaborate with social and environmental scientists to guide technical designs towards greater ecological sustainability.

Finally, legislators should offer both carrots and sticks. At the outset, they could set benchmarks for energy and water use, incentivize the adoption of renewable energy and mandate comprehensive environmental reporting and impact assessments. The Artificial Intelligence Environmental Impacts Act is a start, but much more will be needed, and the clock is ticking.

K.C. is employed by both USC Annenberg and Microsoft Research, which makes generative AI systems.
