
What Is the Environmental Impact of AI?


Photo by Jason Blackeye on Unsplash

A small but important part of the AI Index Report for 2023¹ points to the growing concern about the energy consumption required for AI training.

Spoiler alert: it’s quite a lot.

There’s no standard benchmark for tracking the carbon intensity of AI systems, so the report focuses on research from a recent paper by Luccioni et al., 2022², which records the energy requirements of several large language models (LLMs) including ChatGPT.

The following table shows the energy requirements for training four different AI models and the CO2 emissions associated with them.

Image by author, data source: Luccioni et al., 2022²

The data includes several measurements, but the bottom line is the power consumption and CO2 emissions, which I have summarised in the charts below.

Power Consumption of 4 AI models — Image by author, data source: Luccioni et al., 2022²

There is quite a difference between the various models and, as you can see, OpenAI’s GPT-3 comes top with a consumption of over 1,200 megawatt-hours. That’s about as much electricity as 120 US homes consume in a year, according to consumption figures from the U.S. Energy Information Administration³. That certainly seems like a lot of energy.
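That 120-home figure is easy to check. A minimal sketch, assuming the EIA’s round estimate of about 10,600 kWh per average US home per year and the 1,287 MWh reported for GPT-3 by Luccioni et al.:

```python
# Rough check: GPT-3 training energy expressed as US household-years.
# Assumed figures: 1,287 MWh for GPT-3 training (Luccioni et al., 2022)
# and ~10,600 kWh (10.6 MWh) per average US home per year (EIA estimate).
gpt3_training_mwh = 1_287
home_mwh_per_year = 10.6

home_years = gpt3_training_mwh / home_mwh_per_year
print(round(home_years))  # ~121 home-years of electricity
```

So “about 120 homes for a year” follows directly from the two rounded inputs.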

The chart below illustrates the CO2 emissions, which follow a similar pattern.

CO2 emissions of 4 AI models — Image by author, data source: Luccioni et al., 2022²

Luccioni, the paper’s principal author, is a researcher at Hugging Face Inc. and the work is mainly concerned with BLOOM, her company’s alternative to ChatGPT. The figures for the other models are approximate and based on what public information is available (Bloomberg reports Luccioni saying that nothing is really known about ChatGPT and that it could just be “…three raccoons in a trench coat” — does that mean GPT-4 will be four raccoons?).

CO2 emissions for training ChatGPT are equivalent to around 500 flights from New York to San Francisco

The AI Index Report makes some comparisons with other energy-intensive activities and their CO2 emissions (see chart, below). It finds, for example, that the CO2 emissions generated in training ChatGPT are equivalent to one passenger taking a flight from New York to San Francisco around 500 times! Or the total energy consumption of a single American over 28 years!
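Both comparisons fall out of simple division. A quick sketch using rounded figures (roughly 502 tonnes of CO2 for the GPT-3/ChatGPT training run from Luccioni et al., about 1 tonne per New York to San Francisco passenger, and about 18 tonnes per American per year):

```python
# Sanity check of the AI Index comparisons, using rounded figures.
training_tonnes = 502           # CO2 for GPT-3 training (Luccioni et al., 2022)
per_passenger_tonnes = 1        # ~1 tonne per NY-to-SF passenger flight
per_american_year_tonnes = 18   # ~18 tonnes per American per year

print(round(training_tonnes / per_passenger_tonnes))     # ~500 flights
print(round(training_tonnes / per_american_year_tonnes)) # ~28 years
```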

Energy consumption comparisons: AI models and real-life examples. Image by author, data source: AI Index Report¹

Unsurprisingly, the single air passenger doesn’t produce zero emissions, as it might appear from the chart above (the figure is nearly 1 tonne). You can see the actual numbers more clearly in this table:

Energy consumption comparisons: AI models and real-life examples. Image by author, data source: AI Index Report¹

But it’s not all bad news.

AI can also reduce energy consumption

According to Bloomberg, while AI models are getting larger (and presumably more energy-intensive), the companies creating them are working on improving efficiency. Microsoft, Google and Amazon — the cloud companies that host much of this work — are all aiming for carbon-negative or carbon-neutral operations. That is, of course, highly desirable.

Also, while training AI systems is energy-intensive, recent research shows that AI systems can also be used to optimize energy consumption. A paper from DeepMind⁴ published in 2022 details the results of a 2021 experiment in which it trained an AI called BCOOLER to optimize cooling in Google’s data centres.

Energy saving by BCOOLER. Image by author, data source: AI Index¹

The graph above shows the energy-saving results from one BCOOLER experiment. After three months, an energy saving of roughly 12.7% was achieved.

Even when carbon neutrality is achieved, using AI to increase the efficiency of these centres will still make them cheaper to run. Perhaps we should be thinking about applying AI to other energy-intensive industries, too.

I doubt that we’re currently in a position to know exactly what the eventual toll on the environment will be. LLMs like ChatGPT aren’t going away, so the energy needed to train them is certainly going to be spent. On the other hand, it’s not the case that people are going to stop flying from NY to SF, heating their homes or using their cars.

But we should try to put some of this rather shocking data into perspective. While a ChatGPT training session might use as much energy as one American does in 28 years (which sounds like an awful lot), it is also true that the 330 million Americans who make up the population of the USA emit around 10 million times more CO2 than a single ChatGPT training session⁵.

And there appear to be around 20 flights a day from New York to San Francisco; if each flight serves 150 passengers, that works out at over 1 million tonnes of CO2 emissions per year — more than 2,000 ChatGPTs⁵.
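Those two back-of-envelope figures can be reproduced directly. A minimal sketch, assuming the rounded numbers used above (roughly 502 tonnes of CO2 for the training run, 18 tonnes per American per year, and 20 daily NY-to-SF flights with 150 passengers at about 1 tonne each):

```python
# Back-of-envelope scale comparison: ChatGPT training vs other emissions.
training_tonnes = 502        # CO2 for one training run (Luccioni et al., 2022)
us_population = 330_000_000
tonnes_per_american_year = 18
flights_per_day, passengers, tonnes_per_passenger = 20, 150, 1

us_annual = us_population * tonnes_per_american_year   # ~5.94 billion tonnes
print(round(us_annual / training_tonnes))              # ~11.8 million training runs

ny_sf_annual = flights_per_day * passengers * tonnes_per_passenger * 365
print(ny_sf_annual)                                    # 1,095,000 tonnes per year
print(round(ny_sf_annual / training_tonnes))           # ~2,181 training runs
```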

For single entities, ChatGPT and its like clearly use a lot of energy (and thus — for the moment, at least — produce a lot of CO2 emissions) but, compared with the energy consumption and CO2 emissions from other human activity, are they really very significant (there are, after all, far more humans than LLMs)?

Also, it has to be good news that the big cloud hosting companies are aiming to achieve carbon neutrality, which, if achieved, will reduce their net CO2 emissions to zero. So while energy use might remain high, the aim is to make its environmental impact neutral.

Moreover, AI can be used to mitigate some of the energy use in data centres. Perhaps similar technology could be used by airlines and other energy-intensive industries.

The bottom line, however, is that we are all producing more CO2 than we should, so any additional energy use that isn’t supplied from renewables is a move in the wrong direction.

Thanks for reading, I hope you found this useful. If you would like to see more of my work, please visit my website.

You can also get updates by subscribing to my occasional, free newsletter on Substack.

If you aren’t a Medium member, you can join using my referral link and read any Medium content for just $5 per month.


  1. The AI Index 2023 Annual Report

Nestor Maslej, Loredana Fattorini, Erik Brynjolfsson, John Etchemendy, Katrina Ligett, Terah Lyons, James Manyika, Helen Ngo, Juan Carlos Niebles, Vanessa Parli, Yoav Shoham, Russell Wald, Jack Clark, and Raymond Perrault, “The AI Index 2023 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2023.

The AI Index 2023 Annual Report by Stanford University is licensed under Attribution-NoDerivatives 4.0 International.

You can find the complete report on the AI Index page at Stanford University.

5. CO2 emissions from other sources (these are rough calculations):

330 million Americans emit 18 tonnes of CO2 annually; that’s 330m × 18, or about 5,940m tonnes of CO2 — roughly 10 million ChatGPTs.

Approx. 20 flights daily from NY to SF, with around 150 passengers on board, produce 20 × 150, or 3,000 tonnes of CO2 per day. That’s 3,000 × 365, about 1 million tonnes of CO2 per year — around 2,000 ChatGPTs.


