The Hidden Environmental Impacts of AI

Leyla Acaroglu
Feb 19, 2024


Artificial Intelligence (AI) has exploded into our lives and workplaces in the last 12 months. But as you ask ChatGPT (or any of its competitors) to run a few work tasks for you, or as you get prompted by one of the built-in AI assistants becoming ubiquitous across digital platforms, have you stopped to think about the ecological and social impacts of the rush to use this new technology?

Last week, Sam Altman, the founder of OpenAI (the company behind ChatGPT), went public with his bid to raise 7 trillion dollars (yes, that's a T, not a B!) to ramp up computer chip manufacturing so that the computational power of AI can expand more rapidly.

This article is about the impact that AI learning models have on the environment and how this proposed expansion of AI would dramatically increase impacts, so read on to find out more.

Just like mining Bitcoin, the creation of AI models requires a significant amount of computational power to get to the point where you are provided with quick (but not always accurate) responses. Although measuring the exact impact is pretty hard (as it is with mining Bitcoin), according to Statista, “The amount of data today’s digitised economy creates is growing annually by 40% and is expected to reach 163 trillion gigabytes by 2025, which will further fuel growth in AI.”

Before we get into the details of where and how Generative AI is having impacts on the planet and people, let’s start with the overarching concept of measuring the size of an impact on the environment. One effective way to do this is through what’s called an ecological footprint.

What is an Eco Footprint?

Ecological footprints are an account of the demand for and the supply of nature’s biocapacity as a result of human activities. The “footprint” is the collective size of the impact, measured against a series of different data sets such as land, water, natural resources and energy use.

The approach was developed in the 1990s and is promoted by the Global Footprint Network. The methods used to measure the impacts are designed to gauge the human demand on natural capital (which is the stock of all of nature’s resources), by looking at the amount of natural resources needed to support a country, region, product or the entire planet. We currently use the resources of 1.75 planets to sustain the consumption demands of humanity, which means we exceed the Earth’s capacity to regenerate itself well before the end of each year.

Since the late 1970s, we have been using up resources faster than the Earth can replenish them, which is the concept behind Earth Overshoot Day: the day each year by which we have used up all of the resources the Earth could provide us without negative impact. In 2023, it fell on August 2nd, which is not surprising considering that the Ecological Footprint accounts show humanity collectively demands 75% more from our planet than its ecosystems can regenerate.
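As a back-of-envelope illustration (not the Global Footprint Network's actual methodology, which relies on detailed biocapacity accounts), the overshoot date can be sketched as the point in the year where a planet's worth of regenerative budget runs out:

```python
from datetime import date, timedelta

def overshoot_day(footprint_ratio: float, year: int) -> date:
    """Estimate the day the year's regenerative budget runs out,
    given how many 'Earths' humanity currently demands (e.g. 1.75)."""
    days_of_budget = int(365 / footprint_ratio)
    return date(year, 1, 1) + timedelta(days=days_of_budget - 1)

print(overshoot_day(1.75, 2023))  # lands in late July / early August
```

A ratio of 1.0 (living within one planet's means) would push the date to December 31st; at 1.75 planets the budget is exhausted after roughly 208 days.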


Essentially, an ecological footprint explains how much nature we use and how much we could use if we were to be sustainable, accounting for the deficit between the two. And there are spin-offs like water and carbon footprinting methods, which look specifically at those impact categories. Whilst there is some controversy around carbon footprints (specifically in relation to personal rather than product-level ones), the overarching ecological footprint method has a solid scientific foundation and is a valid mode of understanding generalized impacts.

“The Environmental Footprint methods measure and communicate about the environmental performance of products (both goods and services) and organisations across their whole lifecycle, relying on scientifically-sound assessment methods agreed at international level.” — EU Commission

Carbon footprints assess the amount of carbon-producing energy used to do something. In fact, it’s not just carbon emissions — there are several greenhouse gases that are measured for their global warming potential. The global standard for measuring carbon impacts is through the Greenhouse Gas Protocol. Given that Generative AI is so new, the science is still catching up to measuring it, but as we enter into a carbon-constrained economy as a result of climate change, the AI sector will have to be more accountable for its impacts because, as you will see below, they are pretty significant.

The Carbon Footprint of AI

MIT Technology Review recently published an article, stating, “There’s one thing people aren’t talking enough about, and that’s the carbon footprint of AI. One part of the reason is that big tech companies don’t share the carbon footprint of training and using their massive models, and we don’t have standardised ways of measuring the emissions AI is responsible for. And while we know training AI models is highly polluting, the emissions attributable to using AI have been a missing piece so far. That is, until now.”

So, AI is just the newest contributor to our resource-hungry society that often ignores the invisible impacts associated with delivering the goods and services that end up in our lives (all of which are interconnected parts of the systems failures that I advocate adopting a systems mindset for in order to solve complex problems!). The tech sector is a significant contributor to climate change, on par with the airline industry.

The full scope of AI’s specific impact is still unknown, but we can cobble together a picture of what’s going on and where the industry needs to make changes to avoid AI being an ecological nightmare.

A Hungry Growing Machine Needs to be Fed

There are many things that go into ensuring you can get a result from your Generative AI search:

  • Devices and sensors that are used to capture the data (which comes from a rapid “crawl” across global, online open text and subsequent extraction of the most relevant digitized text it can access to craft its output)
  • Large banks of servers that power and store the networks for communicating the data (these have to be made from mined minerals and powered by vast amounts of energy, and then, since they generate a lot of heat, cooled with water in cooling towers)
  • The computational power needed to train the algorithm to start with (not to mention all the resources needed to set up the AI companies, etc.)
  • People who helped to train the datasets (it’s actually surprising how much human labor there is in AI!)
  • Your internet-connected device needed to conduct the search (and all the internet network cables and infrastructure required to enable you to connect, power your inputs, etc.)

At every step of the data collection and transference life cycle, there are many natural materials required to create the physical assets. These are manufactured and made into usable goods, with lots of energy needed to power and cool the devices. All these assets need to be mined, processed, produced, distributed, sold, installed, used and then end up as waste when no longer functional.

There are countless ecological and social impacts across this value chain, from the conflicts and conditions of the workers mining the valuable materials that go into making chips to the pollution caused from the processing of these materials. There are also reports of underpaid, overexploited workers being used to support the responses in Generative AI systems and significant issues with bias embedded in the training methods used.

But by far, the biggest ecological impact is the amount of energy required to train and power the algorithms. Just as a growing child needs more food inputs to power the transformations in their body, so does the growing learning system of artificial intelligence.

OpenAI, the creators behind the wildly popular ChatGPT, themselves say that the energy used to train the average AI model increases tenfold each year. Some believe machine learning is on track to consume all the energy that can be supplied.

Image Source

To get AI to be able to generate responses to your prompts requires a lot of training. Just like many everyday consumers are deceived into thinking there’s some magical process that whisks away their recycling and turns it into something new with no issues, something similar is happening with how we think we can just input a prompt into an AI platform and it magically creates content that passes for being written by a living, breathing human without any significant issues or impacts.

Indeed, the training phase of AI is extremely energy-intensive, as the system is fed human-produced data that it is required to remember. It’s during the training phase that a growing AI must gobble up vast amounts of existing content (which is where all the IP issues stem from) and process it into a bank so that it can respond to your requests. With advanced systems such as the newer versions of ChatGPT, this training is extensive and requires thousands of Graphics Processing Units (GPUs) and other high-performance chips, all working in harmony for days or months to get the AI trained up and ready for commercial use. The more complex the system, the more energy the training requires.

Once trained, the AI is ready for inference (the process of running live data through a trained model to make predictions or solve tasks). This stage uses less energy than the training phase, but it’s still enough to generate a significant impact.
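To get a feel for why training dominates, here is a rough, purely illustrative sketch of how a training run's energy and carbon might be estimated. The GPU count, power draw, duration, overhead factor and grid intensity below are all assumptions for the sake of the arithmetic, not published figures for any real model:

```python
def training_footprint_kg(num_gpus: int, gpu_kw: float, hours: float,
                          pue: float = 1.5, kg_co2_per_kwh: float = 0.4) -> float:
    """Back-of-envelope CO2 estimate for a training run.

    pue: power usage effectiveness -- data-centre overhead (cooling, etc.)
    kg_co2_per_kwh: grid carbon intensity (varies widely by region)
    """
    energy_kwh = num_gpus * gpu_kw * hours * pue
    return energy_kwh * kg_co2_per_kwh

# Illustrative only: 1,000 GPUs at 0.4 kW each, running for 30 days
print(round(training_footprint_kg(1000, 0.4, 30 * 24)))  # kg of CO2
```

Even these modest hypothetical numbers land in the hundreds of tonnes of CO2, which is why the scale of the real frontier-model runs matters so much.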

Fascinatingly, AI can forget things it has already learned as it learns new things! This may increase the impact, as the model would need to be retrained on things it previously knew but forgot.

Graphics Processing Units (GPUs) are the chips in hot demand for expanding the AI ecosystem, and they are what the 7 trillion dollars is for. Without them, the energy-intensive computational processing for training can’t be done, which is why Sam Altman is trying to raise so much money to supercharge the chip manufacturing sector.

Knowing that a lot of natural resources go into making these tiny computer chips, there is concern about the astronomical increase in impact this would have. These chips are made from silicon, which is in hot demand across many different sectors. Interestingly, one of the major producers of these chips is using them to help model environmental disasters, yet doesn’t currently report its own environmental footprint.

Graphics Processing Units are also the most energy-hungry part of the training phase, demanding a lot of electricity and specialized data center rack space. “In rough terms, the average rack to support AI requires around the same amount of power as 25 houses.” Whilst some cloud computing companies have started to reduce the footprint of their data centers by setting up facilities in climate-optimal areas (such as Iceland) and using more renewable energy, it still leaves a significant carbon footprint.

A 2020 article in the scientific journal Nature said that it’s “estimated that the carbon footprint of training a single big language model is equal to around 300,000 kg of carbon dioxide emissions.” That’s the same as 125 round-trip flights between Beijing and New York.

In training AI models, GPT-3 “resulted in 552 metric tons of carbon emissions, equivalent to driving a passenger vehicle for over 2 million kilometres. Nvidia, a chip giant supplying processors for training these AI models, plays a significant role in the carbon equation.”
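These equivalences can be sanity-checked with rough emission factors. The per-kilometre and per-flight figures below are ballpark assumptions chosen to illustrate the conversion, not official data:

```python
KG_CO2_PER_CAR_KM = 0.25       # rough passenger-car emission factor (assumption)
KG_CO2_PER_BJS_NYC_RT = 2400   # rough Beijing-New York round-trip flight (assumption)

def as_car_km(kg_co2: float) -> float:
    """Convert an emissions figure into equivalent passenger-car kilometres."""
    return kg_co2 / KG_CO2_PER_CAR_KM

def as_flights(kg_co2: float) -> float:
    """Convert an emissions figure into equivalent long-haul round trips."""
    return kg_co2 / KG_CO2_PER_BJS_NYC_RT

# GPT-3's 552 metric tons -> just over 2 million km of driving
print(as_car_km(552_000))   # 2208000.0
# 300,000 kg for one big model -> ~125 Beijing-New York round trips
print(as_flights(300_000))  # 125.0
```

The numbers line up with the quoted comparisons, which is a useful check that the headline figures are internally consistent.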


Despite the data we do have, many experts in the field of analyzing impacts of new technology say it’s still too early to know the full scope of the impact of AI. But what we do know is that it’s pretty bad and it’s likely to get worse unless there are significant changes to the way the industry works.

The Guardian reported that, “By 2030, machine learning training and data storage could account for 3.5% of all global electricity consumption. Pre-AI revolution, datacentres used up 1% of all the world’s electricity demand in any given year,” whereas Swedish researcher Anders Andrae has forecast that data centers could account for 10% of total electricity use by 2025.

The Water Footprint of AI

A water footprint is a calculation of both the direct water used to make and operate something and the water used across its supply chains. AI systems consume fresh water via onsite server cooling and offsite electricity generation: the energy used to power the servers generates a lot of heat, which means they have to be cooled down, and that’s done with massive amounts of freshwater.

Staggeringly, the amount of water used to do this in one year is enough to fill 2,500 Olympic-sized pools. It’s estimated that for every 20–50 questions asked, ChatGPT requires 500 ml of water, which is about a standard personal water bottle’s worth.

Big tech is behind a lot of the AI research and mega data center builds. Microsoft, for example, is heavily involved in ChatGPT and saw its water use jump 34% between 2021 and 2022 as a result.

Fortune reports: “In its latest environmental report, Microsoft disclosed that its global water consumption spiked 34% from 2021 to 2022 (to nearly 1.7 billion gallons, or more than 2,500 Olympic-sized swimming pools), a sharp increase compared to previous years that outside researchers tie to its AI research.”

Researchers at Cornell University have predicted that AI demand could be “accountable for 4.2–6.6 billion cubic meters of water withdrawal in 2027, which is more than the total annual water withdrawal of 4–6 Denmark or half of the United Kingdom”.
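The per-question figure implied by these estimates is easy to back out. The batch size and pool volume below are the rough numbers already quoted above:

```python
def water_per_query_ml(ml_per_batch: float = 500, queries_low: int = 20,
                       queries_high: int = 50) -> tuple[float, float]:
    """Per-question water range implied by '500 ml per 20-50 questions'."""
    return ml_per_batch / queries_high, ml_per_batch / queries_low

OLYMPIC_POOL_LITRES = 2_500_000  # a standard 50 m pool, roughly

low, high = water_per_query_ml()
print(low, high)  # 10.0 to 25.0 ml per question

# 2,500 pools' worth of cooling water, in litres:
print(2500 * OLYMPIC_POOL_LITRES)  # 6250000000
```

Ten to twenty-five millilitres per question sounds trivial, but multiplied by billions of queries it adds up to the pool-scale totals the reports describe.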

Two Sides to Every Coin

There are, of course, sustainability benefits to AI; the technology can be used to come up with novel solutions to climate change, resource use and the waste crisis. It can also be used to monitor land clearing, find poachers and address the SDGs.

This recent article on the ethics of AI shared: “AI applications may lead to substantial emissions but may also play an important role in mitigation and adaptation. Given this dual role of AI, ethical considerations by AI companies and governments are of vital importance.”

The lack of data and real-time knowledge of the impact of this new technology is concerning for those of us who are working to get global action on the nature, climate and waste crises. But there is a similar digital scenario that we can learn from, and that’s cryptocurrencies.

A Case Study in Impact Reduction with Cryptocurrencies

AI and cryptocurrencies are both emerging technologies that use massive amounts of energy from the onset. However, cryptocurrencies have been around for longer and are slightly ahead of AI when it comes to finding ways to reduce their impact, so let’s quickly explore how this happened in part of the industry.

Until about three years ago, nearly all crypto was sucking significant amounts of energy to mine coins and facilitate transactions, with one of the worst offenders being Bitcoin.

In the US, crypto asset operations use between 0.9 and 1.7 percent of the country’s available electricity. When countries like China kicked out crypto miners, the issues got worse, as mining operations moved to countries with more carbon-intensive electricity systems, according to the New York Times.

“Bitcoin is estimated to consume about 150 terawatt hours a year, which is more electricity than 45 million people in Argentina use. Ethereum is closer to Switzerland’s 9 million citizens, eating up about 62 terawatt hours…Ethereum is estimated to emit carbon dioxide at a similar scale to Denmark or Chile.” — source

The Bitcoin Energy Consumption Index updates in real time and shows how much energy is required to mine this cryptocurrency.


In 2022, Ethereum launched a new algorithm and adopted a Proof of Stake (PoS) consensus process, which reduced its carbon emissions by 99%. This has resulted in a more stable and efficient system. It took eight years to develop the new approach, referred to as “the Merge”.

The old system, called Proof of Work (PoW), is energy-intensive because a large bank of computers has to work to find (mine) the currency, which is a bit like unscrambling a puzzle. The new system was created by merging two blockchains, migrating from the old system to a new one that had been tested over several months to ensure its viability.
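To see why Proof of Work burns so much energy, here is a toy sketch of the nonce-guessing at its core. Real networks differ in detail (Bitcoin double-hashes block headers, and at vastly higher difficulty), but the principle is the same: the only way to "unscramble the puzzle" is brute-force guessing, so energy use scales directly with difficulty:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits. Work grows ~16x per extra digit."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("example block", 4)  # low difficulty; real chains use far more
print(nonce)
```

Proof of Stake removes this guessing race entirely: validators are chosen by their staked currency rather than by computational work, which is where the ~99% energy reduction comes from.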

For more info on this, see this UN report, this fact checking article by TIME on Crypto’s environmental impact and this article on the details of the Ethereum move.

This demonstrates that with an impact-reduction mindset and investment from companies participating in these new technologies, engineering solutions can result in dramatic impact reductions — which is what we want to see happen to AI.

How to Reduce the Footprint of AI

I’ve pulled together a few resources on how AI’s impact can be reduced:

Whilst there are certainly big changes that the tech sector needs to make to account for both their historical and current impacts, as well as design solutions that mitigate and rectify these, there is also scope for society to question the hidden impacts of the things that are thrust upon us.

AI can certainly play a big part in society, but at what cost? Just because we are given “magical” tools doesn’t mean that they are destined to be a dominant part of our lives, or that they will create collectively-beneficial outcomes.

We all have choices about what and how we engage with the things presented to us — so what type of relationship do you want to have with AI? Personally, I would really like to see its impact assessed and addressed by those who are profiting off of it, otherwise we will inadvertently perpetuate the unsustainability of tech products as we rush to expand AI.


Leyla Acaroglu

UNEP Champion of the Earth, Designer, Sociologist, Sustainability & Circular Provocateur, TED Speaker, Founder: unschools.co, disrupdesign.co & swivelskills.com