One training session with GPT-3 uses the same amount of energy as 126 homes in Denmark do in a year

Rachel England

November 18, 2020

3 Min Read

There’s little doubt that artificial intelligence and machine learning are proving critical in the fight against climate change. From computational agriculture to tracking wildlife patterns, the technology is helping us make strides towards a more sustainable future.

However, it comes with environmental challenges of its own – namely the amount of CO2 produced by high-performance servers used to create and maintain AI models.

To help developers get a better handle on the impact of their tech, students from the University of Copenhagen have developed a tool to predict the carbon footprint of algorithms.

According to the team, the amount of compute used for deep learning grew 300,000-fold in the six years leading up to 2018, and that growth shows no sign of slowing. One of the biggest AI projects to date, OpenAI’s GPT-3 language model, is estimated to use the same amount of energy in a single training session as 126 Danish homes do in a year, emitting the same volume of CO2 as a car driven for 700,000 kilometers.

"Should the trend continue, artificial intelligence could end up being a significant contributor to climate change,” said Benjamin Kandig, who along with Lasse F. Wolff Athony helped develop the new tool.

“Jamming the brakes on technological development is not the point. These developments offer fantastic opportunities for helping our climate. Instead, it is about becoming aware of the problem and thinking: how might we improve?”

Helping developers make greener decisions

Their solution is Carbontracker, a free tool that gathers information on how much CO2 is emitted in producing energy in whichever region the deep learning model is being trained. This makes it possible to convert energy consumption into CO2 emission predictions, giving developers an understanding of the environmental impact of their work, as well as an indication of ways they can reduce it.

For example, the time of day, the electricity supplier used and the degree of hardware efficiency all play a role in an algorithm’s eventual carbon footprint.
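For developers who want to try it, the project is distributed as a Python package. Below is a minimal sketch of how a training loop might be wrapped with the tracker, following the usage pattern in the project’s documentation; the train_one_epoch() function is a hypothetical placeholder for whatever training step a real model would actually run.

```python
from carbontracker.tracker import CarbonTracker

def train_one_epoch():
    # Hypothetical placeholder for a real model's training step.
    pass

max_epochs = 10
tracker = CarbonTracker(epochs=max_epochs)

for epoch in range(max_epochs):
    tracker.epoch_start()   # begin measuring energy use for this epoch
    train_one_epoch()       # the actual training work goes here
    tracker.epoch_end()     # log energy use and predicted CO2 emissions

tracker.stop()              # finish tracking once training is complete
```

The tracker monitors the first epochs, then predicts the energy consumption and carbon footprint of the full training run, so developers can see the likely impact before committing to it.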

"It is possible to reduce the climate impact significantly,” Anthony explained. “For example, it is relevant if one opts to train their model in Estonia or Sweden, where the carbon footprint of a model training can be reduced by more than 60 times thanks to greener energy supplies. Algorithms also vary greatly in their energy efficiency. Some require less compute, and thereby less energy, to achieve similar results. If one can tune these types of parameters, things can change considerably.”

The carbon footprint of artificial intelligence is a topic that’s gaining increasing prominence, with some claiming it could end up overshadowing the environmental benefits the tech is set to deliver.

Carbontracker is not the first effort designed to help mitigate the problem. Back in April, for example, MIT unveiled a system designed to cut the energy required for training and running neural networks, while in July researchers from Stanford University announced an ‘experiment impact tracker’, along with a number of recommendations for developers seeking to reduce their carbon footprint.

About the Author(s)

Rachel England

Freelance journalist Rachel England has covered all aspects of technology for more than a decade. She has a particular interest in sustainability-focused tech innovation, and once attended a green business expo dressed as a recycling bin.

