The role of AI in achieving net-zero

Noam Rosen
11 March 2024

Climate change is one of the most complex challenges facing humanity. To help tackle it, we need to turn to technologies such as artificial intelligence (AI). However, the technology, particularly generative AI, requires significant computational resources, and with them substantial energy consumption. This escalating demand for computing power is a growing problem: the computational requirements of state-of-the-art AI models are doubling every five to six months, with projections indicating a continued rise. Data centres already consume up to 1.5% of the global electricity supply, adding to emissions from energy use, which accounts for approximately 75% of man-made greenhouse gas emissions in the EU.

Recent research by Gartner®[1] predicts that, “by 2030, AI could help reduce global GHG emissions by 5% to 10%”. However, by the same year, Gartner® predicts that “AI could consume up to 3.5% of the world’s electricity”.[2]

The tech industry is facing a clear challenge: to find solutions to curb the energy demands of AI, and thus unlock the technology’s full potential to help the human race.

AI and energy

AI consumes power in two phases: when models are trained, and during inference, when live data is run through a trained model to solve tasks. Research published in the journal Joule suggests that inference can account for at least 60% of the energy consumption of generative AI, and that adding AI capabilities to web searches can multiply energy demands tenfold. Users also tend to submit more queries when engaging with a generative model than with a search engine, because of the back-and-forth dialogue involved in reaching the desired result.
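A rough back-of-envelope calculation helps show why inference dominates at scale. The sketch below is purely illustrative: the training energy, per-query energy and query volume are assumed placeholder figures, not measurements from the Joule study or from any particular model.

```python
# Illustrative comparison of one-off training energy with cumulative
# inference energy. All figures are assumptions for this sketch only.

TRAINING_ENERGY_KWH = 1_000_000    # assumed one-off cost of training a large model
ENERGY_PER_QUERY_KWH = 0.003       # assumed energy per generative AI query
QUERIES_PER_DAY = 10_000_000       # assumed daily query volume at scale

daily_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
breakeven_days = TRAINING_ENERGY_KWH / daily_inference_kwh

print(f"Inference energy per day: {daily_inference_kwh:,.0f} kWh")
print(f"Days for inference to exceed training energy: {breakeven_days:.1f}")
```

On those assumed numbers, cumulative inference energy overtakes the one-off training cost within weeks, which is consistent with the finding that inference accounts for the majority of generative AI's energy use.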

As new use cases for generative AI emerge around text, images and video, more and more large models will be trained, retrained and fine-tuned on a daily basis. The latest class of generative AI models requires a more than 200-fold increase in computing power to train compared with previous generations. Every new generation of models needs more computing power for inference and more energy to train, a constant cycle that keeps adding demand to the underlying infrastructure.

In terms of hardware, the graphics processing units (GPUs) used for AI can consume many times the energy of a traditional CPU system. Today’s GPUs can draw up to 700 watts, and a typical installation puts eight GPUs in a server. That means a single server could consume nearly six kilowatts, compared with around one kilowatt for the traditional two-socket server enterprises use for virtualisation. So the big question is: how can we make this more sustainable?
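The arithmetic behind that comparison is simple to check. The short sketch below uses only the figures quoted above (700 W per GPU, eight GPUs per server, roughly one kilowatt for a two-socket server); the full-utilisation annual figure is an added assumption for illustration.

```python
# Power comparison between an eight-GPU AI server and a traditional
# two-socket virtualisation server, using the figures quoted above.

GPU_WATTS = 700                 # per-GPU draw quoted in the text
GPUS_PER_SERVER = 8             # typical AI server configuration
BASELINE_SERVER_WATTS = 1_000   # traditional two-socket server

ai_server_watts = GPU_WATTS * GPUS_PER_SERVER   # GPUs alone, excluding CPUs, fans, etc.
print(f"AI server (GPUs only): {ai_server_watts / 1000:.1f} kW")
print(f"Ratio vs traditional server: {ai_server_watts / BASELINE_SERVER_WATTS:.1f}x")

# Annual energy at full utilisation -- an assumption for illustration only
HOURS_PER_YEAR = 24 * 365
print(f"Annual energy at full load: {ai_server_watts * HOURS_PER_YEAR / 1000:,.0f} kWh")
```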

The solution

The first step is to understand that sustainability is a journey: there is no single action that can ‘fix’ it when it comes to AI. But small steps can make a big difference. The computing industry is being sent a loud, clear message to create better products that use fewer resources. This call is coming from consumers and investors, but also increasingly from governments. In future, being energy efficient will be a legal requirement for organisations in the AI space. Recent amendments to the EU AI Act will mandate that operators adopt state-of-the-art methods to cut energy consumption and enhance the efficiency of their AI platforms.

This can be achieved in three specific technical ways: first in the chips used to generate the computational power, second in the computers built around those chips, and third in the data centre. Sustainability is increasingly becoming a competitive differentiator for both chip makers and PC makers, and will become more so as companies work towards their ESG goals. In the coming decades, new advances such as analogue chips could offer an energy-efficient alternative well suited to neural networks, according to research published in the journal Nature.

In the data centre, older air-cooling technologies are already struggling to cope with the high energy demands of AI, and customers are turning to liquid cooling to minimise energy consumption. By efficiently transferring the heat generated by generative AI hardware into water, customers can save 30-40% on electricity. Data centres powered by renewable energy sources will be key to reducing AI’s carbon footprint. ‘As a service’ approaches to AI technology can also help to minimise waste and ensure that organisations are using the newest, most sustainable hardware, without up-front capital outlay.
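To give a sense of what that 30-40% range means in practice, the sketch below applies it to an assumed rack of AI servers; the rack power, utilisation and electricity price are placeholder assumptions rather than vendor figures.

```python
# Illustrative annual saving from liquid cooling at the 30-40% range
# quoted above. Rack power, utilisation and tariff are assumptions.

RACK_POWER_KW = 40          # assumed draw of one AI rack at full utilisation
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.20        # assumed electricity tariff (currency units per kWh)

annual_kwh = RACK_POWER_KW * HOURS_PER_YEAR
for saving in (0.30, 0.40):
    saved_kwh = annual_kwh * saving
    print(f"{saving:.0%} saving: {saved_kwh:,.0f} kWh "
          f"(~{saved_kwh * PRICE_PER_KWH:,.0f} per year)")
```

Even on these assumed figures, a single rack saves tens of thousands of kilowatt-hours a year, which is why cooling efficiency and renewable supply matter so much at data-centre scale.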

The benefits of AI

There is a trade-off around AI and its energy demands that needs to be discussed. Some are using AI for the benefit of humankind, by improving medicine or tackling climate change, for example, while others are using it to generate entertainment. This raises the question of whether we should view those different energy demands differently.

AI certainly has enormous potential to do good, and it is already having an impact in many areas. There are dozens of examples of how AI could help mitigate the impacts of climate change, with the UN pointing out that it is not only helping us to forecast and understand extreme weather better, but also offering direct help to the communities affected by it.

In addition, AI can offer new understanding of the world around us, which could in turn help to curb greenhouse gas emissions. In smart cities, it has the potential to cut emissions by saving minutes or hours of heating and air conditioning at city scale: by learning people’s habits, it can turn the heating or air con down gradually in the hour before residents leave their homes. The technology can also regulate traffic across a city, so that vehicles drive efficiently and traffic jams are prevented. Norwegian start-up Oceanbox.io is harnessing predictive AI on its mission to understand the depths of the ocean, forecasting the movement of currents, which can help to combat the spread of pollution and help vessels reduce their fuel use.

AI’s role in achieving net-zero

There is no question that AI uses a lot of power, but we can tackle this step by step – by using warm water cooling instead of air cooling, harnessing renewable energy sources to drive data centres, and through innovations in chip and computer design.

In many ways, AI can benefit humanity and serve as a real catalyst in advancing the UN’s Sustainable Development Goals. It holds the promise of enhancing our understanding of climate change’s origins and how to address it, mitigating inequality, and safeguarding our marine environments and woodlands. When employed ethically, AI can align seamlessly with sustainability aims, and as global efforts converge toward achieving net-zero emissions, AI will progressively play a pivotal role.


[1] Gartner® IT Symposium/Xpo, Barcelona, 8 November 2023, AI for Environmental Sustainability: Trade Offs and Opportunities, presented by Gabriele Rigon.

[2] Gartner Press Release, Gartner Says CIOs Must Balance the Environmental Promises and Risks of AI, November 2023, https://www.gartner.com/en/newsroom/press-releases/2023-11-07-gartner-says-cios-must-balance-the-environmental-promises-and-risks-of-ai. Gartner® is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
