AI's Carbon Footprint: How Tech is Impacting the Environment

21-08-2023 | By Robin Mitchell

While AI has developed rapidly, there is growing concern that the energy it consumes could be contributing to climate change. What challenges does AI pose with respect to energy, what data is available regarding AI energy usage, and what solutions can help to reduce this energy toll?


What challenges does AI present with respect to energy?

In the last few years, AI has transformed modern society in a multitude of ways. Despite many believing that AI would rebel against its makers, destroy humanity, and conquer the world, it has, in reality, done nothing but help humanity.

The development of tools such as ChatGPT is allowing workers to spend more of their time on meaningful tasks while simultaneously providing engineers with an extremely practical natural language processor. Such AI can also interpret commands from humans extremely well, going as far as understanding the context of a request.

AI has also been instrumental in the development of predictive maintenance systems, whereby real-time data is analysed to identify anomalies that may indicate failing equipment. These same systems are also being deployed in safety applications, meaning that vehicles of the future may even be able to anticipate dangerous situations before they occur.

And yet, for all the benefits afforded by AI, there is a growing concern with its use: energy. As the climate crisis continues to worsen, expending massive amounts of energy is beginning to carry a social stigma. One prime example was cryptocurrency at its peak: while some were getting rich off the scheme, others saw cryptocurrencies as a horrendous waste of energy that only made global CO2 emissions worse.

In the case of AI, massive amounts of computation are required when using off-the-shelf hardware such as CPUs and GPUs, making the energy consumption of AI tremendous. While this may not be especially impactful when developing small-scale AIs (such as those used in predictive maintenance), AI such as ChatGPT, which serves millions of people, can consume enormous amounts of energy.

As AI technologies continue to grow, energy consumption will only continue to increase, and unless the energy used to power AI can be renewably sourced, AI will likely become a major contributor to climate change in the coming decade.

This concern is further echoed by a report from The Guardian. It highlights that the environmental cost of AI is becoming a significant concern. The carbon footprint resulting from the energy-intensive processes of AI models is comparable to the emissions of some small countries. This raises questions about the sustainability of AI advancements in the face of global climate challenges.

What data is available regarding AI energy consumption?

While it can be hard to quantify the amount of energy consumed by traditional AI systems, researchers around the world have started to study AI energy consumption. 

For example, one figure that frequently comes up is that training GPT-3, the predecessor to the model behind ChatGPT, consumed a total of 1,287 MWh, resulting in the emission of around 550 tonnes of CO2. And while that figure may seem modest, the massive increase in usage that ChatGPT has seen compared to GPT-3 will likely see far more energy consumed. With regard to ChatGPT, it is believed that training the model consumed a total of 1,064 MWh, with running it consuming around 260.42 MWh of energy per day. This figure will only continue to rise as ChatGPT becomes more ingrained in everyday life.
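A quick back-of-envelope check shows what these figures imply. The carbon intensity is simply the value implied by the GPT-3 numbers above, and the annual total assumes the 260.42 MWh daily figure holds year-round — both are illustrative assumptions, not measured data:

```python
# Back-of-envelope estimate based on the figures cited above.
# Assumption: the grid carbon intensity implied by GPT-3's training
# (1,287 MWh -> ~550 tonnes CO2) also applies to ChatGPT inference.

GPT3_TRAINING_MWH = 1_287
GPT3_CO2_TONNES = 550

# Implied carbon intensity in kg CO2 per kWh
intensity = (GPT3_CO2_TONNES * 1_000) / (GPT3_TRAINING_MWH * 1_000)

CHATGPT_DAILY_MWH = 260.42
annual_mwh = CHATGPT_DAILY_MWH * 365                    # ~95,000 MWh/year
annual_co2_tonnes = annual_mwh * 1_000 * intensity / 1_000

print(f"Implied intensity: {intensity:.3f} kg CO2/kWh")
print(f"ChatGPT inference: ~{annual_mwh:,.0f} MWh/yr, "
      f"~{annual_co2_tonnes:,.0f} tonnes CO2/yr")
```

On these assumptions, daily inference alone adds up to tens of thousands of tonnes of CO2 per year — dwarfing the one-off training cost.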

Furthermore, when considering that the next generation of GPT, GPT-4, is rumoured to have around 570 times more parameters, its energy consumption will likely be orders of magnitude greater. However, the true figure is yet to be published.

Other data published by iMore suggests that training a large AI model can consume the equivalent of the energy used by 120 homes in an entire year. However, once trained, the energy consumed when using the model is substantially less.

What solutions can help reduce AI energy consumption?

The reason why AI systems consume so much energy comes down to the hardware used to run them. Simply put, the type of computation that AI requires (involving large matrices) is not well suited to typical CPUs. This is why many AI engines run on GPUs instead, as these are far better suited to large parallel computations.
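To give a sense of the scale involved: a single dense layer is essentially one large matrix multiply, costing roughly 2 × m × k × n floating-point operations — work a GPU can spread across thousands of cores but a CPU cannot. The layer width and count below are purely illustrative, not taken from any published model:

```python
# A rough sketch of why AI workloads favour parallel hardware: one dense
# layer is a single large matrix multiply. Sizes here are illustrative.

def matmul_flops(m: int, k: int, n: int) -> int:
    """FLOPs for multiplying an (m x k) matrix by a (k x n) matrix."""
    return 2 * m * k * n

# One token passing through one hypothetical 12,288-wide dense layer
per_layer = matmul_flops(1, 12_288, 12_288)   # ~302 million FLOPs
# 96 such layers per token, as a stand-in for a large transformer
per_token = 96 * per_layer

print(f"{per_token:,} FLOPs per token")
```

Multiply tens of billions of operations per token by millions of daily users, and the scale of the energy problem becomes clear.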

However, even then, GPUs are not perfect, meaning that modern datacentres are highly inefficient at running AI engines. This is why semiconductor manufacturers have developed neural processing units (NPUs) whose prime purpose is executing neural networks.

The efficiency of such processors is so staggering that they can be deployed in mobile processors and have little effect on battery life, all while executing complex AI algorithms locally. Thus, datacentres that aim to run large AI systems need to start considering a shift towards dedicated neural processors.

The other major factor in power consumption is cooling. It is estimated that around 40% of a datacentre's power consumption goes on cooling. This is why numerous companies are exploring installing datacentres in cold climates, and even underwater, to take advantage of cooler environments.

Interestingly, the cooling issue can also be solved by neural processors. As these devices can efficiently run AI algorithms, they generate far less heat per inference compared to standard hardware, thereby reducing the amount of cooling needed.

Overall, AI is a major technology that isn't going away anytime soon, and so in order to minimise its future impact, engineers will need to start putting energy efficiency at the heart of AI design.

To address these concerns, experts suggest several measures to curb the energy appetite of AI. One such recommendation, highlighted in a Medium article, is the continuous refinement of AI models. As models like ChatGPT evolve, there is potential for them to become more energy-efficient, reducing their overall energy consumption and environmental impact. This approach not only supports the sustainability of AI advancements but also aligns with global efforts to combat climate change.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.