How Having Bigger AI Models Can Have A Detrimental Impact On Environment

The COVID crisis has rapidly expanded the applications of artificial intelligence, from tackling the global pandemic itself to managing a wide range of business processes. Despite its benefits, AI has long been scrutinised for ethical concerns such as bias and privacy. The technology also has a significant sustainability problem: it consumes a massive amount of energy, creating a negative impact on the environment.

As AI advances in predicting weather, understanding human speech, enhancing banking payments and revolutionising healthcare, state-of-the-art models not only have to be trained on large datasets but also require massive computing power to improve their accuracy. Such heavy computation consumes a tremendous amount of energy and emits carbon dioxide, which has become an environmental concern.

According to one widely cited estimate, the power required to train a large AI model can emit approximately 626,000 pounds (around 284 tonnes) of carbon dioxide, nearly five times the lifetime emissions of the average US car. This highlights the magnitude of the environmental problem that training these models can create. Experts also point out that these numbers are only a baseline: a model typically consumes far more energy once it is deployed in the real world.
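As a quick sanity check on those figures, the short Python sketch below converts the emissions to tonnes and computes the ratio to an average car's lifetime emissions. The car figure of roughly 126,000 lbs is an assumption drawn from commonly cited estimates, not from the article itself.

```python
# Back-of-the-envelope check of the figures above (assumptions flagged inline).
LBS_PER_TONNE = 2204.62                 # pounds in one metric tonne

training_emissions_lbs = 626_000        # figure cited above
car_lifetime_lbs = 126_000              # assumed lifetime emissions of an average
                                        # US car, including fuel; illustrative only

tonnes = training_emissions_lbs / LBS_PER_TONNE
ratio = training_emissions_lbs / car_lifetime_lbs

print(f"~{tonnes:.0f} tonnes of CO2")                        # ~284 tonnes
print(f"~{ratio:.1f}x an average car's lifetime emissions")  # ~5x
```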

How OpenAI’s Largest AI Model Can Have A Severe Impact On Environment

OpenAI recently launched its biggest AI language model, GPT-3, designed to achieve accurate results “on a set of benchmark and unique natural language processing tasks” such as translating between languages, generating news articles and answering assessment questions for students preparing for competitive exams. With 175 billion parameters, GPT-3 can also bring some significant environmental issues.

One of the primary reasons is the massive amount of energy consumed by these newer, bigger models. OpenAI has stated in a blog post that “improvement in computing power is the key component for the AI progress”, and that the compute used in the largest training runs has been increasing exponentially, with a 3.4-month doubling time. While the previous model, GPT-2, was trained on a dataset of 8 million web pages, GPT-3, one of the largest AI models built to date, required even larger training data and fine-tuning datasets of thousands of examples, with more energy consumption and significant carbon emissions.
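To put that growth rate in perspective, the sketch below shows what a 3.4-month doubling time implies for compute over a year; this is a back-of-the-envelope illustration, not a figure from OpenAI's post.

```python
# Rough illustration of a 3.4-month doubling time in training compute.
doubling_time_months = 3.4

doublings_per_year = 12 / doubling_time_months   # ~3.5 doublings per year
growth_per_year = 2 ** doublings_per_year        # roughly an 11-12x increase per year

print(f"{doublings_per_year:.1f} doublings/year -> ~{growth_per_year:.0f}x more compute each year")
```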

In fact, one of the most energy-consuming processes has always been training NLP models. In 2018, Google developed BERT (Bidirectional Encoder Representations from Transformers), which was pre-trained for 40 epochs on a corpus of over 3.3 billion words to achieve state-of-the-art NLP performance. It was later surpassed by another model trained on 32 billion words. Training neural networks on billions of words, and then benchmarking them on machine translation, sentence construction and other lengthy tasks, demands enormous computing power and massive amounts of energy.

Apart from training, making a model accurate also involves thorough experimentation, which again consumes substantial energy. Tuning is largely a trial-and-error process in which many candidate architectures and hyperparameter settings are trained and discarded before the final configuration is chosen. These tuning runs consume a lot of energy, adding to carbon emissions, as the sketch below illustrates.
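Here is a minimal Python sketch of a random hyperparameter search to show how trial-and-error tuning multiplies energy use. The search space, trial count and per-run energy figure are all assumptions chosen for illustration, not values from the article.

```python
# A minimal sketch of why trial-and-error tuning multiplies energy use.
import random

search_space = {
    "learning_rate": [1e-4, 3e-4, 1e-3],
    "num_layers": [12, 24, 48],
    "hidden_size": [768, 1024, 2048],
}

ENERGY_PER_RUN_KWH = 50.0   # assumed energy cost of one full training run
NUM_TRIALS = 40             # assumed number of tuning trials

total_kwh = 0.0
for trial in range(NUM_TRIALS):
    # Pick a random configuration; a real trial would train and evaluate
    # the model from scratch with this configuration.
    config = {name: random.choice(values) for name, values in search_space.items()}
    total_kwh += ENERGY_PER_RUN_KWH

print(f"Tuning used ~{total_kwh:.0f} kWh, i.e. {NUM_TRIALS}x the energy of a single run")
```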

Once the model is trained and experimented on, it goes through deployment, which has its own energy requirements. In fact, according to NVIDIA, “80-90% of the cost of neural networks lies in inference processing,” where the capabilities learned by the AI model are put to work in the real world.
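Taken at face value, that figure implies a model's lifetime cost is dominated by inference rather than training. The sketch below works through the split; the training energy value, and the assumption that cost scales directly with energy, are illustrative only.

```python
# Illustrative training-vs-inference split implied by an 80-90% inference share.
training_kwh = 1_000.0      # assumed one-off training energy
inference_share = 0.85      # midpoint of the quoted 80-90% range

lifetime_kwh = training_kwh / (1 - inference_share)
inference_kwh = lifetime_kwh - training_kwh

print(f"Lifetime energy ~{lifetime_kwh:.0f} kWh: "
      f"training {training_kwh:.0f} kWh, inference ~{inference_kwh:.0f} kWh")
```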

Another critical issue is the huge amount of electricity consumed by modern data centres. According to one estimate, data centres use up to around 200 terawatt-hours (TWh) of electricity each year, which is higher than the national energy consumption of some countries.

Roy Schwartz, an AI researcher at the Allen Institute for Artificial Intelligence, has said that the larger AI models get, the more energy they will consume: “If we continue this growth, we will see a much more significant negative impact on the environment.”

Thus, the more accurate an AI model is and the more parameters it has, the more energy it consumes, with repercussions for the environment. Consequently, there is growing interest in finding solutions that consume less energy and produce a smaller carbon footprint while training, experimenting with and deploying these AI models.

Renewable AI Is The Solution

With AI continuing to advance without clear guidelines, its impact on the climate is expected to grow. In recent research, MIT researchers noted that a large share of the energy consumed goes into Neural Architecture Search (NAS), a key technique for automating the design of neural network architectures.

However, things might change, going by the goals of the Intergovernmental Panel on Climate Change (IPCC), which have drawn support from tech giants such as Amazon, Microsoft, Apple and Google. The cohort has reportedly pledged to move to renewable energy in the coming years.

Kara Hurst, Director of Sustainability at Amazon, stated in a company release: “With nearly 70 renewable energy projects around the globe – including 54 solar rooftops – we are making significant progress towards reaching Amazon’s company-wide commitment to reaching 100% renewable energy by 2030.”

Google, on the other hand, has signed agreements with renewable energy suppliers and is using machine learning to reduce its own energy consumption.

More recently, Tech Mahindra signed a joint declaration with the UN Global Compact, along with 155 other global companies, affirming its commitment to a zero carbon footprint and a green economy. “COVID-19 has allowed all of us to reconfigure our priorities,” said CP Gurnani, MD & CEO of Tech Mahindra. “We are committed to building a sustainable business with responsibility and by creating value for our stakeholders, while also keeping in mind the long-term impacts on the environment.”

Alongside, MIT researchers have developed an automated artificial intelligence system for training and running certain neural networks. The researchers claim that “by improving the computational efficiency of the system in some key ways, the system can cut down the pounds of carbon emissions involved — in some cases, down to low triple digits.”

Explaining further, the researchers said that the system encompasses many pre-trained subnetworks, which streamlines the training of the larger neural network. According to their estimates, the system would require roughly 1/1,300 of the carbon emissions of traditional NAS methods.

To mitigate this impact, researchers should report the computational cost of training their algorithms, which would enhance transparency across the industry. To help keep count, a team of ML practitioners has developed a tool, the Machine Learning Emissions Calculator, that lets companies track the carbon emissions generated by the models they train, along with ways to reduce those emissions. The tool works from factors such as the energy consumed by the system’s hardware, training time, the location of the data centre, the carbon intensity of the electricity used and any associated carbon offsets, and estimates how much CO2 is generated in the process. A minimal sketch of this kind of estimate is shown below.
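The following Python sketch illustrates the kind of calculation such a tool performs. It is not the actual Machine Learning Emissions Calculator; the function name, formula and numbers are assumptions chosen for illustration.

```python
# A minimal, assumed sketch of estimating training emissions from hardware power,
# training time and the carbon intensity of the local grid.

def estimate_co2_kg(gpu_power_watts: float,
                    num_gpus: int,
                    training_hours: float,
                    pue: float,
                    grid_co2_g_per_kwh: float,
                    offset_fraction: float = 0.0) -> float:
    """Estimate CO2 emitted by a training run, net of any carbon offsets."""
    energy_kwh = (gpu_power_watts * num_gpus / 1000) * training_hours * pue
    gross_co2_kg = energy_kwh * grid_co2_g_per_kwh / 1000
    return gross_co2_kg * (1 - offset_fraction)

# Example (all values assumed): 8 GPUs at 300 W each, one week of training,
# a data-centre overhead (PUE) of 1.5, on a grid emitting ~400 gCO2 per kWh.
print(f"~{estimate_co2_kg(300, 8, 24 * 7, 1.5, 400):.0f} kg of CO2")
```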

Another promising way of reducing carbon footprints is to choose the geographical location of cloud servers carefully, checking whether the local grid runs on renewable or non-renewable sources. Such a decision can have a huge impact on a model’s carbon emissions. As Alexandra Luccioni, an AI researcher with MILA, put it: “People often choose the server based on availability, or proximity, or personal preference, but choosing a low-carbon server in a location like Quebec or California can reduce the amount of carbon produced by a factor of 100.”
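To see why location matters so much, the sketch below compares the same assumed training run on a hydro-heavy grid and a fossil-heavy grid. The carbon-intensity values are rough illustrative assumptions, not measurements of any particular grid.

```python
# Illustrative comparison of one training run on two grids with different carbon intensity.
ENERGY_KWH = 600.0                           # assumed energy for one training run

grids_g_per_kwh = {
    "hydro-heavy grid (e.g. Quebec)": 5,     # assumed carbon intensity, gCO2/kWh
    "fossil-heavy grid": 500,                # assumed carbon intensity, gCO2/kWh
}

emissions_kg = {name: ENERGY_KWH * g / 1000 for name, g in grids_g_per_kwh.items()}
for name, kg in emissions_kg.items():
    print(f"{name}: ~{kg:.0f} kg CO2")

factor = max(emissions_kg.values()) / min(emissions_kg.values())
print(f"Same run, roughly {factor:.0f}x more carbon on the dirtier grid")
```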

Tech giants can also assist companies with fewer computational resources by sharing their pre-trained models. Such an initiative would allow smaller companies to build on top of the shared models instead of emitting more carbon by building, training and deploying AI models from scratch.

Wrapping Up

The carbon footprint of AI is usually overshadowed by the pursuit of accuracy, which remains the primary driver of model development. However, as models get more advanced, their energy costs grow, and so does their environmental impact. Given that AI is here to stay and will only advance further, it is critical to be aware of the climate impact these models can have.

One essential step is to create transparency in how AI models are developed: when publishing research, it should be mandatory to report a model’s energy consumption alongside its performance metrics. It should also become best practice for companies and researchers to find more efficient ways to train and deploy models, in order to build a sustainable future.

Sejuti Das
Sejuti currently works as Associate Editor at Analytics India Magazine (AIM). Reach out at sejuti.das@analyticsindiamag.com
