Green Machine Learning: Is it Just Another Catchy Phrase?

Global CO2 emissions are rising at an unprecedented rate, affecting every aspect of life. There is no single cause; contributing factors range from technological advancement to rising energy consumption. The very technologies designed to drive progress can also harm the environment.

Digitalization has been on the rise for the past two decades, and the electricity it consumes has driven a substantial increase in carbon emissions. The energy consumption of digital technologies is estimated to have grown by around 10% per year.

One technology now prevalent across sectors and disciplines is Machine Learning (ML). ML uses data and algorithms to make classifications and predictions, producing trained models that aid decision-making. Its many real-life applications, together with the advances in hardware and software they have driven, have made ML something almost every sector relies on heavily.

However, developing more complex and more robust ML algorithms requires extensive large-scale computation, leading to substantial energy consumption and increased CO2 emissions. A study by the University of Massachusetts estimated that the CO2 emissions from training a single Natural Language Processing model exceeded the lifetime emissions of an average car. This should prompt researchers to consider “green” computational techniques.
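
To see how such estimates are typically made, the short Python sketch below applies a commonly used accounting rule: energy in kilowatt-hours is hardware power draw multiplied by training time and a datacentre overhead factor, and emissions are that energy multiplied by the grid's carbon intensity. The hardware counts, power figures, and carbon-intensity value are illustrative assumptions, not numbers taken from the study.

    # Rough sketch of how the carbon footprint of a training run can be estimated.
    # All figures here are illustrative assumptions, not values from the study above.
    def training_co2_kg(gpu_count, gpu_power_watts, hours,
                        pue=1.5, grid_kg_co2_per_kwh=0.4):
        """Estimate the CO2 emissions (kg) of a training run.

        energy (kWh) = GPUs x power (kW) x hours x datacentre overhead (PUE)
        CO2 (kg)     = energy (kWh) x grid carbon intensity (kg CO2 per kWh)
        """
        energy_kwh = gpu_count * (gpu_power_watts / 1000.0) * hours * pue
        return energy_kwh * grid_kg_co2_per_kwh

    # Example: 8 GPUs drawing 300 W each, training for 240 hours.
    print(f"{training_co2_kg(8, 300, 240):.0f} kg of CO2")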

To shift toward greener ML, the root causes of complex, resource-hungry techniques must be addressed. One cause is the drive for accuracy, which has taken priority over energy efficiency in algorithm design. Another is the cost and complexity of developing, training, and running ML models, which must be investigated. Understanding the factors that increase computational complexity also creates opportunities to develop more efficient solutions.

Academic programs offering ML specializations can take the first step by embedding energy-efficiency considerations into their curricula. Academics and researchers must take responsibility for making environmentally conscious decisions when developing algorithms and applications. All stakeholders working with ML should also aim to reuse trained models for new applications, reducing the number of computations needed.
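
One practical form of such reuse is to keep an already-trained model fixed and train only a small task-specific layer on top of it, so the bulk of the original computation is not repeated. The PyTorch sketch below is a minimal illustration of this idea; the frozen "backbone" stands in for a model trained elsewhere, and all sizes and data are arbitrary assumptions.

    # Minimal PyTorch sketch of reusing a trained model: freeze it and train
    # only a small task-specific head. The backbone stands in for a model
    # trained elsewhere; sizes and data are arbitrary assumptions.
    import torch
    import torch.nn as nn

    backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU())   # imagine this was trained earlier
    for p in backbone.parameters():
        p.requires_grad = False                                # reuse it as-is, no retraining

    head = nn.Linear(64, 10)                                   # only this small layer is trained
    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 128)                                   # dummy batch of inputs
    y = torch.randint(0, 10, (32,))                            # dummy labels

    with torch.no_grad():                                      # frozen model: no gradient bookkeeping
        features = backbone(x)
    loss = loss_fn(head(features), y)
    loss.backward()                                            # gradients flow only through the head
    optimizer.step()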

ML complexity also influences hardware choice, making investment in efficient hardware essential. High-performance, energy-efficient hardware can complete tasks in less time and with less electricity, reducing the carbon footprint.

Awareness of the need for Green ML practices is increasing, but much work remains to be done. All stakeholders must come together to propose green, efficient, and sustainable solutions to the persistent challenge of climate change. Although large-scale computation is only one of many significant contributors to CO2 emissions, every step counts towards achieving the Sustainable Development Goals.

In conclusion, “Green Machine Learning” is not just a phrase but a significant research goal.

 

Dr Saad Aslam 
School of Engineering and Technology
Email: @email

 

This article was first published in Business Today on 22 October 2022.