Artificial intelligence’s development has led to advancements like autonomous vehicles, virtual reality, and ChatGPT. One concern with these technologies is that training and operating AI models requires considerable amounts of energy, which has heightened concerns about AI’s environmental impact and longer-term sustainability.
To put AI’s energy usage in perspective: training an early large language model known as MegatronLM took nine days. According to TechTarget, those nine days consumed 27,648 kilowatt-hours of energy, roughly the amount used by three typical U.S. homes over the course of a year (the average household uses 10,649 kWh annually, according to the U.S. Energy Information Administration).
How can AI be made more sustainable? Walid Saad, a professor in the Bradley Department of Electrical and Computer Engineering at Virginia Tech, is exploring the concept of "green federated learning," or green FL, in partnership with Amazon.
Saad is internationally recognized for his contributions to research on wireless communications (including 5G and 6G), artificial intelligence, game theory, and machine learning.
Federated learning refers to a distributed machine learning technique that enables the deployment of collaborative AI algorithms. The approach, according to IBM, enables multiple actors to build a common, robust machine learning model without sharing data, thus addressing critical issues such as data privacy, data security, data access rights, and access to heterogeneous data.
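The core loop of the canonical federated learning algorithm, federated averaging, can be sketched in a few lines. The sketch below is illustrative, not Saad's or Amazon's implementation: each simulated client fits a simple one-parameter model on its own private data, and only the resulting weights, never the raw data, are averaged by the server.

```python
# Minimal federated averaging (FedAvg) sketch. Names and data are
# hypothetical; a real deployment would train neural networks on
# physically separate devices over a network.

def local_update(w, data, lr=0.1, epochs=5):
    """One client's local training: fit y = w * x by gradient descent
    on its private (x, y) pairs. Raw data never leaves this function."""
    for _ in range(epochs):
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(clients, rounds=10):
    """Server loop: broadcast the global weight, collect each client's
    locally trained weight, and average weighted by dataset size."""
    w = 0.0
    for _ in range(rounds):
        updates = [local_update(w, d) for d in clients]
        sizes = [len(d) for d in clients]
        w = sum(u * s for u, s in zip(updates, sizes)) / sum(sizes)
    return w

# Two clients whose private data follow the same true model y = 3x.
client_a = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]
client_b = [(0.5, 1.5), (1.5, 4.5)]

w = fed_avg([client_a, client_b])
# w converges toward the true coefficient 3.0
```

The key property the sketch preserves is that the server only ever sees model weights; the averaging step combines the clients' knowledge without any exchange of training examples.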
Saad is seeking to make federated learning systems, and distributed AI systems more generally, more sustainable and energy-efficient during both the training phase and the inference phase (when algorithms are used to execute real-world AI tasks).
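One lever studied in energy-efficient federated learning is deciding which devices participate in each training round, since every participating device spends energy on local computation and wireless transmission. The sketch below is a hypothetical illustration of that idea, not Saad's actual method: it greedily picks the devices that contribute the most training samples per watt-hour until a per-round energy budget is spent.

```python
# Illustrative energy-aware client selection for a federated learning
# round. Device names, sample counts, and energy figures are invented.

def select_clients(clients, budget_wh):
    """Greedily choose clients with the best samples-per-watt-hour ratio
    until the round's energy budget would be exceeded."""
    ranked = sorted(clients, key=lambda c: c["samples"] / c["energy_wh"],
                    reverse=True)
    chosen, spent = [], 0.0
    for c in ranked:
        if spent + c["energy_wh"] <= budget_wh:
            chosen.append(c["name"])
            spent += c["energy_wh"]
    return chosen, spent

fleet = [
    {"name": "phone-a",  "samples": 800,  "energy_wh": 2.0},
    {"name": "laptop-b", "samples": 3000, "energy_wh": 12.0},
    {"name": "sensor-c", "samples": 150,  "energy_wh": 0.5},
    {"name": "phone-d",  "samples": 1200, "energy_wh": 2.5},
]

# With a 6 Wh budget, the power-hungry laptop is skipped in favor of
# three cheaper devices that together cover more data per unit energy.
chosen, spent = select_clients(fleet, budget_wh=6.0)
```

Real systems weigh further factors, such as channel quality, battery state, and data heterogeneity, but the underlying trade-off is the same: reach good model accuracy while capping the fleet's total energy draw.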
Distributed artificial intelligence is an approach to solving complex learning, planning, and decision-making problems by dividing them among multiple cooperating agents or devices.
Saad believes that minimizing the energy expenditure of these algorithms and improving their scalability to larger numbers of wirelessly interconnected devices can greatly reduce the environmental impact of these technologies.
As he explains: “As more and more people adopt these types of technologies at scale (ChatGPT and large language models being a case in point), it is imperative that we find ways to make them more sustainable, energy-efficient, and friendly to the environment.”
