Artificial intelligence technologies have made a huge leap since the first experiments in the 1950s. AI tools can summarize a book, write an article, drive a car for you, and even understand people’s emotions. Yet these ‘magic’ tools, and the enormous development effort behind them, consume vast amounts of energy, and that is a growing cause for concern.
Some popular AI models, including ChatGPT, are available to users worldwide at no cost; others can be used at affordable prices. But no AI tool is free when viewed from an energy perspective. These models undergo extensive training processes, often involving powerful hardware and vast datasets. The power usage during this training phase is significant, contributing to concerns about the carbon footprint of AI technologies.
AI workloads consume notably more energy than other forms of computing. This fact cannot be overlooked at a time when the world’s efforts are united in the pursuit of a more efficient energy system. For example, the electricity required to train a single large model can surpass the annual consumption of 100 U.S. homes.
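To see where figures like this come from, here is a minimal back-of-the-envelope sketch in Python. Every number in it (cluster size, per-GPU power draw, training duration, data-center overhead, average household usage) is an illustrative assumption rather than a measurement of any particular model.

```python
# Back-of-the-envelope estimate of the energy used to train a large AI model.
# Every number here is an illustrative assumption, not a measured value.

NUM_GPUS = 1_000          # assumed size of the training cluster
GPU_POWER_KW = 0.4        # assumed average draw per accelerator, in kilowatts
TRAINING_HOURS = 90 * 24  # assumed training run of roughly 90 days
PUE = 1.2                 # assumed data-center overhead (cooling, networking, etc.)

# Total electricity consumed by the training run, in kilowatt-hours.
training_kwh = NUM_GPUS * GPU_POWER_KW * TRAINING_HOURS * PUE

# Commonly cited average annual electricity use of a U.S. home, in kWh.
HOME_ANNUAL_KWH = 10_500

print(f"Estimated training energy: {training_kwh:,.0f} kWh")
print(f"Roughly {training_kwh / HOME_ANNUAL_KWH:.0f} U.S. homes' annual consumption")
```

With these assumed inputs the run comes to about one gigawatt-hour of electricity, on the order of 100 households’ annual usage; real training runs can be smaller or considerably larger depending on the model and hardware.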
In 2022, Google revealed that machine learning had accounted for approximately 15% of its total energy consumption over the previous three years. By some estimates, the energy use of the entire computing sector, increasingly driven by AI, will make up one fifth of the world’s power demand by the end of 2030 and contribute up to 23% of greenhouse gas emissions.
The lack of systematic data collection on AI's energy usage and broader environmental impact makes the picture even more worrying for global sustainability. With the rapid development of AI technologies and the wide availability of such tools, we need greater transparency and tracking in this field, alongside the prioritization of the most energy-efficient computing infrastructure and AI algorithms.
Acknowledging these concerns, researchers and developers are actively working to address the energy consumption of AI models. Techniques like model pruning, quantization, and more efficient algorithms aim to reduce the computational load and energy requirements during training. Additionally, advancements in hardware contribute to a more sustainable AI ecosystem.
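As one concrete illustration, the sketch below applies PyTorch’s dynamic quantization to a tiny model, converting the weights of its linear layers from 32-bit floats to 8-bit integers. The architecture and layer sizes are arbitrary placeholders, and the savings achieved in practice depend heavily on the workload and hardware.

```python
import os
import tempfile

import torch
import torch.nn as nn

# A tiny illustrative model; the architecture and layer sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Dynamic quantization replaces the weights of the listed layer types
# (here, nn.Linear) with 8-bit integer versions, which shrinks the model
# and reduces the compute needed for each prediction.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_on_disk(m: nn.Module) -> int:
    """Serialize the model's state_dict and return the file size in bytes."""
    with tempfile.NamedTemporaryFile() as f:
        torch.save(m.state_dict(), f)
        f.flush()
        return os.path.getsize(f.name)

print(f"Original model:  {size_on_disk(model):,} bytes")
print(f"Quantized model: {size_on_disk(quantized):,} bytes")
```

In this toy case the serialized model shrinks to roughly a quarter of its original size; smaller weights mean less memory traffic and, typically, less energy per prediction.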
International guidelines may also help ensure that the AI sector meets sustainability goals. Pursuing this and other aims, the European Union is discussing the world’s first comprehensive rules on artificial intelligence, the EU AI Act, intended to make AI tools safe, transparent, traceable, non-discriminatory, and environmentally friendly.
End-users also play a role in the sustainable adoption of AI technologies. Being mindful of energy consumption and opting for platforms and services committed to green practices can collectively make a real difference.
When consumers make conscious choices about AI technologies, they encourage developers to prioritize energy efficiency in their innovations.
While AI technologies such as ChatGPT bring unprecedented advancements to our lives, and their full capabilities are yet to be discovered, the environmental impact of their energy consumption cannot be ignored.
A joint effort from the AI community, industry leaders, and end-users is necessary to ensure that the benefits of AI innovation do not come at the cost of our planet.