AI's environmental impact, revealed by Mistral's latest sustainability tracker tool, makes for distressing reading
In the rapidly evolving world of Artificial Intelligence (AI), a growing concern is the environmental impact of Large Language Models (LLMs). These models, including popular ones like LLaMA 3.1 and GPT-3 (the model behind ChatGPT), require vast amounts of energy and water, leading to significant carbon footprints.
Energy and Carbon Impact
Training a model like LLaMA 3.1 emits hundreds of tonnes of CO₂ equivalent: roughly 420 tCO₂e for a single training run, comparable to decades of a household's electricity use. A year of GPT-3 usage consumed over 1,000 MWh, equivalent to running 100 gasoline cars nonstop. If current trends continue, data centers supporting AI could consume up to 4% of global electricity by 2030, and 56% of data centers currently rely on fossil fuels.
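The arithmetic behind such headline figures is straightforward: multiply energy consumed by the carbon intensity of the grid that supplied it. A minimal sketch, where the grid intensity of 0.42 kg CO₂/kWh is an illustrative assumption rather than a measured value for any specific model or region:

```python
# Back-of-envelope estimate of training emissions from energy use and
# grid carbon intensity. All figures are illustrative assumptions.

def training_emissions_tco2(energy_mwh: float, grid_kgco2_per_kwh: float) -> float:
    """Convert training energy (MWh) to tonnes of CO2-equivalent."""
    kwh = energy_mwh * 1_000           # 1 MWh = 1,000 kWh
    kg_co2 = kwh * grid_kgco2_per_kwh  # emissions in kilograms
    return kg_co2 / 1_000              # kilograms -> tonnes

# Example: 1,000 MWh on a grid emitting 0.42 kg CO2 per kWh
print(training_emissions_tco2(1_000, 0.42))  # 420.0 tCO2e
```

Note how sensitive the result is to grid intensity: the same training run on a low-carbon grid (e.g. 0.05 kg CO₂/kWh) would emit an order of magnitude less, which is one reason data center siting matters.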
Water Consumption
Cooling data centers consumes large amounts of water. For example, LLaMA 3.1 training used around 2,769 kiloliters, equivalent to decades of average American household water use.
Transparency and Measurement Efforts
There is a recognized blind spot in reliably quantifying AI’s full environmental impact. Initiatives like the AI Environmental Footprint Measurement Hackathon aim to develop standardized ways to measure, monitor, and report carbon and water footprints of AI models to guide sustainable development.
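A standardized report of the kind such initiatives aim to define might bundle a model's key footprint metrics into a single comparable record. The schema and field names below are hypothetical, sketched from the metrics discussed in this article:

```python
# Hypothetical schema for a standardized model-footprint report; the
# field names and example values are assumptions for illustration only.
from dataclasses import dataclass, asdict

@dataclass
class FootprintReport:
    model_name: str
    training_energy_mwh: float      # total energy for the training run
    training_emissions_tco2e: float # tonnes of CO2-equivalent emitted
    water_consumed_kl: float        # kiloliters used, mainly for cooling
    grid_carbon_intensity: float    # kg CO2e per kWh, location-dependent

report = FootprintReport("example-llm", 1_000.0, 420.0, 2_769.0, 0.42)
print(asdict(report)["training_emissions_tco2e"])  # 420.0
```

Agreeing on fields like these is what would let institutions compare models on equal terms, the transparency gap the article returns to below.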
Reducing Footprint Strategies
Research explores balancing model size against performance to improve efficiency, for example through energy-aware code generation or by using smaller, task-specific LLMs rather than large general-purpose ones where possible. AI can also contribute positively by optimizing data center operations and promoting smarter energy use.
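The "right-sizing" idea can be sketched as a router that sends each request to the smallest model judged adequate for the task. The model names, relative energy costs, and capability tiers below are entirely hypothetical:

```python
# Minimal sketch of right-sizing: route a request to the cheapest model
# whose capability tier meets the task's requirement. All entries are
# hypothetical (name, relative energy cost per request, capability tier).
MODELS = [
    ("small-task-model", 1, 1),
    ("mid-general-model", 5, 2),
    ("large-general-model", 25, 3),
]

def pick_model(required_tier: int) -> str:
    """Return the lowest-cost model that meets the required capability."""
    for name, cost, tier in sorted(MODELS, key=lambda m: m[1]):
        if tier >= required_tier:
            return name
    raise ValueError("no model meets the requirement")

print(pick_model(1))  # small-task-model
print(pick_model(3))  # large-general-model
```

Under these illustrative numbers, routing a simple task to the small model rather than the large one cuts the per-request energy cost by a factor of 25.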
Infrastructure Expansion and Sustainability Risks
To support AI growth, huge investments are driving an exponential increase in data centers worldwide. This expansion risks further resource strain unless it is increasingly powered by renewables; most data centers today still depend on fossil fuels.
Mistral's Approach to Sustainability
Amid these challenges, companies like Mistral are taking steps to reduce their environmental impact. Mistral is building a data center in France to leverage low-carbon nuclear power and a cooler climate, reducing the emissions of its models. The results of Mistral's efforts will be available via ADEME's Base Empreinte database, setting a new standard for transparency in the AI sector.
Mistral has also launched a new AI coding assistant targeted at security-conscious developers and a sustainability auditing tool for AI models. Furthermore, Mistral is advocating for greater transparency across the entire AI value chain and working to help AI adopters make informed decisions about the solutions that best suit their needs.
Other tech giants, such as Google and Microsoft, have also acknowledged the environmental impact of training and running AI models and are taking steps to reduce their footprint. However, significant challenges remain as AI usage and data center capacity continue to grow rapidly.
Without more transparency, it will be impossible for institutions, enterprises, and users to compare AI models and make informed decisions. The industry and researchers are actively working on measurement frameworks, efficiency improvements, and greener infrastructure to reduce AI’s environmental footprint, but it's a critical issue that requires ongoing attention and action.