Only 27% of companies check an AI model's energy consumption before selecting it. For more than half, performance matters more than measuring environmental impact when it comes to artificial intelligence. This is according to a study by the IT consulting firm Capgemini. At the same time, 48% of executives believe that AI has increased their company's greenhouse gas emissions, and companies that measure their environmental impact expect those emissions to rise by a further 85% over the next two years.
When choosing AI models, companies treat environmental impact as secondary and focus mainly on performance. More than three-quarters rank performance among their top five selection criteria, just ahead of scalability. Cybersecurity, cost, and efficiency also matter, with more than half placing them in the top five. In contrast, only 20% consider ecological impact important, even though 38% of executives say they are aware of the environmental consequences of using AI.
So far, 42% of decision-makers have had to rethink their company's environmental goals because of AI. Yet only about 10% of companies currently review the environmental impact of using artificial intelligence. Over 80% of executives plan to conduct such reviews in the future: about half within the next 12 months, the other half within 24 months.
Three out of four respondents point to a lack of transparency from AI vendors about ecological consequences. Two-thirds also see responsibility on their own side and acknowledge a lack of awareness at the executive level; they add that measuring environmental impact is difficult and that an emissions calculator for AI models would help. Nevertheless, a third of companies already plan to integrate sustainability measures into the AI lifecycle, and half are consciously opting for smaller language models and renewable energy, or plan to do so in the coming year.
AI is expected to double the energy consumption of data centers. A query to ChatGPT with GPT-4 consumes about ten times as much energy as a regular Google search. According to the Capgemini study, OpenAI's training of the language model required between 51 and 62 gigawatt-hours of electricity. For Google's machine learning workloads, the authors estimate that 60% of the energy consumption goes to inference and 40% to training.
Citing figures from the International Energy Agency, the researchers expect the global energy demand of data centers to more than double, from 460 terawatt-hours in 2022 to about 1,000 terawatt-hours by 2026. Water consumption by IT infrastructure in the Data Center Valley in Virginia, USA, rose by 69% between 2019 and 2023; roughly one liter of cooling water is needed per 40 to 100 queries.
For the study, Capgemini surveyed 2,000 executives at companies in 15 countries, each with annual revenue above one billion US dollars and each using artificial intelligence. Respondents were at director level or above, working in technology, innovation, or corporate functions such as finance, sales, or marketing.