Artificial intelligence (AI) and machine learning are no longer new concepts, but the discussion about AI’s energy consumption is more relevant than ever. AI’s rapid development and spread have the potential to fundamentally change many aspects of life and business. In the IT and software sectors, AI systems are increasingly used to handle complex tasks more efficiently and quickly. However, operating many modern AI models requires enormous computing power, which is not only costly but also leaves a significant ecological footprint.
On the other hand, AI offers promising opportunities that go far beyond mere efficiency improvements. Companies and research institutions are continuously working to make AI technologies more sustainable and to develop new applications that offer both economic and ecological benefits. For example, AI can contribute to optimizing energy systems, which not only reduces costs but also helps the environment.
This article explores the various aspects of sustainability in the context of AI. It examines both current challenges and negative aspects as well as potentials and positive developments. Finally, it shows how companies can implement sustainable processes through intelligent use of AI, thus making an important contribution to the energy transition.
Energy Consumption: What Does AI Cost Us?
An impressive example of AI systems’ energy consumption is the comparison between ChatGPT and traditional search engines like Google. According to a report, ChatGPT requires about 25 times more energy than Google’s search engine to perform similar tasks, such as answering simple questions.
This discrepancy is explained by the fundamental differences in technologies. While search engines like Google are based on relatively simple algorithms that rely on indexing and matching search queries with existing data, modern large language models (LLMs) like ChatGPT use highly complex neural networks. These networks, consisting of billions of parameters, require enormous computing power during training and execution.
Training a large language model such as OpenAI's GPT, on which ChatGPT is based, requires vast amounts of data and months of computation on specialized high-performance hardware, such as GPU or TPU clusters, which consume energy continuously throughout training. This energy demand can be managed to some extent by controlling how often training runs take place (model updates, internal tests, and so on).
Another crucial factor is inference, the phase in which the trained model generates responses or performs tasks. Significant computing power is also required here to perform complex calculations with minimal latency. The larger the model, the more energy it generally needs. Since ChatGPT is based on advanced language models capable of understanding and reproducing context and nuances in human language, the energy expenditure is correspondingly high. This expenditure is incurred every time a query is sent to the system. Input length is a decisive factor: the context window defines the maximum amount of information a prompt can contain, and longer inputs mean more computation per query.
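The scaling just described can be sketched numerically. The following is a deliberately simplified model, assuming (hypothetically) that inference cost grows roughly linearly with parameter count and with the number of tokens processed; the constants are invented placeholders, not measured values for any real system.

```python
# Illustrative sketch only: per-query inference energy as a linear function of
# model size and token count. The energy constant is an invented placeholder.

def inference_energy_wh(num_params_b: float, prompt_tokens: int,
                        output_tokens: int,
                        joules_per_b_params_per_token: float = 0.002) -> float:
    """Rough estimate: energy grows with parameter count and with the
    total number of tokens processed (prompt plus generated output)."""
    total_tokens = prompt_tokens + output_tokens
    joules = num_params_b * total_tokens * joules_per_b_params_per_token
    return joules / 3600.0  # joules -> watt-hours

# A longer prompt against the same (hypothetical 70B-parameter) model costs
# proportionally more energy per query.
short_query = inference_energy_wh(num_params_b=70, prompt_tokens=50, output_tokens=200)
long_query = inference_energy_wh(num_params_b=70, prompt_tokens=4000, output_tokens=200)
print(round(long_query / short_query, 1))  # -> 16.8
```

The point of the toy model is only the proportionality: filling a large context window multiplies the work done per query, which is why input length matters for energy consumption.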
The high energy consumption in operating LLMs has both ecological and economic implications. Companies operating such AI models must make substantial investments in infrastructure to provide and maintain the necessary computing power. Additionally, high electricity consumption leads to ongoing operating costs that can rise significantly over time. This makes cloud-hosted AI models particularly attractive for SMEs and individuals, since no in-house hardware is required.
Operating Costs
The operating costs for a system like ChatGPT are immense. Estimates put the cost of running ChatGPT at several million dollars per month, made up of various components including hardware, power supply, and maintenance. According to reports, this could contribute to a loss of nearly 5 billion US dollars for OpenAI this year.
A large part of the operating costs for LLMs is attributed to the operation of the hardware infrastructure in data centers, which not only house the physical servers but also require comprehensive cooling and power supply systems to ensure continuous operation. In addition to energy costs, there are other operating costs, including maintenance and management of the network infrastructure.
Energy costs vary depending on location and energy source but are substantial in any case. Particularly in regions with high electricity prices, energy costs can make up a large portion of total expenses. This consumption will likely decrease per server over the next few years as new systems become more energy-efficient and require less power for the same performance (performance per watt).
However, this applies not only to the use of AI but to all cloud systems. Critics argue that this is the core problem and that the cloud trend itself leads to enormous energy consumption and costs. Yet central data centers, which can be optimized in construction and operation (sometimes with their own solar parks), are far more energy-efficient per rack unit than building and operating one's own physical data center infrastructure in a small or medium-sized enterprise.
Potential of AI for Energy Saving in Companies
AI techniques can significantly contribute to energy savings in companies by optimizing processes and operations. An example is the intelligent control of energy consumers in production facilities and offices. By analyzing data on energy consumption and operations, AI can make predictions and provide recommendations for optimizing energy use. This can be achieved, for example, by adjusting operating times, automating energy consumers, or optimizing heating, ventilation, and air conditioning systems.
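As a minimal sketch of the prediction step mentioned above, the next hour's consumption could be extrapolated from recent meter readings with a least-squares trend line. Real systems use far richer models and many more inputs; the readings below are invented example values.

```python
# Toy forecast: fit a straight line y = a*t + b to recent hourly meter
# readings and extrapolate one step ahead. Illustrative only.

def forecast_next(readings: list[float]) -> float:
    """Least-squares linear fit over the readings, evaluated one step ahead."""
    n = len(readings)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(readings) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, readings))
    var = sum((t - mean_t) ** 2 for t in ts)
    a = cov / var          # slope: change in consumption per hour
    b = mean_y - a * mean_t
    return a * n + b       # prediction for the next (n-th) hour

kwh = [10.0, 11.0, 12.0, 13.0]  # invented: steadily rising hourly consumption
print(forecast_next(kwh))  # -> 14.0
```

A forecast like this is what lets a controller act in advance, for example shifting flexible loads away from a predicted consumption peak.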
A practical example is the use of AI in the manufacturing industry. AI systems can monitor and adjust production processes in real time to minimize energy consumption. By integrating sensors and continuously analyzing data, inefficient processes can be identified and optimized, leading to a significant reduction in energy consumption and operating costs.
Another application example is using AI in building management. Intelligent building control systems based on AI can significantly reduce a building’s energy consumption by optimizing lighting, heating, and air conditioning. These systems learn from the usage habits of residents through sensors and adjust the control of energy consumers accordingly to maximize comfort and minimize energy consumption.
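A single rule that such a self-learning controller might converge on can be sketched as follows. The sensor input, setpoints, and idle threshold are invented for illustration and are not taken from any real building system.

```python
# Illustrative building-control rule: lower the heating setpoint in rooms
# whose occupancy sensors have been quiet. All values are invented examples.

def heating_setpoint(minutes_since_motion: float,
                     occupied_setpoint_c: float = 21.0,
                     setback_setpoint_c: float = 17.0,
                     setback_after_min: float = 30.0) -> float:
    """Return the target room temperature based on recent motion-sensor data."""
    if minutes_since_motion >= setback_after_min:
        return setback_setpoint_c  # room looks unoccupied: save energy
    return occupied_setpoint_c     # recently occupied: keep it comfortable

print(heating_setpoint(5))   # -> 21.0 (recently occupied)
print(heating_setpoint(90))  # -> 17.0 (long idle, setback active)
```

In a real system, the thresholds and setpoints themselves would be what the AI learns from usage patterns, rather than being fixed constants.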
The systems mentioned here require less data than a general-purpose language model, for example. The models can therefore be operated relatively cost-effectively and energy-efficiently. Depending on their type and size, they can already run well on standard mid-range notebooks or desktop computers today.
Use of AI for Optimizing Power Grids
A particularly promising area where AI can contribute to sustainability is optimizing power grids. Integrating renewable energy sources like wind and solar energy poses new challenges for the power grid, as these energy sources are weather-dependent and therefore less predictable. AI can help improve the power grid’s stability and efficiency.
AI is already successfully used to support the integration of renewable energies and increase the power grid’s efficiency. AI systems can analyze large amounts of data in real-time to make predictions about energy generation and consumption. These predictions enable more efficient control of the power grid and better planning of energy distribution. For example, excess energy generated during high production from renewable sources can be stored in battery storage and fed back into the grid during high demand.
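The storage logic described above can be sketched as a simple dispatch rule, assuming a one-hour time step (so MW and MWh coincide numerically) and invented capacity and forecast values; real grid operators use far more sophisticated optimization.

```python
# Toy battery dispatch: charge on renewable surplus, discharge on deficit.
# Capacity and forecast figures are invented example values.

def dispatch(generation_mw: float, demand_mw: float,
             soc_mwh: float, capacity_mwh: float) -> tuple[str, float]:
    """Decide a charge/discharge action and return the new state of charge."""
    surplus = generation_mw - demand_mw  # one-hour step, so MW ~ MWh
    if surplus > 0:
        stored = min(surplus, capacity_mwh - soc_mwh)  # respect capacity limit
        return "charge", soc_mwh + stored
    drawn = min(-surplus, soc_mwh)                     # cannot go below empty
    return "discharge", soc_mwh - drawn

# Midday solar surplus fills the battery; the evening peak drains it again.
action, soc = dispatch(generation_mw=120, demand_mw=80, soc_mwh=10, capacity_mwh=40)
print(action, soc)  # -> charge 40
action, soc = dispatch(generation_mw=30, demand_mw=70, soc_mwh=soc, capacity_mwh=40)
print(action, soc)  # -> discharge 0
```

The forecasts from the AI models feed into exactly this kind of decision: the better the prediction of generation and demand, the more of the surplus can actually be captured and reused.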
Furthermore, AI can help prevent power grid failures by detecting anomalies and potential problems early. Continuous monitoring of the power grid makes it possible to identify weaknesses and take preventive measures to ensure the network’s reliability and stability.
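A drastically simplified version of such an anomaly check compares each new reading against the mean and standard deviation of a recent baseline window (a z-score rule). The frequency readings and the 3-sigma threshold below are illustrative assumptions, not real grid data.

```python
# Toy anomaly detector: flag a reading that deviates strongly from the
# recent baseline. Readings and threshold are invented examples.

def is_anomaly(baseline: list[float], reading: float, threshold: float = 3.0) -> bool:
    """Flag a reading more than `threshold` standard deviations from the
    mean of the baseline window."""
    n = len(baseline)
    mean = sum(baseline) / n
    std = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return std > 0 and abs(reading - mean) / std > threshold

# Grid frequency normally hovers tightly around 50 Hz; a sharp dip stands out.
baseline_hz = [50.01, 49.99, 50.00, 50.02, 49.98, 50.00, 50.01, 49.99]
print(is_anomaly(baseline_hz, 50.00))  # -> False
print(is_anomaly(baseline_hz, 49.70))  # -> True
```

Production systems learn far subtler patterns across many signals at once, but the principle is the same: model "normal" behavior and alert on deviations before they escalate into failures.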
Using AI to Save
Using AI for energy saving can offer companies several advantages. First, optimizing operations and processes can achieve significant cost savings. Energy is one of the largest operating expenses in many industries, and even small improvements in energy efficiency can lead to substantial financial savings. The figures vary widely and realistically range from 9 to 20 percent in office, clinic, or factory operations; independent, reliable values are currently being determined through various research projects. Companies that invest in AI-based energy management systems can reduce their operating costs and increase their competitiveness.
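Applied to an invented example, the 9 to 20 percent range translates into concrete figures; the annual electricity bill below is hypothetical.

```python
# Back-of-envelope calculation for the 9-20 percent savings range mentioned
# above, applied to a hypothetical annual electricity bill of EUR 500,000.
annual_bill_eur = 500_000
low_rate, high_rate = 0.09, 0.20  # savings range from the text

low_savings = annual_bill_eur * low_rate
high_savings = annual_bill_eur * high_rate
print(f"{low_savings:,.0f} - {high_savings:,.0f} EUR/year")  # -> 45,000 - 100,000 EUR/year
```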
Second, reducing energy consumption also improves a company’s environmental balance. Lower electricity demand means fewer fossil fuels are burned in the energy mix, reducing CO2 emissions. However, it is important to carefully weigh costs and benefits.