Reducing Data Center Energy Consumption with Software Optimization Techniques

In the fall of 2024, researchers introduced Perseus, an open-source tool designed to cut the power consumption of training large language models such as OpenAI’s GPT by up to 30%. It achieves this through efficient software management of the GPUs used in AI training.

Recently, Canadian computer scientists discovered another, surprisingly simple way to significantly reduce the energy needs of data centers: optimizing how network packets are processed, as the University of Waterloo reports.

The way data centers currently process network packets is quite inefficient, according to the scientists: a minor change in the order in which tasks are executed could cut power consumption by up to 30%. Martin Karsten, a professor of computer science at the University of Waterloo, compares this to rearranging a production line to avoid unnecessary movements. By restructuring network packet processing in this way, Karsten and his team significantly improved CPU cache usage.
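The article does not spell out the kernel change itself, but the general principle of gaining cache efficiency purely by reordering work can be illustrated. The sketch below is a hypothetical, simplified analogy (the stage functions are stand-ins, not real kernel code): running one processing stage over a whole batch of packets before the next stage keeps each stage’s code and data hot in the cache, while producing exactly the same results as per-packet interleaving.

```python
def parse(pkt):
    # stand-in for header parsing
    return pkt * 2

def route(pkt):
    # stand-in for a routing decision
    return pkt + 1

def deliver(pkt):
    # stand-in for handing the packet to the application
    return pkt % 7

def per_packet(packets):
    # Interleaved order: each packet runs through all stages before the
    # next packet starts, so the CPU keeps switching between the code
    # and working data of different stages.
    return [deliver(route(parse(p))) for p in packets]

def batched(packets):
    # Reordered: one stage over the whole batch at a time, which keeps
    # that stage's instructions and state cache-resident. Same work,
    # same results -- only the execution order differs.
    stage1 = [parse(p) for p in packets]
    stage2 = [route(p) for p in stage1]
    return [deliver(p) for p in stage2]
```

Both functions are semantically identical; any speedup in a real system comes solely from better cache behavior, which mirrors the article’s point that only the order of task execution changed.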

This improvement requires changing only about 30 lines of Linux code. Linux runs in the data centers of all major tech companies, which is precisely why companies such as Amazon, Google, and Meta are cautious about modifying it. The researchers have nevertheless already succeeded in getting their adjustment merged into the latest Linux kernel (6.13). Now tech companies only need to activate the change, which could save several gigawatt-hours of energy worldwide.

Reducing the energy consumption of data centers would also lower the energy footprint of individual internet services, such as a Google search or a ChatGPT request. The energy demand of server farms is expected to rise sharply in the coming years, driven above all by advances in AI.

A forecast by Goldman Sachs from May 2024 predicts that the “AI revolution” will increase the electricity consumption of data centers by 160% by 2030. Data centers are expected to require over 1,000 terawatt-hours of energy annually, up from around 400 terawatt-hours currently.
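The two figures in the forecast line up with each other; a quick check, taking the roughly 400 terawatt-hours mentioned in the article as the baseline:

```python
current_twh = 400          # approximate annual consumption today (from the article)
growth = 1.60              # the forecast 160% increase by 2030
projected_twh = current_twh * (1 + growth)
print(projected_twh)       # 1040.0 -- consistent with "over 1,000 terawatt-hours"
```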

This development highlights the importance of finding efficient ways to manage energy consumption in data centers as the demand for AI and internet services grows. By implementing software optimizations and minor code changes, significant energy savings can be achieved, benefiting both the environment and operational costs.