French company Schneider Electric estimates that AI workloads will consume around 4.3 GW of power this year, slightly less than the total electricity demand of a country like Cyprus (4.7 GW in 2021). According to Schneider, this consumption will only increase: the company expects an average annual growth rate of between 26 and 36 percent.
Schneider Electric backs this up with numbers in a paper. Total power consumption in data centers will be no less than 54 GW in 2023, Tom’s Hardware reports, with AI workloads accounting for 4.3 GW of that. Twenty percent of this AI power consumption goes into training the models; the remaining eighty percent goes into inference tasks, such as answering queries in ChatGPT or other models. All AI-related tasks combined will account for roughly eight percent of total data center power consumption this year.
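The eight percent figure follows directly from the numbers above; a quick sanity check (the rounding is ours):

```python
# Figures from Schneider Electric's paper, as reported above
total_dc_power_gw = 54.0   # total data center power consumption, 2023
ai_power_gw = 4.3          # of which AI workloads

ai_share = ai_power_gw / total_dc_power_gw
print(f"AI share of data center power: {ai_share:.1%}")  # ~8.0%

# Split between training and inference
print(f"training:  {0.20 * ai_power_gw:.2f} GW")  # 0.86 GW
print(f"inference: {0.80 * ai_power_gw:.2f} GW")  # 3.44 GW
```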
Rising demand
According to Schneider Electric, this demand will only increase in the coming years. In the paper, the company predicts that the total electricity consumption of data centers will be around 90 GW by 2028, of which AI workloads will consume between 13.5 and 20 GW. A small calculation shows that AI’s share of consumption will grow to around twenty percent. That is significant growth, accompanied by a small shift within AI usage itself: Schneider assumes that by 2028, 85 percent of AI electricity demand will be accounted for by inference tasks.
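The 2028 range is consistent with the quoted 26 to 36 percent annual growth rate; a small compound-growth sketch (the five-year horizon and rounding are our assumptions):

```python
# Project the 2023 AI load forward at the quoted annual growth rates
ai_2023_gw = 4.3
years = 5  # 2023 -> 2028

low = ai_2023_gw * 1.26 ** years   # 26% annual growth
high = ai_2023_gw * 1.36 ** years  # 36% annual growth
print(f"2028 AI load: {low:.1f} - {high:.1f} GW")  # ~13.7 - 20.0 GW

# Share of the predicted ~90 GW total data center demand
print(f"AI share in 2028: {low / 90:.0%} - {high / 90:.0%}")  # ~15% - 22%
```

The low and high ends of the projection land almost exactly on the 13.5 to 20 GW range from the paper, and the resulting share brackets the twenty percent mentioned above.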
This increased power demand is largely due to advances in AI GPUs and AI processors, as well as the increasing demands of other hardware in data centers. The size of AI clusters, driven by the complexity and size of AI models, is of course also an important factor in power consumption: larger models require a larger number of GPUs. These clusters and their GPUs often run at nearly full capacity continuously during training, which means average energy consumption almost equals peak energy consumption. According to Schneider’s paper, network latency also plays a role: the lower the required network latency, the more energy intensive the infrastructure.
Cooling is a big challenge
Schneider Electric also points out the other side of the coin: more power in data centers inevitably increases the need for cooling. Effective cooling solutions are essential to maintaining optimal performance and preventing disruptions. At the same time, air and liquid cooling systems are themselves “expensive” in terms of power consumption, which further increases the overall energy requirements of data centers. This is precisely why Schneider does not expect electricity demand to decrease in the near future.
The French company makes several recommendations in the document. For example, Schneider Electric recommends switching to 240/415V distribution instead of the usual 120/208V, which better accommodates the high power densities of AI workloads. When it comes to cooling, Schneider recommends switching from air cooling to liquid alternatives to improve processor reliability and energy efficiency. According to the company, immersing hardware in special dielectric liquids produces even better results.
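The reasoning behind the voltage recommendation can be illustrated with basic electrical arithmetic: at the same conductor current, a higher distribution voltage delivers proportionally more power per circuit. A minimal sketch (the 30 A branch circuit is our illustrative assumption, not a figure from Schneider's paper):

```python
import math

# Three-phase power: P = sqrt(3) * V_line * I
current_a = 30  # illustrative branch-circuit current (our assumption)
for v_line in (208, 415):
    p_kw = math.sqrt(3) * v_line * current_a / 1000
    print(f"{v_line} V at {current_a} A: {p_kw:.1f} kW")  # 10.8 kW vs 21.6 kW
```

Roughly double the power reaches the rack over the same conductors, which is why higher-voltage distribution suits the dense power draw of AI clusters.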