The poor are concerned about AI’s “unquenchable thirst” for energy

  • April 10, 2024

Large language models (LLMs) consume a lot of power. According to Arm, data centers will account for a quarter of all energy consumption in the United States by 2030.

The high energy consumption of AI has long concerned experts inside and outside the tech industry. The latest to sound the alarm is Rene Haas, CEO of Arm. In the Wall Street Journal, Haas speaks of the “unquenchable thirst” of AI, and of LLMs in particular.

Unquenchable thirst

According to Haas, AI data centers now account for four percent of total U.S. energy consumption. That share could rise to 20 to 25 percent by 2030, the CEO says. Training generative AI models is a very energy-intensive process, but it is primarily everyday use, including inference, that consumes energy in data centers. You can also take AI’s “thirst” literally: in addition to a lot of electricity, a lot of water is needed to keep AI systems active around the clock.
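To get a feel for what Haas’s forecast implies, here is a back-of-envelope sketch of the compound annual growth rate that would take AI data centers from a 4 percent share today to a 25 percent share by 2030. The percentages come from the article; treating the total U.S. consumption as roughly constant over six years is a simplifying assumption for illustration only.

```python
def implied_annual_growth(share_now: float, share_then: float, years: int) -> float:
    """Compound annual growth rate of the data center share,
    assuming total energy consumption stays roughly constant."""
    return (share_then / share_now) ** (1 / years) - 1

# 4% today -> 25% in roughly six years (by 2030)
rate = implied_annual_growth(0.04, 0.25, years=6)
print(f"Implied growth: {rate:.1%} per year")  # roughly 36% per year
```

In other words, the forecast assumes AI data center demand growing by more than a third every year, which is why operators and grid planners are taking it seriously.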

Haas is not alone in his pessimistic forecast. The International Energy Agency (IEA) estimates that data center energy consumption has increased tenfold since 2022 due to the boom in AI applications. LLMs like GPT use up to ten times more energy than classic algorithms such as those behind Google Search. In smaller countries such as Ireland, data centers could require up to a third of the total energy supply.
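The “up to ten times” comparison can be made concrete with per-query numbers. The absolute figures below are assumptions for illustration, based on often-cited estimates of roughly 0.3 Wh for a classic search query and around 3 Wh for an LLM query; only the ratio is the article’s claim.

```python
# Assumed per-query energy figures (illustrative, not from the article)
search_wh = 0.3  # assumed energy per classic search query, in watt-hours
llm_wh = 3.0     # assumed energy per LLM query, in watt-hours

ratio = llm_wh / search_wh
print(f"An LLM query uses about {ratio:.0f}x the energy of a search query")
```

At the scale of billions of queries per day, even a small per-query difference compounds into the data-center-level consumption the IEA is describing.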

Enter the data center

To meet the increasing digital needs of businesses and consumers, the number of data centers will continue to increase in the coming years. The key to sustainable AI seems to lie more than ever in the data center. Data center operators must look for ways to use energy more efficiently and, if possible, rely on sustainable energy sources.

But developers of AI systems also hold a key. It is gradually becoming clear that the trend of ever-larger LLMs with ever more parameters is no longer sustainable. A switch to smaller, more efficient models is necessary.

Source: IT Daily
