Artificial intelligence (AI) is evolving rapidly, and that evolution is increasing the pressure on large technology companies to address their environmental and climate impacts. The reason is simple: AI requires enormous amounts of energy and water.
Tech giants like Amazon, Google and Microsoft are promising to help tackle the climate crisis, but experts say the sector is not doing enough to curb rising electricity and water consumption. The International Energy Agency (IEA) predicts that electricity consumption from data centers, AI and cryptocurrencies will double by 2026.
Why does AI require more and more energy?
Developing AI requires ever more computing power, and demand is further increased by new consumer products. In February 2023, Google announced the AI tool Bard, which aims to reach billions of users. And Microsoft is adding a button to all future Windows keyboards for its AI tool Copilot. “The power needs of search engines like Google could increase tenfold once AI is fully implemented in them,” the IEA report said.
According to the IEA, Amazon, Microsoft, Google and Meta more than doubled their combined energy consumption between 2017 and 2021, to around 72 terawatt-hours (TWh) in 2021. That is roughly a quarter of the total energy consumption of a country like the United Kingdom. Google’s so-called Scope 2 greenhouse gas emissions – the emissions from purchased electricity and heat – increased by 37 percent in 2022 compared to the previous year. According to a 2020 study by Lancaster University, the global ICT sector accounts for between 2 and 4 percent of total annual carbon emissions.
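The UK comparison can be sanity-checked with back-of-envelope arithmetic. A minimal sketch — note that the UK figure below is an outside assumption (annual UK electricity consumption of roughly 290 TWh), not a number from the article:

```python
# Back-of-envelope check of the IEA comparison above.
big_four_2021_twh = 72    # Amazon, Microsoft, Google, Meta combined, 2021 (IEA)
uk_annual_twh = 290       # ASSUMPTION: approximate UK annual electricity use

share = big_four_2021_twh / uk_annual_twh
print(f"{share:.0%} of UK consumption")  # roughly a quarter
```

The result comes out near 25 percent, consistent with the article’s “around a quarter” claim under that assumption.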
“Any way you look at it, if the sector wants to meet the broader climate commitments of the Paris Agreement, it needs to change course and reduce emissions,” writes Anne Pasek, a technology and climate researcher at Trent University in Canada. “This requires a change in standards and habits, both for consumers and industry players.”
How much water does Big Tech use?
The technology sector’s development of AI is also driving a huge increase in water consumption. Training a model involves feeding massive amounts of data into algorithms known as large language models (LLMs), which require enormous computing power and therefore very powerful hardware. Google’s latest environmental report states that the company used around 21 billion liters of water in 2022 – 4.9 billion liters more than in 2021 and 8.3 billion liters more than in 2020.
Google’s water consumption in 2022 is comparable to what a city like London uses in ten days. Microsoft used almost 6.4 billion liters of water for its operations in 2022, an increase of about 34 percent compared to 2021. Training GPT-3 alone – the language model behind OpenAI’s ChatGPT – consumed 700,000 liters of clean fresh water in Microsoft’s US data centers, according to studies by the University of California. “For a simple conversation with about twenty to fifty questions and answers, ChatGPT needs to ‘drink’ a half-liter bottle of water, depending on when and where ChatGPT is used,” the researchers wrote. “These numbers are likely to multiply for the recently introduced GPT-4, which has a significantly larger model.”
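The per-question figure implied by the researchers’ numbers is easy to work out. A minimal sketch using only the values quoted above (a half-liter bottle per conversation of twenty to fifty questions):

```python
# Implied water use per question, from the UC researchers' figures above.
bottle_l = 0.5        # liters per conversation (quoted in the study)
low_q, high_q = 20, 50  # questions per conversation (quoted range)

per_q_high_ml = bottle_l / low_q * 1000   # fewest questions -> most water each
per_q_low_ml = bottle_l / high_q * 1000   # most questions -> least water each
print(f"{per_q_low_ml:.0f}-{per_q_high_ml:.0f} mL per question")
```

That works out to roughly 10 to 25 milliliters per question, on the study’s own assumptions.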
Stocks are dwindling
The tech industry’s intensive water use comes at a time when global demand is rising and supplies are shrinking. The United Nations has predicted that water demand will exceed supply by about 40 percent by 2030. It estimates that the number of people living in water-stressed cities will increase from 930 million in 2016 to between 1.7 and 2.4 billion by 2050. Google plans to build a data center in the Uruguayan capital Montevideo, but there are concerns about its impact on water use in a country already suffering its worst drought in 74 years.
A spokesperson for the search giant said the “data center project is still in the exploratory phase and the engineering department is actively working on a way to manage this with the support of national and local authorities.”
Possible solutions
AI’s energy consumption could decline once companies finish experimenting with new tools like ChatGPT, says Ayse Coskun, a computer engineer at Boston University. Companies can then determine which tasks require complex models and where simpler models suffice. “People are asking themselves: do I really need a sledgehammer to drive a small nail?” Efficiency improvements and regulation will also be crucial to reducing AI’s global energy consumption, experts say. The European Commission, for example, requires data centers to report their energy consumption and emissions.
China requires all government organizations working in technology to switch fully to renewable energy by 2032. The US Department of Energy, for its part, is funding the development of more efficient semiconductors, key components of computers. Still, more radical approaches may be needed to ensure tech companies grow in line with climate goals, experts say. “We must no longer view energy efficiency and a lower carbon footprint as ‘added value’ but as a critical part of any computing system, especially for large data centers,” says Coskun. “Companies need to be held accountable, and we need more innovation and more transparent reporting to optimize the entire energy system.”