According to researcher Shaolei Ren of the University of California, Riverside, ChatGPT and similar LLMs consume up to 500 millilitres (17 oz) of water for every 20 to 50 prompts or questions users ask. In a paper published on arXiv earlier this year, Ren pointed out that 500 ml might not seem like much, but with ChatGPT being used by people all over the world, the combined water consumption is enormous.
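To put the per-prompt figure in perspective, here is a rough back-of-envelope calculation in Python. Only the 500 ml per 20 to 50 prompts figure comes from the paper cited above; the daily query volume is a purely hypothetical number chosen for illustration, not a reported statistic.

```python
# Back-of-envelope estimate of aggregate water use, based on the figure cited
# above (roughly 500 ml per 20-50 prompts). The daily query volume is an
# illustrative assumption only, not a number reported by the paper.

ML_PER_BOTTLE = 500                 # millilitres per 20-50 prompts (paper's figure)
PROMPTS_LOW, PROMPTS_HIGH = 20, 50

ASSUMED_QUERIES_PER_DAY = 100_000_000   # hypothetical global volume, for illustration

# Per-prompt water use implied by the paper's range.
ml_per_prompt_high = ML_PER_BOTTLE / PROMPTS_LOW    # 25 ml per prompt
ml_per_prompt_low = ML_PER_BOTTLE / PROMPTS_HIGH    # 10 ml per prompt

litres_low = ASSUMED_QUERIES_PER_DAY * ml_per_prompt_low / 1000
litres_high = ASSUMED_QUERIES_PER_DAY * ml_per_prompt_high / 1000

print(f"Estimated daily water use: {litres_low:,.0f} - {litres_high:,.0f} litres")
# -> roughly 1,000,000 - 2,500,000 litres per day under these assumptions
```

Even with each prompt accounting for only a few teaspoons of water, the aggregate under these rough assumptions quickly runs into millions of litres per day, which is the scale the paper is pointing to.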
Talking to AP News, Ren was quoted as saying:
‘It’s fair to say the majority of the [water] growth [in Microsoft’s 2022 environmental report] is due to AI, [including] its heavy investment in generative AI and partnership with OpenAI. Most people are not aware of the resource usage underlying ChatGPT. If you’re not aware of the resource usage, then there’s no way that we can help conserve the resources.’
Microsoft acknowledged the issue when questioned by AP News, saying it is researching ways to measure AI's energy use and carbon footprint and looking for ways to make LLMs less energy-intensive.
The company told AP News:
‘We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power data centers, purchasing renewable energy, and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030.’
OpenAI has also acknowledged the water-use issue and said it is looking for ways to make LLMs more energy-efficient.
While the amount of water used to run LLMs is staggering, it is only fair to point out that these models are still relatively new, so methods to reduce their water and energy usage may well be developed over time.
For most people, their first interaction with generative AI was ChatGPT, which launched at the end of 2022. According to Gizmodo, ChatGPT's website has seen declining visits for three consecutive months.
With fewer people accessing ChatGPT, its water usage is presumably declining too.