Have you ever imagined how much water your daily ChatGPT chats consume? I recently came across a surprising figure shared by Sam Altman, the CEO of OpenAI: each interaction with ChatGPT uses roughly 1/15 of a teaspoon of water. Whether it’s asking for math help, swapping lemon for lime in a recipe, or drafting an email, every little question you type represents a tiny drop of water being used behind the scenes. Considering that OpenAI claims about a billion messages are sent to ChatGPT every day, and that ChatGPT is just one AI among many like Gemini, DeepSeek, and Claude, it becomes clear that the AI revolution is incredibly thirsty.
It’s estimated that sending 10 to 50 AI queries may consume around 500ml of water — the same as a standard bottle of water.
But here’s the catch — some experts remain skeptical of Altman’s estimate. The actual water footprint depends heavily on the size of the AI model and the methods used to cool the enormous data centers that power these systems. For medium-to-large models roughly the size of GPT-3, 10 to 50 queries can translate to about half a liter of water, factoring in everything from electricity generation to cooling. OpenAI has declined to share more detailed data, but what’s really fascinating is why this water consumption happens in the first place.
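To see how far apart these two estimates really are, here is a quick back-of-the-envelope check. The only figures taken from the article are Altman’s 1/15 of a teaspoon per query, the independent ~500 ml per 10 to 50 queries, and OpenAI’s claim of about a billion messages a day; the conversion of a US teaspoon to about 4.93 ml is an outside assumption.

```python
# Back-of-the-envelope comparison of the two per-query water estimates.
# Assumption: 1 US teaspoon ≈ 4.93 ml (not stated in the article).
TEASPOON_ML = 4.93

altman_per_query_ml = TEASPOON_ML / 15      # Altman's figure: ~0.33 ml per query
independent_low_ml = 500 / 50               # 500 ml spread over 50 queries -> 10 ml
independent_high_ml = 500 / 10              # 500 ml spread over 10 queries -> 50 ml

# At OpenAI's claimed ~1 billion messages per day, Altman's figure implies:
daily_litres_altman = altman_per_query_ml * 1e9 / 1000

print(f"Altman's estimate:      {altman_per_query_ml:.2f} ml/query")
print(f"Independent estimate:   {independent_low_ml:.0f}-{independent_high_ml:.0f} ml/query")
print(f"Daily total at 1B queries (Altman): {daily_litres_altman:,.0f} litres")
```

Even taking Altman’s far smaller number at face value, a billion daily queries adds up to hundreds of thousands of litres every day, and the independent GPT-3-scale estimate is 30 to 150 times higher per query.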
Why AI needs water: The hidden thirst of data centers
At first glance, AI might seem purely digital — just lines of code answering questions. But behind every prompt lies massive computational work happening in specialized supercomputers tucked away in gigantic data centers around the world. These machines run intense calculations to understand and respond accurately, which generates a lot of heat. To keep them from overheating, cooling systems become vital.
Traditional air cooling methods worked for earlier, less powerful servers. But with today’s AI hardware, the heat output is so substantial that liquid cooling techniques have become the norm. Here’s how it usually works: a coolant liquid flows over the processors, absorbing heat, which is then transferred to water in heat-exchange units. The water cools the coolant, but in the process, up to 80% of this water evaporates and is lost. This evaporated water has to be replenished from clean, often potable, water sources to prevent corrosion and bacteria — meaning precious drinking water is consumed in this cycle.
This water is not just a number — it matters because it comes from local ecosystems and water supplies used for irrigation, drinking, and sanitation. Around the world, communities in places like Spain, India, Chile, Uruguay, and the US have protested the strain that data centers place on water resources and electricity grids.
The bigger picture: Water in the AI supply chain and energy production
What’s even more eye-opening is that water’s role in AI is not limited to just data center cooling. The electricity powering these data hubs often comes from coal, gas, or nuclear sources, which themselves consume vast amounts of water to generate steam and run turbines. The International Energy Agency predicts that electricity demand for AI-optimized data centers could increase by 400% by 2030, potentially consuming as much electricity as the entire UK does annually.
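A quick note on what “increase by 400%” actually means, since it is easy to misread: it is a fivefold level, not a fourfold one. The baseline figure below is a hypothetical placeholder purely for illustration; only the 400% growth rate comes from the IEA projection cited above.

```python
# Illustrating "increase by 400%": the new level is baseline * (1 + 4.0),
# i.e. five times the starting point, not four.
# Assumption: the baseline value is a made-up placeholder, NOT an IEA figure.
baseline_twh = 100.0          # hypothetical AI data-center demand (TWh/year)
growth = 4.0                  # "+400%" expressed as a fraction of the baseline

projected_twh = baseline_twh * (1 + growth)
print(projected_twh)  # 500.0 -- fivefold the assumed baseline
```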
Water is also essential upstream — in manufacturing the semiconductor chips that run AI models. Extracting and refining the raw materials, fabricating chips, and logistics are water-intensive processes. So, from creation to daily use, AI’s water footprint spreads across a complex supply chain.
While Google, Meta, and Microsoft report billions of liters of water usage at their data centers, none break down exactly how much is attributable specifically to AI workloads. However, many tech giants are acknowledging the issue and have pledged to become water positive by 2030. There’s still a long road ahead.
Innovations and hopes for a less thirsty AI future
It’s encouraging to see the industry exploring ways to cut back on water use. Some companies are trialing cooling technologies that don’t rely on evaporating water at all, and others are trying to repurpose the heat generated by servers to warm homes. Imagine putting AI’s byproduct heat to good use!
There are even avant-garde ideas like moving data centers underwater, to the Arctic, or someday off-planet entirely — perhaps into space on satellites performing backup or less intensive tasks. While these notions remain experimental and face significant hurdles, they show an eagerness to innovate beyond traditional constraints.
AI’s capabilities have exploded over a very short time, but the technology itself is still in its infancy. This means we have a unique window to rethink and build AI systems that prioritize sustainability. As a global society, the challenge will be to balance our thirst for powerful AI tools with the planet’s finite water and energy resources.
Minimizing water and energy use in AI is more than a tech challenge — it’s a critical step towards sustainable innovation that benefits everyone.
So next time you send a quick prompt to ChatGPT or an AI assistant, remember: that little click carries a surprising environmental cost. But it also carries potential — a chance to push for smarter, greener AI that serves us all without draining our planet’s lifeblood.