
How Much Water Does ChatGPT Really Use? Sam Altman Reveals the Truth
Surprising Insight: One ChatGPT Query Uses Just a Few Drops of Water!
OpenAI CEO Sam Altman recently shared a figure on ChatGPT’s environmental footprint that could change how we think about AI’s resource usage.
According to Altman, each ChatGPT query consumes about one-fifteenth of a teaspoon of water, roughly 0.000085 gallons, or just a few drops per interaction. While this might sound negligible, it adds an interesting layer to the growing discussion around the sustainability of artificial intelligence.
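For readers who want to check the unit conversion themselves, here is a minimal sketch. It assumes US customary measures (768 US teaspoons per US gallon), which Altman’s post does not specify.

```python
# Sanity check: is 1/15 of a teaspoon really about 0.000085 gallons?
# Assumes US customary units: 128 fl oz per gallon * 6 tsp per fl oz = 768 tsp/gal.
TEASPOONS_PER_GALLON = 768

water_per_query_tsp = 1 / 15
water_per_query_gal = water_per_query_tsp / TEASPOONS_PER_GALLON

print(f"{water_per_query_gal:.6f} gallons per query")  # ~0.000087, close to the quoted 0.000085
```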
In his latest blog update, Altman also noted that the energy usage per query is about 0.34 watt-hours, which is roughly equivalent to running an oven for a second or a high-efficiency bulb for a couple of minutes.
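The energy comparison can be sanity-checked the same way. The 1,200 W oven and 10 W LED bulb ratings below are illustrative assumptions, not figures from Altman’s post.

```python
# Rough check of the 0.34 watt-hour per-query figure against everyday appliances.
# Wattages are typical illustrative values, not from the source.
QUERY_WH = 0.34

oven_watts = 1200   # a typical electric oven element
bulb_watts = 10     # a typical high-efficiency LED bulb

oven_seconds = QUERY_WH / oven_watts * 3600   # hours -> seconds
bulb_minutes = QUERY_WH / bulb_watts * 60     # hours -> minutes

print(f"Oven: {oven_seconds:.1f} s, LED bulb: {bulb_minutes:.1f} min")
# ~1.0 s of oven time, ~2.0 min of bulb time, matching the comparison above.
```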
AI models like GPT-4 rely on large-scale data centers that require constant cooling, often using both electricity and water. Past reports, like one from The Washington Post, suggested that generating even a short response could consume more water than expected. The exact resource consumption depends on the data center’s location and cooling methods.
Altman’s latest claim, although lacking detailed methodology, appears aimed at easing concerns over the ecological impact of widespread AI use. With some experts warning that AI’s energy demands could outpace Bitcoin mining by 2025, the debate over AI sustainability is heating up.
⚡ Altman also shared a long-term vision: the cost of intelligence may eventually drop close to the cost of electricity, making AI more affordable and efficient for everyone.
So, while one question to ChatGPT may use only a tiny sip of water, the bigger challenge is ensuring AI remains sustainable as it scales globally.