Professionals from various industries are expressing concern about artificial intelligence (AI) systems and their water consumption, a worry that seems increasingly valid, especially given Sam Altman’s recent statements dismissing concerns about data center water usage as fake.1 Research on AI and water use is relatively new, and different sources offer different estimates of how much water a single query needs. Quantifying the water used for an AI query is challenging, and providing a nice, round number of milliliters per ChatGPT query may be an impossible task.
What is clear is that Altman’s statements about resource use lack nuance, and he is not alone in this when discussing AI systems and their water requirements. Subtlety and clarity of language are imperative when evaluating the environmental impacts of an AI system, and the implications of technology leaders lacking this nuance in conversations about AI resource consumption can be devastating to both the environment and organizations. Fortunately, there are several ways practitioners can get to the facts when credible sources present conflicting information.
Varying Estimates on Water Usage
While Altman says claims that ChatGPT uses gallons of water for every query are “completely untrue, totally insane,”2 an often-cited statistic is that AI systems use approximately 500 milliliters of water for each conversation with ChatGPT.3 However, the problem with both of these claims rests in their vagueness: What is within the scope of evaluating resource consumption? The entire life cycle, from building a data center and fabricating chips through operation, will obviously appear more resource intensive than a single query viewed in isolation. To be fair, no query can exist without all of the resources it takes to get the AI system up and running, but factoring in this consumption inflates the amount of water AI systems appear to use per query.
Much of the criticism of AI’s environmental impact revolves around water usage. Water is involved in AI systems in several ways:4
- Evaporative cooling — Water may be used in a cooling tower at a data center to help dissipate heat. Some of the water will evaporate, but remaining water may be recycled a few times, depending on water quality.
- Off-site water usage — This type of water consumption does not address the data center’s water use but rather the water needed by the power plants to supply electricity to data centers. The amount of water power plants need for each kWh may vary.
- AI chip and server manufacturing — This type of manufacturing requires ultrapure water, and the water discharged from this process might contain toxic chemicals or hazardous waste. The recycling rate for this use is limited, and producing a single liter of ultrapure water can consume up to 4 liters of freshwater.5
When people talk about the amount of water that AI queries require, what scope are they considering? Is it just the water the data center uses? Is it also factoring in off-site water needs? Or are these questions considering chip and server manufacturing as well?
Additionally, different generative AI outputs may require different amounts of energy, so determining the energy required for a query necessitates considering the output. One study found that video generation is 30 times more costly than image generation and 2,000 times more costly than text generation in terms of energy spent.6 Any estimate of resource consumption must differentiate between output types because energy demands vary.
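The cited multipliers can be combined with simple arithmetic to see how far apart the output types sit. A minimal sketch follows; the 30x and 2,000x ratios come from the study cited above, while the absolute baseline energy per text response is a hypothetical placeholder for illustration only, not a figure from this article:

```python
# Relative energy multipliers from the cited study: a video is ~2,000x
# a text response and ~30x an image. The baseline below is a made-up
# placeholder value, used only to turn ratios into comparable numbers.
TEXT_WH = 0.3  # hypothetical energy per text response, in watt-hours

video_wh = TEXT_WH * 2000  # video is ~2,000x text
image_wh = video_wh / 30   # video is ~30x an image, so image = video / 30

print(f"text:  {TEXT_WH:.1f} Wh")
print(f"image: {image_wh:.1f} Wh")
print(f"video: {video_wh:.1f} Wh")
```

Whatever baseline is assumed, the ratios imply that an image costs roughly 65–70 times a text response, which is why estimates that ignore output type are of limited use.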
There is not necessarily a “correct” way to frame conversations about AI resource usage. An argument can be made for considering any combination of the 3 water uses outlined previously, but anyone describing AI resource consumption must specify the parameters by which they are evaluating it. Not doing so leaves room for statistics to be twisted to suit a particular agenda. This applies equally to arguments that AI does not consume much water and to arguments that AI consumes unfathomable amounts of water.
The Risk of Lacking Nuance
Leaders at frontier AI enterprises must have nuanced conversations about AI, including about AI systems’ environmental and other disruptive impacts. AI provider executives who speak in absolutes and broad statements, such as calling concerns about water usage insane, do a disservice to valid and complex worries about the environmental impact of AI.
Context is necessary when evaluating the resources data centers require, as it can affect the impact their resource consumption has. This context ought to inform data center locations, as additional water consumption could disproportionately harm regions already affected by water scarcity. A Bloomberg News report found that, in the United States, approximately two-thirds of new data centers built or in development since 2022 are located in areas already affected by high levels of water stress.7 Globally, approximately 40% of the world’s data centers are in areas of high or extremely high water stress.8
Many data centers are built in deserts because the dry climate may minimize damage and corrosion to servers and other electrical equipment.9 But these are also regions where water may be scarce and where the rate of evaporation may be higher than in more humid climates. Enterprises may find dry climates ideal for protecting data center equipment, but they should also weigh the impact of increased water demand in regions where water access is already stressed. When AI providers make decisions about where to locate data centers and how to quantify the resources consumed by their products, they must consider all of the resources AI systems require (e.g., data center cooling, off-site water usage, and, to a lesser extent, chip fabrication). The true extent of water consumption by AI systems can only be understood by accounting for all of its dimensions.
Practitioner Takeaways
AI resource consumption is still an emerging area of research, and it is challenging to discern the truth when different sources may have significant differences in their claims about AI and water use. However, there are several tips that can help determine the credibility of a source and contextualize information provided about a field that is still evolving:
- Verify evidence — What evidence is provided, what is its source, and under what parameters was the data collected? For example, the popular statistic that casual use of ChatGPT consumes a water bottle’s worth of water comes from a paper first published in April 2023.10 Because of data availability, the paper’s case study considers Microsoft’s data centers and the training and inference of the GPT-3 model. GPT-3 was trained on 499 billion tokens of web content.11 That training was likely very resource intensive, and while AI systems often continue to learn from user inputs, this subsequent training likely does not require as many computing resources as the initial training on nearly 500 billion tokens.
- Ensure source legitimacy — Who is providing the information? Bias is nearly impossible to avoid, and certain sources may have agendas. For example, a water conservation association will have aims quite different from OpenAI’s, so it is understandable that their estimates of resource usage will differ. Sources may not deliberately lie about resource consumption, but they may choose evidence or a scope that makes the data seem favorable to their objectives.
- Establish recency — The proliferation of generative AI is relatively recent, and adoption has grown considerably in just the past few years. Claims about data center water usage from 2023 may no longer reflect today’s growing reliance on AI tools and its impact on water use. Recent data may not always be available, but in a field that is evolving rapidly, data that is a couple of years old might not reflect today’s realities.
- Define context — When possible, try to contextualize statistics. One source estimates that large data centers can use up to 5 million gallons of water per day.12 But what exactly does that mean? How does it compare to how much water people typically use? The source goes on to say that 5 million gallons per day is equivalent to the water use of a town of 10,000–50,000 people. Putting large, abstract numbers into context can help gauge the impact of an initiative and give meaning to figures that might otherwise seem meaningless.
- Identify logical fallacies — The ability to identify logical fallacies is an imperative part of thinking critically about information. Altman’s statements about water usage compared training an AI model to raising a child: “…it also takes a lot of energy to train a human. It takes like 20 years of life, and all the food you eat before that time, before you get smart.”13 This is a false equivalency fallacy, in which 2 things are compared as though they are the same even though they are not. The resources required to raise a child are considerable, but it can be argued that a human life is inherently more valuable than an AI system: AI systems are valued only for their “intelligence,” while people are valuable for reasons far beyond intelligence.
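The town comparison in the “Define context” tip above can itself be sanity-checked with simple arithmetic. A minimal sketch follows; the 5-million-gallon figure and the population range come from the cited source, while the per-person results are derived from that comparison rather than independently measured:

```python
# Implied per-capita water use if one large data center (5M gal/day)
# equals the consumption of a town of 10,000-50,000 people.
DATA_CENTER_GAL_PER_DAY = 5_000_000

per_person = {
    population: DATA_CENTER_GAL_PER_DAY / population
    for population in (10_000, 50_000)
}

for population, gallons in per_person.items():
    print(f"town of {population:,}: {gallons:,.0f} gallons/person/day")
```

The implied range of 100–500 gallons per person per day is the kind of derived figure a practitioner can then compare against published municipal per-capita water data to judge whether a source’s framing is plausible.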
Making a Value Judgment
It is not easy for an enterprise to determine whether AI resource consumption aligns with its values. Quantifying AI resource consumption is complex, and enterprises and individuals must weigh the costs and benefits of AI systems. The environmental impact that one enterprise deems acceptable may be unacceptable to another. Proponents of AI often compare data center water use to that of golf courses, but someone who enjoys golfing might find golf course water use acceptable while finding data center water use unacceptable.14
Having an open mind and being receptive to new information is important, especially when evaluating a relatively nascent field. Organizational concerns about the environmental impact of AI may change over time as new evidence emerges, and continuous learning is necessary to have the most up-to-date information.
Endnotes
1 Butts, D.; “Sam Altman Defends AI Resource Usage: Water Concerns ‘Fake,’ and ‘Humans Use Energy Too,’” CNBC, 23 February 2026
2 Butts, “Sam Altman Defends AI”
3 Lo, L.S.; “AI Chugs a Bottle of Water Every Time You Chat With It,” Science Alert, 3 September 2025
4 Li, P.; Yang, J.; et al.; “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models,” Communications of the ACM, 2025; Note: the authors have made numerous revisions to this paper since it was first published
5 Pine, M.; “Why AI's Water Problem Might Actually Be an Opportunity,” World Economic Forum, 14 January 2026
6 Chedraoui, K.; “Your AI Videos Use Way More Energy Than Chatbots. It's a Big Problem,” CNET, 24 October 2025
7 Nicoletti, L.; Ma, M.; et al.; “AI Is Draining Water From Areas That Need It Most,” Bloomberg, 8 May 2025
8 Pine, “Why AI’s Water Problem”
9 McGovern, G.; “Why Do Data Centers Love Deserts?,” Gerry McGovern, 2 February 2025
10 Li, Yang, “Making AI Less ‘Thirsty’”
11 Thompson, A.; “What’s In My AI?,” LifeArchitect.ai, March 2022
12 Yañez-Barnuevo, M.; “Data Centers and Water Consumption,” Environmental and Energy Study Institute (EESI), 25 June 2025
13 Butts, “Sam Altman Defends AI”
14 When researching the benefits and costs of golf courses it became increasingly clear that all of the sources were golf associations or golf publications, illustrating why understanding the goals/biases of publications is imperative when considering evidence.
Safia Kazi, AIGP, CIPT
Is a privacy professional practices principal at ISACA. In this role, she focuses on the development of ISACA’s privacy-related resources, including books, white papers, and review manuals. Kazi has worked at ISACA for more than a decade, previously working on the ISACA Journal and developing the award-winning ISACA Podcast.