BBS:      TELESC.NET.BR
Assunto:  Altman rejects viral water claims but...
De:       Mike Powell
Data:     Mon, 23 Feb 2026 13:05:38 -0500
-----------------------------------------------------------
Sam Altman says ChatGPT water use claims are 'completely untrue' - but admits
AI energy use is a concern

By Graham Barlow published 1 hour ago

Altman rejects viral water claims - but admits AI's energy footprint is
only getting bigger

    Sam Altman dismisses claims about ChatGPT's water usage as "totally
fake"
    Experts warn that scaling AI infrastructure is driving huge costs and
increasing pressure on power, cooling, and resources
    The real issue isn't efficiency - it's whether AI can grow at this
scale without serious environmental impact

Speaking at an event hosted by The Indian Express, OpenAI CEO Sam Altman
dismissed claims that AI's water usage is high as "totally fake", but he
did acknowledge that it had been an issue in the past when "we used to do
evaporative cooling in data centers."

"Now that we don't do that, you see these things on the internet like,
'Don't use ChatGPT, it's 17 gallons of water for each query' or
whatever," Altman said. "This is completely untrue, totally insane, no
connection to reality."

You can find this segment at around 27 minutes in the video of the event:

Sam Altman Unfiltered: ChatGPT, AI Risks & What's Coming Next, 40 Questions
in 60 Minutes: 
https://youtu.be/qH7thwrCluM

Altman did concede that concerns around AI's overall energy consumption are
"fair", noting that "the world is now using so much AI" and that "we
need to move towards nuclear or wind and solar very quickly".

AI-specific data centers already leave a larger and more complex footprint than
traditional facilities, and several groups have raised concerns about their
environmental impact - particularly around rising electricity demand, water
usage, and the construction of new infrastructure. That build-out is also
having knock-on effects, including increased demand for components like RAM,
which is pushing up prices across the industry.

IBM CEO Arvind Krishna has previously raised doubts about whether the current
pace and scale of AI data center expansion is financially sustainable. He
estimates that equipping a single 1GW site with compute hardware now costs
close to $80 billion - and with plans for nearly 100GW of capacity dedicated
to advanced AI training, the total potential spend could approach a staggering
$8 trillion.
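The arithmetic behind that $8 trillion figure follows directly from the two
numbers Krishna cites; a quick back-of-envelope sketch (the per-site cost and
planned capacity are his estimates as reported above, not independent data):

```python
# Rough sketch of the AI data-center spend arithmetic reported above.
cost_per_gw_usd = 80e9       # ~$80 billion of compute hardware per 1GW site
planned_capacity_gw = 100    # ~100GW of capacity planned for AI training

total_spend_usd = cost_per_gw_usd * planned_capacity_gw
print(f"${total_spend_usd / 1e12:.0f} trillion")  # -> $8 trillion
```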

Meanwhile, AI's new wave of ultra-powerful accelerators is pushing data
centers to breaking point, forcing a rethink of power, cooling, and
connectivity.
Hardware that felt cutting-edge just a few years ago can't keep up, as modern
AI workloads demand a complete overhaul of everything from rack design to
thermal strategy.

As well as dismissing claims about ChatGPT's water usage, Altman also offered
a more unusual defense of OpenAI's overall energy use. He argued that
discussions around AI's energy consumption were "unfair" because they
don't account for how much energy it takes to train humans to perform similar
tasks.

    "It also takes a lot of energy to train a human."
    Sam Altman, CEO of OpenAI

"But it also takes a lot of energy to train a human," Altman said. "It
takes like 20 years of life and all of the food you eat during that time before
you get smart. And not only that, it took the very widespread evolution of the
100 billion people that have ever lived and learned not to get eaten by
predators and learned how to figure out science and whatever, to produce
you."

He continued: "If you ask ChatGPT a question, how much energy does it take
once its model is trained to answer that question versus a human? And probably,
AI has already caught up on an energy efficiency basis, measured that way."
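For what it's worth, Altman's "already caught up" claim can be sanity-checked
with a back-of-envelope comparison of marginal energy per answer. Every figure
below is an illustrative assumption, not something from this article: ~0.34 Wh
per ChatGPT query (a number Altman has cited elsewhere), a human brain drawing
roughly 20 W, and about a minute for a human to answer a comparable question:

```python
# Back-of-envelope: marginal energy to answer one question, AI vs human.
# All three figures are assumptions for illustration only.
AI_QUERY_WH = 0.34           # assumed energy per ChatGPT query, watt-hours
BRAIN_POWER_W = 20.0         # assumed human brain power draw, watts
HUMAN_ANSWER_SECONDS = 60.0  # assumed time for a human to think of an answer

# watts * seconds -> joules; divide by 3600 to get watt-hours
human_wh = BRAIN_POWER_W * HUMAN_ANSWER_SECONDS / 3600.0

print(f"AI query:     {AI_QUERY_WH:.2f} Wh")
print(f"Human answer: {human_wh:.2f} Wh")
# Under these assumptions the two land in the same order of magnitude --
# the spirit, if not the proof, of the "caught up" claim.
```

Note this compares only marginal inference cost and ignores training energy on
both sides, which is exactly the framing Altman chose.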

I can see the argument Altman is making - that human intelligence also comes
with an energy cost - but it feels reductive, and faintly cynical, to reduce
the value of a human life to its energy consumption. More importantly, it
sidesteps the real issue. The question isn't whether humans also use energy
(of course they do!) but whether scaling AI to billions of daily queries
introduces entirely new levels of demand that we haven't had to account for
before. Comparing the lifetime energy cost of a human to the marginal cost of
an AI response might be provocative, but it's not especially useful.

What Altman's comments highlight is a growing tension at the heart of the AI
boom. The technology may be getting smarter and more efficient, but the scale
at which it's being deployed is growing even faster, raising fresh concerns
about its long-term environmental impact, including pressure on global water
supplies. The UN has already warned that the world has entered an "era of
global water bankruptcy," underlining just how fragile those resources have
become.

Those questions aren't going away. As AI adoption accelerates, the real
challenge won't just be how efficient the technology becomes, but whether it
can scale sustainably at all.


https://www.techradar.com/ai-platforms-assistants/sam-altman-says-chatgpt-water-use-claims-are-completely-untrue-but-admits-ai-energy-use-is-a-concern

$$
--- SBBSecho 3.28-Linux
 * Origin: Capitol City Online (1:2320/105)
