OpenAI CEO Sam Altman on Friday called concerns about data center water usage “bogus” and defended the resource demands of artificial intelligence, comparing the energy used by AI systems to that of humans.
Speaking to The Indian Express on the sidelines of the India AI Impact Summit, Altman was asked to address common criticisms of AI, such as energy and water consumption.
The CEO responded that claims circulating online that ChatGPT uses a gallon of water per query are “completely false and completely insane” and have “nothing to do with reality.”
Data centers traditionally use large amounts of water to cool electrical components and prevent them from overheating. Newer cooling technologies promise to reduce consumption, and some new data centers no longer rely on water at all.
Still, even with improved efficiency, a report released last month by water technology companies Xylem and Global Water Intelligence predicts that rising computing demands will more than triple the amount of water withdrawn for cooling over the next 25 years, putting pressure on water systems.
While Altman dismissed concerns about water use, he said energy consumption remains a significant concern for AI. “Not per query, but overall, because the world is using so much AI… and we need to move very quickly to nuclear, wind and solar power,” he said.
Asked about an earlier comment from Microsoft founder Bill Gates suggesting that the efficiency of the human brain proves AI can evolve to become more energy efficient over time, Altman pushed back.
“One thing that’s always unfair about this comparison is that people argue about how much energy it takes to train an AI model…but it also takes a lot of energy to train a human,” he said. “It takes 20 years of your life and all the food you’ve ever eaten to become wiser.”
“A fair comparison is, if you ask ChatGPT a question, how much energy does it require after the model is trained to answer that question compared to a human? Perhaps AI has already caught up on an energy efficiency basis measured that way,” he added.
The process Altman is referring to is known as inference: using AI models that have already been trained to generate new outputs. AI inference typically consumes much less power than the training itself.
Altman’s comments, particularly the comparison between AI and humans, sparked debate online amid growing concerns about AI’s ability to replace human jobs.
Sridhar Vembu, co-founder and chief scientist at Indian software company Zoho, who attended the summit, criticized the equivalence between humans and AI. “I don’t want to see a world where we equate some piece of technology with a human being,” the billionaire said in a post on X.
The debate comes as governments and companies pour billions of dollars into new data centers to support the computing needs of AI systems.
According to a May report from the International Monetary Fund, global data center power consumption in 2023, shortly after the launch of OpenAI’s groundbreaking ChatGPT model, had already reached levels comparable to the total electricity use of Germany or France.
In response, some governments are working to speed up approval processes for bringing new, cheaper energy online, with some environmentalists warning such moves could clash with global net-zero goals.
Some communities, including in the United States, are also delaying development projects over concerns that they will strain the power grid and raise overall electricity costs.
Last week, the city council in San Marcos, Texas, rejected a proposed $1.5 billion data center project after months of public opposition.
Amid this backlash, many technology leaders, including OpenAI’s Altman, argue that data centers will require more energy production from a variety of sources, including renewable and nuclear energy.
