OpenAI’s Sam Altman sparked a firestorm of criticism at the India AI Impact Summit 2026 when he responded to concerns over artificial intelligence’s growing environmental footprint by comparing it to the lifetime energy and food a human consumes before reaching “useful” intelligence. “It takes like 20 years of life and all of the food you eat during that time before you get smart,” Altman said in a much-circulated interview clip, arguing that this makes comparisons between human and AI energy usage “unfair.” What sounded like a hastily crafted metaphor has instead exposed a deeper disconnect between some AI leaders and the public: a mindset that frames human existence and resource use in the same utilitarian terms as server racks and silicon.
During a roughly hour-long session with The Indian Express, Altman defended the rapid expansion of AI systems and data centres by reframing the debate around energy consumption. When pressed about claims that a single ChatGPT query uses the equivalent of a smartphone battery charge, Altman dismissed such figures and shifted the focus to broader efficiency comparisons. “People talk about how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query,” he said. “But it also takes a lot of energy to train a human.”
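The disputed figure is easy to sanity-check with back-of-envelope arithmetic. The Python sketch below compares publicly circulated per-query energy estimates against a typical smartphone battery; every number in it (a 0.3–3 Wh range per query, a 15 Wh battery) is an illustrative assumption rather than a measurement.

```python
# Sanity check of the "one query = one phone charge" claim.
# All inputs are illustrative assumptions: published per-query estimates
# range from roughly 0.3 Wh to 3 Wh, and a typical smartphone battery
# stores on the order of 15 Wh.

PHONE_BATTERY_WH = 15.0  # assumed capacity of a full phone charge

query_estimates_wh = {"low": 0.3, "high": 3.0}  # assumed per-query range

for label, wh in query_estimates_wh.items():
    fraction = wh / PHONE_BATTERY_WH
    print(f"{label} estimate: {wh} Wh per query ~ {fraction:.1%} of a phone charge")

# Output:
# low estimate: 0.3 Wh per query ~ 2.0% of a phone charge
# high estimate: 3.0 Wh per query ~ 20.0% of a phone charge
```

Under these assumptions even the high-end estimate amounts to a fifth of a charge, which is why the comparison Altman waved away is itself contested in both directions.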
Altman’s line of reasoning collapses decades of human development, social interaction, education and embodied experience into a trivial equivalence with computational cycles. In his telling, the very evolution of humanity is something to be factored into the energy ledger of intelligence, as if centuries of cultural achievement were just another entry on a balance sheet.
Critics say this metaphor not only misses the point but dehumanises the very people AI is meant to serve. On social media platforms such as LinkedIn, commentators labelled Altman’s comparison detached and dystopian, arguing that it dismissed meaningful concerns about the energy, water and land usage surrounding data centre expansion. One widely shared post derided the idea that a baby’s growth and learning could be equated to training “a non-biological tool” with no intrinsic value beyond utility.
Behind the headline-grabbing analogy is a real, quantifiable problem that Altman glossed over. Modern AI infrastructure, particularly large data centres, requires vast amounts of electricity and water for cooling. Estimates suggest these facilities already draw power on a scale comparable to millions of households and consume billions of gallons of water annually, figures projected to climb as models grow more complex.
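To put the household comparison on concrete footing, here is a rough Python translation of the scale involved. The inputs are assumptions chosen for illustration, not measurements of any real facility: a single 100 MW data-centre campus running at full load year-round, and roughly 10,500 kWh of electricity per household per year, close to the average the US EIA reports for residential customers.

```python
# Rough scale translation for the "millions of households" comparison.
# Inputs are illustrative assumptions, not measurements of any real facility.

CAMPUS_POWER_MW = 100            # assumed continuous draw of one large campus
HOURS_PER_YEAR = 8760            # 24 * 365
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average annual household usage

campus_kwh_per_year = CAMPUS_POWER_MW * 1_000 * HOURS_PER_YEAR  # MW -> kW -> kWh
equivalent_households = campus_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

print(f"{campus_kwh_per_year / 1e9:.2f} TWh/year "
      f"~ {equivalent_households:,.0f} households")
# Output: 0.88 TWh/year ~ 83,429 households
```

On these assumptions, a single large campus matches the electricity use of tens of thousands of households, so a fleet of a few dozen such facilities reaches the millions-of-households scale the estimates describe.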