OpenAI CEO Sam Altman has revealed that saying please and thank you to ChatGPT costs millions of dollars in energy.
He mentioned that being nice to the AI chatbot requires a lot of computation power, which ultimately raises electricity and water costs.
In a cheeky online post, Altman noted that polite prompts are actually expensive and add to the company's energy costs.
Responding to a curious user on X (formerly Twitter) about the energy cost of polite prompts, Altman said, “Tens of millions of dollars well spent, you never know.”
"tens of millions of dollars well spent--you never know" — Sam Altman (@sama), April 16, 2025
While his tone suggested he might have been half-joking, there's a serious edge to the statement.
Every polite ping to ChatGPT triggers a cascade of high-powered computations in a data center, each of which draws real electricity.
Altman’s revelation may seem surprising, but it's grounded in the complex workings of AI.
ChatGPT doesn’t just receive your message; it processes, interprets, and crafts a human-like reply in real time. Every extra word, including a “please” or a “thank you,” adds tokens the model must process, and each token takes computing muscle.
According to Goldman Sachs, a single ChatGPT-4 query uses up to 10 times more electricity than a standard Google search.
And if just 10% of working Americans used ChatGPT weekly for a year, the energy used would equal what every home in Washington, D.C. consumes in 20 days.
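These figures can be roughed out. The sketch below is a back-of-envelope estimate, not from the article: it assumes the commonly cited ~0.3 Wh per Google search as the baseline, a US workforce of roughly 160 million, and one query per week per user, then applies the article's 10x multiplier.

```python
# Back-of-envelope energy estimate. The 0.3 Wh baseline and 160M workforce
# are assumptions for illustration; the 10x multiplier is from the article.
GOOGLE_SEARCH_WH = 0.3                       # assumed Wh per Google search
CHATGPT_QUERY_WH = 10 * GOOGLE_SEARCH_WH     # article: up to 10x a search

US_WORKERS = 160_000_000                     # rough workforce size (assumption)
users = int(US_WORKERS * 0.10)               # the article's "10% of working Americans"
queries_per_year = 52                        # one query per week for a year

total_wh = users * queries_per_year * CHATGPT_QUERY_WH
total_mwh = total_wh / 1_000_000
print(f"{total_mwh:,.0f} MWh per year")      # ~2,496 MWh under these assumptions
```

Even under these conservative assumptions, weekly use by a slice of the workforce adds up to thousands of megawatt-hours a year.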
Suddenly, that “kindly assist me” doesn’t sound so innocent, does it? Beyond the electricity, AI models also burn through water, literally.
To keep their colossal servers cool, data centers need water. A University of California study found that generating just 100 words with ChatGPT consumes about three bottles of water.
Even a short answer like “Everything is fine” guzzles nearly 44 ml, roughly a shot of whiskey.
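The two water figures are consistent with each other, which a quick sanity check shows. The sketch below derives a per-word rate from the 44 ml three-word reply and assumes a standard 500 ml bottle (the bottle size is an assumption, not stated in the article):

```python
# Sanity-check the article's water figures: ~44 ml for the three-word reply
# "Everything is fine" implies roughly 14.7 ml of cooling water per word.
ML_PER_SHORT_REPLY = 44
WORDS_IN_REPLY = 3
ml_per_word = ML_PER_SHORT_REPLY / WORDS_IN_REPLY    # ~14.7 ml per word

ml_per_100_words = ml_per_word * 100                 # ~1,467 ml
BOTTLE_ML = 500                                      # assumed bottle size
bottles = ml_per_100_words / BOTTLE_ML               # ~2.9, i.e. "about three bottles"
print(f"{ml_per_100_words:.0f} ml, about {bottles:.1f} bottles per 100 words")
```

Scaling the short-reply figure up to 100 words lands at just under three 500 ml bottles, matching the study's headline number.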