OpenAI CEO Sam Altman recently stirred up conversation on social media after revealing that users saying “please,” “thank you,” or “sorry” in ChatGPT prompts are quietly driving up operational costs for the company. Responding to a user on X (formerly Twitter) who jokingly asked how much electricity OpenAI has spent on polite prompts, Altman replied: “Tens of millions of dollars well spent.” He added a cryptic note: “You never know.”

While Altman’s comment was light-hearted, it highlights a surprising reality — these polite habits, when scaled across millions of daily interactions, have a tangible energy cost.

Here’s why it matters

With ChatGPT now serving more than 150 million weekly active users, every prompt, no matter how short, consumes electricity. A Goldman Sachs report estimates that a single ChatGPT query consumes around 2.9 watt-hours of electricity, roughly 10 times more than a typical Google search. Multiply that by more than a billion queries per day, and the totals become staggering: roughly 2.9 million kilowatt-hours of electricity daily.
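The arithmetic behind that daily figure is easy to check. A quick sketch, using only the article's own estimates (none of these numbers are measured values):

```python
# Back-of-the-envelope check of the figures above.
WH_PER_QUERY = 2.9               # Goldman Sachs estimate per ChatGPT query
QUERIES_PER_DAY = 1_000_000_000  # "over a billion queries per day"

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_kwh = daily_wh / 1_000     # watt-hours -> kilowatt-hours

print(f"{daily_kwh:,.0f} kWh per day")  # -> 2,900,000 kWh per day
```

At that rate, even a few extra tokens of courtesy per prompt add up across a billion daily queries.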

The comment quickly drew reactions online. Some users suggested a quick fix: stripping polite phrases with client-side code before they ever reach the model. Others joked that the models themselves could save energy by dropping the follow-up questions they tack onto the end of responses.

Altman’s point, while amusing, underscores how seemingly trivial habits can lead to significant impacts at scale — especially in the energy-hungry world of artificial intelligence.