Updated April 20th 2025, 17:48 IST
OpenAI CEO Sam Altman recently shared that being nice to ChatGPT, such as saying 'please' and 'thank you', is not simply a matter of politeness but also of dollars.
Responding to a user on X who asked how much electricity it costs when people are polite to the chatbot, Altman said, 'Tens of millions of dollars well spent.'
Although the remark sounds lighthearted, it underscores a serious issue: artificial intelligence's energy use.
Courtesies like 'please' and 'thank you' add extra tokens to every prompt, which the model must process before composing a full response, increasing the computational workload. Each individual request uses only a little power, but multiplied across billions of prompts a day, the cost adds up, as the rough sketch below illustrates.
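As a rough illustration only (this is not how OpenAI measures its serving costs), the open-source tiktoken tokenizer can show how a politely worded version of the same question carries more tokens, each of which the model has to process:

```python
# Rough sketch: compare token counts for a plain vs. a polite prompt.
# The example prompts are illustrative; real costs depend on OpenAI's
# internal infrastructure, not just token counts.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")

plain = "What is the capital of France?"
polite = "Please, could you tell me what the capital of France is? Thank you!"

plain_tokens = len(enc.encode(plain))
polite_tokens = len(enc.encode(polite))

print(f"Plain prompt:  {plain_tokens} tokens")
print(f"Polite prompt: {polite_tokens} tokens")
print(f"Extra tokens per polite prompt: {polite_tokens - plain_tokens}")
```

A handful of extra tokens per prompt is negligible on its own; it only becomes meaningful when repeated across hundreds of millions of conversations a day.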
Each ChatGPT query consumes an estimated 2.9 watt-hours of electricity, nearly ten times the energy of a Google search. At more than 1 billion queries a day, OpenAI's systems burn through roughly 2.9 million kilowatt-hours of electricity daily.
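The daily figure follows directly from those two estimates; a quick back-of-the-envelope check (using the article's numbers, not official OpenAI data):

```python
# Back-of-the-envelope check of the daily energy figure.
wh_per_query = 2.9           # estimated watt-hours per ChatGPT query
queries_per_day = 1e9        # roughly 1 billion queries a day

daily_wh = wh_per_query * queries_per_day
daily_kwh = daily_wh / 1000  # 1 kWh = 1,000 Wh

print(f"{daily_kwh:,.0f} kWh per day")  # -> 2,900,000 kWh per day
```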
AI is driving up global electricity consumption. The Electric Power Research Institute (EPRI) estimates that AI-driven data centers could account for as much as 9.1% of US electricity consumption by 2030.
At the same time, the International Energy Agency (IEA) projects that AI and data centers will account for more than 20% of electricity demand growth in advanced economies by the end of the decade.
To help meet the growing energy demands of AI, Altman has invested in clean energy companies such as Helion Energy (nuclear fusion) and Exowatt (solar technology).
OpenAI is also developing more energy-efficient data centers to power its growing operations.
Saying "please" may look like a small thing, but in the world of AI, it's an expensive one — both to your wallet and the planet.