OpenAI Chief Executive Officer Sam Altman has disclosed fresh details about the environmental footprint of the company's flagship chatbot. According to Altman, a typical ChatGPT prompt consumes roughly as much electricity as an oven running for one second and uses about one-fifteenth of a teaspoon of water for cooling. Altman's figures translate to energy usage more than ten times that of an average Google search, underscoring the resource intensity of large-language-model queries.

The disclosure comes as ChatGPT, launched in November 2022, continues to post record user growth, positioning the platform as a potential alternative to traditional search engines.

Energy analysts warn that soaring demand from AI services could strain an already aging U.S. power grid. Some project that the annual growth in electricity demand needed to serve expanding AI workloads could outpace any increase seen since the late 1970s, highlighting the urgency of modernizing infrastructure and expanding clean-energy capacity.