Google has published a technical paper claiming that a typical text request to its Gemini artificial-intelligence model consumes 0.24 watt-hours of electricity, emits 0.03 grams of carbon-dioxide equivalent and uses 0.26 millilitres of water, roughly the energy of nine seconds of television viewing. The company says software and hardware changes cut the energy footprint of a median Gemini prompt by a factor of 33, and its carbon output by a factor of 44, between May 2024 and May 2025.

Energy researchers and consumer advocates contend that such per-prompt metrics understate the environmental burden of large-scale AI. Data-centre power demand, covering model training, inference and cooling, already accounts for about 4 percent of U.S. electricity consumption and is climbing quickly as generative-AI adoption accelerates.

Utilities are beginning to pass those costs on to households. Chicago-area provider ComEd has warned of further bill increases, while a Wood Mackenzie survey shows U.S. utilities seeking rate rises 142 percent larger than last year's to fund grid upgrades needed for AI and other electrification trends.

The surge in demand is also reshaping real-estate investment. Benzinga reports that U.S. spending on data-centre construction is growing so fast that it is on track to eclipse traditional office projects for the first time, underscoring how the global race to power AI is filtering through to consumers, utilities and capital markets alike.
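The tension between small per-prompt figures and large aggregate impact comes down to scale. A minimal back-of-envelope sketch, using only the per-prompt numbers quoted above; the query volume of one billion prompts per day is an assumption chosen for illustration, not a figure from Google or the article:

```python
# Per-prompt figures from Google's published paper (median Gemini text prompt).
WH_PER_PROMPT = 0.24        # watt-hours of electricity
G_CO2E_PER_PROMPT = 0.03    # grams of CO2-equivalent
ML_WATER_PER_PROMPT = 0.26  # millilitres of water

def daily_footprint(prompts_per_day: float) -> dict:
    """Scale per-prompt figures to a day's query volume, in larger units."""
    return {
        "energy_mwh": prompts_per_day * WH_PER_PROMPT / 1e6,       # Wh -> MWh
        "co2e_tonnes": prompts_per_day * G_CO2E_PER_PROMPT / 1e6,  # g -> tonnes
        "water_m3": prompts_per_day * ML_WATER_PER_PROMPT / 1e6,   # mL -> cubic metres
    }

# Assumed volume for illustration only: 1 billion prompts per day.
footprint = daily_footprint(1e9)
print(footprint)
```

At that assumed volume, tiny per-query numbers aggregate to hundreds of megawatt-hours of electricity and hundreds of cubic metres of water every day, which is the scale critics have in mind, and which excludes training and other data-centre loads entirely.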
US Electricity Demand: #Breakout -follows a decade+ of stagnation -AI and hard-tech driving revival --also EVs & energy transition -best guess = it goes higher (...when Kardashev Type 1 🤔) https://t.co/re5KuS54we
AI use and data centers are causing ComEd bills to spike — and it will likely get worse. https://t.co/HNyBj13TZy https://t.co/MReQf2KbG5
🤔 How much energy does a single AI query use? And what happens when millions are made every second? Find out on this week's Reuters Econ World podcast https://t.co/4QC2ZECbXZ https://t.co/cxOyXDKHYW