European utilities are turning old power plants into AI data centers. If only the power those plants once generated were still available too, that would be even better. Still, the conversions make sense. https://t.co/oZdrTVA0Sj https://t.co/9MwoKkDqow
Amazon: the biggest constraint for AI and cloud computing is power! https://t.co/HX5kdhP07G
U.S. DATACENTER POWER DEMAND SET TO DOUBLE BY 2028

And with great power comes great volumes of AI chips from $NVDA, $AVGO, $AMD, and $MRVL.

In terms of energy and grid, bullish for next-gen nuclear & grid plays:
• $CCJ uranium supply
• $CEG carbon-free baseload
• $VRT power
Google said on 4 August it has entered into its first formal demand-response agreements with Indiana Michigan Power and the Tennessee Valley Authority that allow the utilities to ask the company to curtail electricity use at selected U.S. data centers when power demand on the grid spikes. The arrangements cover energy-intensive machine-learning tasks and other non-essential AI workloads, which Google can pause or reschedule. The company said the flexibility will help connect new data centers more quickly, reduce the need for additional transmission lines or power plants, and give grid operators more room to manage heat- or weather-driven surges in consumption.

Demand-response programs are common in heavy manufacturing and cryptocurrency mining but are still rare in the technology sector. Financial terms were not disclosed, and Google did not specify which facilities are covered, but it noted that the agreements represent a template it may replicate in other regions as its AI infrastructure expands.

U.S. utilities face mounting strain from the rapid growth of cloud and AI computing. S&P Global Market Intelligence projects data-center electricity demand will rise to about 58 gigawatts in 2025 and almost double to 95 gigawatts by 2028, intensifying pressure on grids and prompting tech companies to seek new ways to manage their loads.
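A quick sanity check on those S&P figures (a minimal sketch; the 58 GW and 95 GW values come from the summary above, while the implied growth rate is my own arithmetic, not from the report):

```python
# Back-of-the-envelope check of the S&P Global projection:
# ~58 GW of U.S. data-center demand in 2025, rising to ~95 GW by 2028.
growth_ratio = 95 / 58                     # total growth over three years
annual_rate = growth_ratio ** (1 / 3) - 1  # implied compound annual growth

print(f"total growth: {growth_ratio:.2f}x")  # ~1.64x, i.e. "almost double"
print(f"implied CAGR: {annual_rate:.1%}")    # roughly 18% per year
```

In other words, "almost double" works out to about a 64% increase overall, or roughly 18% compound annual growth, which gives a sense of how fast grid planners expect this load to arrive.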