The rising power demands of artificial intelligence (AI) and data centers are becoming a significant issue. Some data center developers are responding by repurposing retired coal plant sites as data centers. AI is expected to drive a 160% increase in data center power demand. Emerson Electric's CEO noted that AI data center racks consume significantly more power than traditional data center racks, with a single ChatGPT search using 6 to 10 times the power of a traditional Google search; some reports put the figure as high as 100 times. The sustainability of generative AI models and their impact on the power grid is a growing concern.
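To make the per-query multipliers concrete, here is a minimal back-of-envelope sketch. The ~0.3 Wh baseline for a traditional Google search and the daily query volume are illustrative assumptions, not figures from the quoted sources; only the 6x, 10x, and "up to 100x" multipliers come from the claims above.

```python
# Back-of-envelope sketch of what the cited per-query multipliers imply at scale.
# Assumptions (not from the quoted sources): ~0.3 Wh per traditional Google
# search and a hypothetical volume of 10 million queries per day.

GOOGLE_SEARCH_WH = 0.3        # assumed energy per traditional search, watt-hours
MULTIPLIERS = [6, 10, 100]    # low/high estimates plus the "up to 100x" reports
QUERIES_PER_DAY = 10_000_000  # hypothetical daily query volume, for scale only

for m in MULTIPLIERS:
    chatgpt_wh = GOOGLE_SEARCH_WH * m
    daily_kwh = chatgpt_wh * QUERIES_PER_DAY / 1000  # Wh -> kWh
    print(f"{m:>3}x multiplier: {chatgpt_wh:5.1f} Wh/query, "
          f"~{daily_kwh:,.0f} kWh/day at {QUERIES_PER_DAY:,} queries")
```

Even at the low end of the range, the extra energy per query adds up to tens of megawatt-hours per day at modest query volumes, which is why grid impact keeps coming up in these discussions.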
Big Models Consume Tons Of Power - ChatGPT Requires 10x That Of Google Search. These giant LLMs require massive amounts of power to train and to run inference. We don't discuss this as much as we should. This is one of the reasons why small models are so important. h/t zerohedge https://t.co/VuW76xrHeU
Saw this one making the rounds. 👇🏻 A ChatGPT search uses 10x more power than a traditional search. (I’ve heard up to 100x in various presentations.) Interesting question we must ask ourselves… Are GenAI and Sustainability diametrically opposed? The challenges to the grid and… https://t.co/LWJQZ2B0CO
Emerson Electric CEO: "AI data center racks consume significantly more power than traditional data centers, with a search on ChatGPT consuming 6 to 10 times the power of a traditional search on Google" $EMR $GOOG $GOOGL $MSFT https://t.co/op1dCEhhlg