The data centers powering @ChatGPT and other AI services consume massive amounts of energy and water and carry a huge carbon footprint. As AI grows, we need to shift toward more efficient software-driven systems like @DeepSeek to reduce environmental harm. Convenience isn't enough—let’s prioritize… https://t.co/CK7fPkQC3n
Good news: ChatGPT may not be as energy-hungry as we thought ➡️ https://t.co/RsCYyx0QZ8 https://t.co/LXkh36KYow
ChatGPT has reduced my use of Google Search by ~75% over the past 6 months. And I probably query ChatGPT 2x more often than I searched Google pre-AI. How about you?
Recent research indicates that ChatGPT's energy consumption is significantly lower than previously estimated. A study by Epoch AI finds that a typical query to the GPT-4o model consumes approximately 0.3 watt-hours—about one-tenth of earlier estimates. That is comparable to the electricity common household devices, such as LED lightbulbs and laptops, draw over short periods. The findings suggest that ChatGPT's per-query efficiency may be closer to that of a Google search than previously assumed, easing some concerns about the environmental impact of AI. As tools like ChatGPT, Google's Gemini, Anthropic's Claude, and Meta's LLaMA become more prevalent, understanding their energy consumption becomes increasingly important for assessing their overall sustainability.
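To put the 0.3 Wh figure in perspective, here is a rough back-of-envelope sketch. The appliance wattages (10 W for an LED bulb, 50 W for a laptop) are illustrative assumptions, not figures from the Epoch AI study:

```python
# Back-of-envelope comparison of the cited per-query energy figure.
QUERY_WH = 0.3               # Wh per GPT-4o query (Epoch AI estimate)
EARLIER_WH = QUERY_WH * 10   # earlier estimates were ~10x higher

LED_W = 10     # assumed LED bulb power draw (W)
LAPTOP_W = 50  # assumed laptop power draw (W)

# How long each device could run on the energy of one query (minutes)
led_minutes = QUERY_WH / LED_W * 60
laptop_seconds = QUERY_WH / LAPTOP_W * 3600

print(f"One query ≈ an LED bulb lit for {led_minutes:.1f} minutes")
print(f"One query ≈ a laptop running for {laptop_seconds:.1f} seconds")
print(f"Earlier assumption: {EARLIER_WH:.1f} Wh per query")
```

Under these assumed wattages, one query powers an LED bulb for roughly 1.8 minutes or a laptop for about 22 seconds, which is why the study describes the usage as comparable to everyday household devices.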