"According to Goldman Sachs (GS), each #ChatGPT-4 query uses about 10 times more electricity than a standard #Google (GOOGL) #search" (the kind without the #AI-generated overview at the top of the results). #ethics #internet #sustainability #environment #business #tech #education https://t.co/E5P68ChAIS
More importantly, "any message to #ChatGPT, no matter how trivial or inane, requires the #AI to initiate a full response in real time, relying on high-powered computing systems and increasing the computational load — thereby using massive amounts of electricity." #ethics #tech https://t.co/UPU1NaIO6u
Saying "merci" to ChatGPT: a small word, a large energy cost https://t.co/Kc564LgjuM https://t.co/7BJ55EFS88
ChatGPT, an AI language model developed by OpenAI, has been found to consume substantially more resources than a traditional internet search. According to a report by Goldman Sachs, each ChatGPT-4 query requires roughly ten times more electricity than a standard Google search without an AI-generated summary. In addition, about 20 ChatGPT queries consume roughly half a liter of water on average, reflecting the water demands of data center cooling. Because every user message, however trivial, triggers a full real-time response on high-powered computing systems, the cumulative electricity use translates into millions of dollars in operational costs for OpenAI. Experts have highlighted the environmental impact of AI systems, emphasizing that energy and water consumption will keep growing as these technologies become more integrated into daily life.
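The per-query figures above reduce to simple arithmetic. Below is a minimal back-of-envelope sketch of that calculation; the 0.3 Wh baseline for a standard Google search and the daily query volume are assumed illustrative values, not figures from the Goldman Sachs report, while the 10x electricity ratio and the 20-queries-per-half-liter water figure come from the sources cited above.

```python
# Back-of-envelope estimate of per-query resource use from the reported ratios.
# GOOGLE_SEARCH_WH and daily_queries are assumed illustrative values, not
# numbers taken from the Goldman Sachs report.

GOOGLE_SEARCH_WH = 0.3          # assumed baseline: energy per standard Google search (Wh)
CHATGPT_VS_GOOGLE_RATIO = 10    # reported: ~10x the electricity per ChatGPT-4 query
QUERIES_PER_HALF_LITER = 20     # reported: ~20 ChatGPT queries use about 0.5 L of water

chatgpt_query_wh = GOOGLE_SEARCH_WH * CHATGPT_VS_GOOGLE_RATIO
water_ml_per_query = 500 / QUERIES_PER_HALF_LITER

daily_queries = 1_000_000       # hypothetical daily query volume, for scale only
daily_kwh = daily_queries * chatgpt_query_wh / 1000
daily_water_liters = daily_queries * water_ml_per_query / 1000

print(f"Estimated energy per ChatGPT query: {chatgpt_query_wh:.1f} Wh")
print(f"Estimated water per ChatGPT query:  {water_ml_per_query:.0f} mL")
print(f"At {daily_queries:,} queries/day: {daily_kwh:,.0f} kWh and {daily_water_liters:,.0f} L of water")
```

Even under these rough assumptions, the scaling effect is the point: multiplying a small per-query cost by millions of daily queries is what produces the large aggregate energy and water footprint described above.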