OpenAI, the artificial intelligence company behind ChatGPT, has begun early testing of Google's Tensor Processing Units (TPUs) as part of an effort to diversify its AI chip supply and reduce its reliance on Nvidia GPUs and Microsoft data centers. While initial reports suggested that OpenAI had started renting Google's AI chips to power ChatGPT and other products, the company later clarified that it has no active plans to deploy Google's TPUs at scale and remains committed to Nvidia GPUs and AMD chips for production workloads. Separately, OpenAI announced that it will begin using Google Cloud infrastructure to support ChatGPT and its API in several countries, including the U.S., Japan, the Netherlands, Norway, and the UK. The move adds Google Cloud to OpenAI's existing roster of cloud providers, which includes Microsoft, CoreWeave, and Oracle, as the company works to meet growing compute demand amid an industry-wide GPU shortage. The arrangement amounts to a strategic partnership between two major AI competitors, with Google leveraging its cloud and hardware capabilities to serve OpenAI's expanding operational requirements.
I do not understand the Hugging Face economy. Open models are not producing income, so how are they buying the GPUs? You can say, oh, they fine-tune and upload to the Hub, but who is buying the fine-tuned tokens except unemployed users? Where is the liquidity in this ecosystem? https://t.co/AM0YiiagcZ
Nearly Every AI Unicorn Is Building on Google Cloud. Read here: https://t.co/nyLYSvj62d https://t.co/KTGj4iiCRA
Welcome to the Injective Council, @googlecloud. Google Cloud, which powers @gmail and @YouTube, joins us as a Founding Member. As part of a Fortune 10 company that powers over 25% of internet web pages, Google Cloud brings expertise we welcome as we build the future of finance together. https://t.co/OgJeDGaef7