At the NVIDIA GTC conference, NVIDIA CEO Jensen Huang hinted at the scale of OpenAI's latest model, GPT-4, also referred to as GPT-MoE, suggesting it has approximately 1.8 trillion parameters. The figure, long the subject of speculation, appears to have been inadvertently confirmed through Huang's remarks and a presentation slide, and has sparked discussion among technology enthusiasts and experts.
Reactions on social media followed quickly:

"This slide from the NVIDIA GTC conference seems to imply that the OpenAI GPT model has 1.8 TRILLION parameters." https://t.co/mejA4Yt0f9

"Did Jensen just leak GPT-4's parameter count?! 'the latest OpenAI model has approximately 1.8 trillion parameters' I mean.. we already know.. But is it official now?"

"Jensen Huang effectively confirms GPT-4 is 1.8 trillion parameters lol"