A recent Microsoft paper reports the parameter counts of several major large language models (LLMs). According to the paper, GPT-4 has approximately 1.76 trillion parameters, GPT-4o has 200 billion, Claude 3.5 Sonnet has 175 billion, o1-preview has 300 billion, o1-mini has 200 billion, and GPT-4o-mini has 8 billion. These figures shed light on the relative sizes of these models, which had long been a topic of speculation in the tech community.