Today is August 1st. "Coincidentally," very important information about OpenAI's GPT open-source model has leaked (120b/20b, MoE, ... you can read all about it at @apples_jimmy). And GPT-5 has also appeared several times, as can be seen in numerous screenshots (visit @legit_api). https://t.co/jcpyTxfo5h https://t.co/KEZ3TdXreU
OpenAI's leaked open-weight models, 120B and 20B, are solid in size; the 20B model is especially exciting. If it outperforms the new Qwen-3 30B, that would be a big deal. I suspect "Horizon-Alpha" could be the 20B: it fits the criteria in early tests, small and fast.
Looks like the OpenAI open-source model comes in two sizes: 20B and 120B. The release could be this morning. https://t.co/W9MMR6BbiA
OpenAI's forthcoming open-weight large language models briefly appeared on the model-hosting platform Hugging Face early Friday before the repository was removed, according to multiple developers who downloaded the files. Two versions were exposed, a 120-billion-parameter and a 20-billion-parameter model, marking the first time full weights from the ChatGPT maker have surfaced outside the company's own infrastructure.

Configuration files seen by users indicate a mixture-of-experts design with 128 experts, four experts active per token, and support for BF16, FP4 and UE8 precision. The architecture resembles Mistral AI's Mixtral, featuring grouped-query attention (GQA), SwiGLU feed-forward layers and extended-context rotary positional embeddings (RoPE) across 36 layers. Observers say the smaller 20-billion-parameter model, code-named "Horizon-Alpha," targets faster inference on commodity hardware.

The leak fuels speculation that OpenAI is preparing to publicly release open-weight versions of its GPT line, a move that would shift competitive dynamics with open-source challengers such as Qwen-3 30B. OpenAI has not commented on the incident or provided a timeline for an official launch.
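To make the reported configuration concrete, here is a minimal sketch of how a mixture-of-experts layer routes a token to its top-k experts and mixes their SwiGLU outputs. This is an illustration only: the 8-expert count, the tiny dimensions, and all weight initializations are placeholders, not the leaked model's actual sizes (which reportedly use 128 experts with 4 active per token).

```python
import numpy as np

def swiglu(x, w_gate, w_up, w_down):
    # SwiGLU feed-forward: (SiLU(x @ W_gate) * (x @ W_up)) @ W_down
    gate = x @ w_gate
    silu = gate / (1.0 + np.exp(-gate))  # SiLU activation
    return (silu * (x @ w_up)) @ w_down

def moe_layer(x, experts, router_w, top_k=4):
    """Route a token vector x to its top_k experts and mix their
    outputs, weighted by softmax-renormalized router scores."""
    logits = x @ router_w                      # one score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top_k experts
    scores = np.exp(logits[top] - logits[top].max())
    scores /= scores.sum()                     # renormalize over chosen experts
    out = np.zeros_like(x)
    for w, idx in zip(scores, top):
        out += w * swiglu(x, *experts[idx])
    return out

# Toy config: 8 experts (not 128), top-4 active, tiny dimensions.
rng = np.random.default_rng(0)
d_model, d_ff, n_experts = 16, 32, 8
experts = [(rng.normal(size=(d_model, d_ff)) * 0.1,   # W_gate
            rng.normal(size=(d_model, d_ff)) * 0.1,   # W_up
            rng.normal(size=(d_ff, d_model)) * 0.1)   # W_down
           for _ in range(n_experts)]
router_w = rng.normal(size=(d_model, n_experts)) * 0.1
token = rng.normal(size=d_model)
y = moe_layer(token, experts, router_w, top_k=4)
print(y.shape)  # (16,)
```

The efficiency claim behind MoE is visible here: only 4 of the 8 expert feed-forwards run per token, so compute per token scales with the active experts rather than the total parameter count.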