The ERNIE 4.5 series is now officially open source. This family of models includes 10 variants—from MoE models with 47B and 3B active parameters, the largest having 424B total parameters, to a 0.3B dense model—all available now to the global AI community for open research and https://t.co/HiQ2tqerpi
China’s biggest public AI drop since DeepSeek, Baidu’s open source Ernie, is about to hit the market https://t.co/Uy3ICtbvbl #OODA
Baidu’s open-source ERNIE 4.5 multimodal model was trained on 2,016 Nvidia H800s using Baidu’s open-source alternative to PyTorch called PaddlePaddle. Notable that this is yet another Chinese model trained on Nvidia GPUs but that it’s designed to also run on Huawei Ascend chips. https://t.co/kwu3mGjvYp
Chinese technology giant Baidu has officially open-sourced its Ernie 4.5 series of generative AI large language models (LLMs) and multimodal models, marking one of the country's largest public AI releases since DeepSeek. The Ernie 4.5 family includes 10 distinct variants, ranging from a 0.3-billion-parameter dense model up to a mixture-of-experts (MoE) model with 424 billion total parameters, with the MoE variants activating 47 billion or 3 billion parameters per token. Baidu released all pre-trained weights and inference code under the commercially friendly Apache 2.0 license, aiming to accelerate AI innovation and expand its user base globally. Reported benchmark results indicate that Ernie 4.5 models outperform DeepSeek-V3 and are competitive with other leading models such as Qwen 235B and OpenAI's GPT-4.1. The models were trained on 2,016 Nvidia H800 GPUs using Baidu's open-source PaddlePaddle framework, with design compatibility for Huawei Ascend chips. The release marks a strategic shift from Baidu's traditionally closed approach to AI and strengthens the growing open-source AI ecosystem, particularly in China.
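The distinction between total and active parameters in an MoE model can be illustrated with a toy sketch. This is not Baidu's actual code, and the dimensions below are illustrative placeholders, not ERNIE 4.5's real configuration: a router picks a top-k subset of experts per token, so only a fraction of the layer's weights participate in any one forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 8, 16      # toy hidden and expert feed-forward sizes
n_experts, top_k = 8, 2    # 8 experts; each token is routed to 2 of them

# Each expert is a small 2-layer MLP: (d_model x d_ff) + (d_ff x d_model) weights.
experts = [
    (rng.standard_normal((d_model, d_ff)), rng.standard_normal((d_ff, d_model)))
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts))  # gating projection

def moe_forward(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ router
    chosen = np.argsort(logits)[-top_k:]                        # top-k expert indices
    gates = np.exp(logits[chosen]) / np.exp(logits[chosen]).sum()  # softmax over top-k
    out = np.zeros_like(x)
    for g, i in zip(gates, chosen):
        w1, w2 = experts[i]
        out += g * (np.maximum(x @ w1, 0.0) @ w2)               # weighted ReLU-MLP expert
    return out

# Total parameters count every expert; active parameters count only the routed ones.
per_expert = d_model * d_ff + d_ff * d_model
total_params = n_experts * per_expert    # 2048 in this toy layer
active_params = top_k * per_expert       # 512 used per token
print(total_params, active_params)
```

Scaled up, the same routing scheme is why a model can hold 424B total parameters while only 47B (or 3B, in the smaller variant) are active for any given token, keeping per-token compute far below that of an equally large dense model.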