
Databricks has launched DBRX, a new open large language model with 132 billion total parameters, of which 36 billion are active on any given input. Databricks reports that DBRX outperforms established open models such as Llama 2, Mixtral, and Grok-1 on standard benchmarks, setting a new bar for open LLMs. The model was developed over several months at a cost of roughly $10 million. Built on a fine-grained Mixture-of-Experts (MoE) architecture, it performs strongly across a range of benchmarks, including programming tasks. DBRX is freely available on platforms such as Hugging Face and Clarifai, enabling organizations to build on and fine-tune custom LLMs.
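For readers who want to try it, here is a minimal sketch of loading the instruction-tuned checkpoint via Hugging Face `transformers`. It assumes access to the gated `databricks/dbrx-instruct` repository, a `transformers` version that supports DBRX (or `trust_remote_code` enabled), and hardware with enough GPU memory for roughly 260 GB of bf16 weights or a quantized variant; treat it as a sketch, not Databricks' reference setup.

```python
# Minimal sketch: load DBRX Instruct from Hugging Face and generate a reply.
# Assumes access to the gated databricks/dbrx-instruct repo and enough GPU
# memory (sharded across devices) to hold the bf16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dbrx-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half-precision weights to reduce memory
    device_map="auto",            # shard layers across available GPUs
    trust_remote_code=True,       # needed on transformers versions without native DBRX support
)

messages = [
    {"role": "user", "content": "Write a SQL query that returns the top 5 customers by revenue."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```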

#DBRX is a general-purpose LLM that outperforms established open source models on standard benchmarks! We’ve open-sourced it. DBRX is incredibly efficient thanks to its fine-grained MoE architecture & offers the flexibility orgs need for custom #genAI. https://t.co/wXzxQOZym6
I've been testing DBRX and Mixtral head-to-head on basic improvised logic problems, and finding that DBRX is better on almost all of them. This might be pretty big for the LLM inference space - Mixtral has been the leader for a while on openrouter rankings, particularly for… https://t.co/5Y6rJesjFw
Announcing the new self-reported king of mixture-of-expert models: DBRX 132B by @databricks! It appears to be often better than Mixtral at reasoning and coding. Example and free-to-try playground 👇 https://t.co/ULbwlg0hhM
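As background on the "fine-grained MoE architecture" the announcement highlights, the sketch below shows top-k expert routing in a single feed-forward layer: a router scores every expert for each token, only the top-k experts actually run, and their outputs are mixed by the normalized router weights, so active parameters are only a fraction of the total. The layer sizes and the 16-experts/4-active configuration are illustrative assumptions based on public descriptions of DBRX, not Databricks' actual implementation.

```python
# Illustrative top-k mixture-of-experts feed-forward layer (not DBRX's actual code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=1024, n_experts=16, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                                 # x: (tokens, d_model)
        logits = self.router(x)                           # score each expert per token
        weights, idx = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Only top_k of n_experts run per token, which is why a 132B-parameter MoE
# can have far fewer active parameters per forward pass.
layer = TopKMoELayer()
tokens = torch.randn(8, 512)
print(layer(tokens).shape)  # torch.Size([8, 512])
```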