Databricks has launched a new open-source large language model called DBRX, with 132 billion total parameters and 36 billion parameters active on any given input. On standard benchmarks it outperforms established open models such as Llama 2 70B, Mixtral-8x7B, and Grok-1, setting a new bar for open LLMs. DBRX was developed over several months at a cost of approximately $10 million. The model uses a mixture-of-experts (MoE) architecture, delivers fast inference, and supports a context window of up to 32K tokens.
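Because the weights are released openly, the model can also be queried directly with the Hugging Face transformers library rather than through a hosted service. The snippet below is a minimal sketch, assuming the databricks/dbrx-instruct checkpoint, an access token with permission to download it, and hardware with enough accelerator memory for the bf16 weights (roughly 264 GB for 132B parameters); the hosted playgrounds and APIs mentioned in the posts below avoid that hardware requirement.

```python
# Minimal sketch: chatting with DBRX Instruct via Hugging Face transformers.
# Assumes the databricks/dbrx-instruct checkpoint and ample GPU memory;
# adjust the model id, dtype, and device_map for your environment.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "databricks/dbrx-instruct"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,   # full bf16 weights need on the order of 264 GB
    device_map="auto",            # shard across available GPUs
    trust_remote_code=True,
)

# DBRX Instruct is a chat model, so format the prompt with its chat template.
messages = [{"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```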
DBRX by @databricks is a VERY impressive model. Test vid coming soon.
🎉 DBRX is available now for FREE on Clarifai! 🎉 We're excited to announce some big news heading into the weekend! @clarifai is now offering @databricks DBRX for free to all our community customers! #DBRX https://t.co/34ky3kxFau
🔥 DBRX by @databricks and Grok1 by @xai are now available for FREE at FEDML Nexus AI! We now offer the Playground, API access, and Private Deployment for the two most recent open-source foundational models by Databricks and xAI on @FEDML_AI Model Hub… https://t.co/uYKkt5N3RY