


Chatbot Arena update: the latest Jamba 1.5 Large/Mini from @ai21labs is now live on the leaderboard! Open weights. Novel SSM-Transformer architecture with long context window. Congrats @ai21labs on the strong open model release! https://t.co/AAiL2xX9DI https://t.co/1QUYTqlLPI
📣 Two new models have arrived in Vertex AI Model Garden: Jamba 1.5 Mini and Jamba 1.5 Large from @AI21Labs! Both models in the Jamba 1.5 Model Family feature a 256K context window + Mamba-Transformer architecture for speed and long context efficiency → https://t.co/j45TtPQuod https://t.co/oDSKs5iwQt
🔍 Explore @AI21Labs' Jamba 1.5, designed for diverse AI tasks like content creation and data insights. ➡️ https://t.co/l2Uaqe2gzz Using transformer and Mamba architectures, this MoE model ensures top efficiency and extensive context handling. Experience Jamba 1.5 API as…

AI21 Labs has announced Jamba 1.5, a new family of open models comprising Jamba 1.5 Mini and Jamba 1.5 Large. Both use a hybrid SSM-Transformer (Mamba-Transformer) mixture-of-experts architecture, which improves efficiency and speed on long inputs: each model offers a 256K-token context window, making the family well suited to long-context applications and agentic AI. The models support multilingual use, structured JSON output, function calling, and document understanding. Jamba 1.5 is available on platforms such as Vertex AI and is positioned against other leading open large language models (LLMs) such as Llama and Mistral; the permissive licensing is intended to enable broader use and innovation in AI applications. On Arena Hard, Large scores 65.4 and Mini 46.1; on MMLU, Large scores 81.2.
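To make the structured-output feature concrete, here is a minimal sketch of assembling a chat request body for a Jamba 1.5 model. It assumes an OpenAI-style chat-completions request shape; the exact field names used by AI21's API or Vertex AI are not confirmed by the announcements above, so treat `response_format` and the model identifiers as illustrative assumptions.

```python
import json


def build_jamba_request(prompt: str, model: str = "jamba-1.5-mini") -> dict:
    """Assemble a chat request body asking a Jamba 1.5 model for JSON output.

    The request shape follows the common OpenAI-style chat-completions
    convention; field names here are assumptions, not confirmed API spec.
    """
    return {
        "model": model,  # "jamba-1.5-large" is the other family member
        "messages": [
            {"role": "user", "content": prompt},
        ],
        # Structured JSON output is one of the advertised Jamba 1.5 features;
        # "response_format" is an assumed OpenAI-style field name for it.
        "response_format": {"type": "json_object"},
        "max_tokens": 512,
    }


body = build_jamba_request("Summarize this contract as JSON with keys "
                           "'parties', 'term', and 'obligations'.")
print(json.dumps(body, indent=2))
```

The body would then be POSTed to whichever endpoint hosts the model (e.g. a Vertex AI Model Garden deployment); authentication and endpoint URLs are deployment-specific and omitted here.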