
Alibaba's Qwen team has released Qwen1.5-32B, a new multilingual dense Large Language Model (LLM) that supports 10 languages, offers a 32k-token context window, and outperforms Mixtral on the Open LLM Leaderboard. The release includes both a base model (Qwen1.5-32B) and a chat model (Qwen1.5-32B-Chat) and has been praised for competitive performance against Mixtral, particularly in language understanding, multilingual support, coding, and math. At 32 billion parameters it is a mid-size model, easier to run on-device while maintaining strong results across benchmarks. Both variants ship with open weights under a custom license that permits commercial use, and the chat model was aligned with DPO for preference training.

In related developments, StabilityAI introduced Stable LM 2 12B, a pair of 12-billion-parameter language models (base and instruction-tuned) trained on multilingual data covering English, Spanish, German, Italian, French, Portuguese, and Dutch. The models offer performance comparable to Mixtral, are available for open access, and the instruction-tuned version supports tool usage and function calling.
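As a minimal sketch of how the Qwen chat variant can be prompted with the Hugging Face transformers library: this assumes the Hub model id `Qwen/Qwen1.5-32B-Chat`, a recent transformers release (>= 4.37), accelerate installed for `device_map="auto"`, and enough GPU memory for a 32B model. It is an illustration, not an official snippet from the release.

```python
# Minimal sketch: chatting with Qwen1.5-32B-Chat via transformers.
# Assumes transformers >= 4.37, accelerate for device_map="auto",
# and sufficient GPU memory for a 32B-parameter model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-32B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen1.5-32B release in one sentence."},
]
# apply_chat_template formats the conversation with the model's chat template.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```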
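A similar minimal sketch for text completion with the Stable LM 2 12B base model follows. The Hub id `stabilityai/stablelm-2-12b` and the pipeline arguments are assumptions based on the usual StabilityAI naming and a recent transformers release; the Spanish prompt simply exercises the multilingual training claim.

```python
# Minimal sketch: text completion with the Stable LM 2 12B base model.
# Assumes the Hub id "stabilityai/stablelm-2-12b", a transformers release
# that supports the StableLM architecture, and accelerate for device_map="auto".
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="stabilityai/stablelm-2-12b",
    torch_dtype="auto",
    device_map="auto",
)

# Spanish prompt to exercise the multilingual training data.
prompt = "La capital de los Países Bajos es"
print(generator(prompt, max_new_tokens=20, do_sample=False)[0]["generated_text"])
```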


