Sarvam AI has launched Sarvam-M, a 24-billion-parameter hybrid language model designed to perform strongly in mathematics, programming, and Indian languages. The model supports ten major Indic languages: Bengali, Hindi, Kannada, Gujarati, Marathi, Malayalam, Odia, Punjabi, Tamil, and Telugu. Sarvam-M is an instruction-tuned derivative of Mistral-Small-3.1-24B and is available as an open-source model on platforms such as Hugging Face and OpenRouter. Despite its technical capabilities and its positioning as a flagship sovereign AI model for India, Sarvam-M logged only 334 downloads in its first two days, raising questions about the adoption and impact of AI initiatives in the country. The launch has sparked debate over the role and effectiveness of sovereign AI efforts in India, highlighting how hard it is to gain traction even with an advanced language model tailored to regional languages.
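For readers who want to try the model from Hugging Face, here is a minimal sketch using the transformers library. The repository id "sarvamai/sarvam-m", the chat-template workflow, and the sample Hindi prompt are assumptions for illustration, not details taken from the article; check the model card for the exact usage.

```python
# Minimal sketch (not from the article): loading Sarvam-M with Hugging Face transformers.
# The repo id "sarvamai/sarvam-m" is an assumption; verify it on the Hugging Face hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-m"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Example chat prompt in Hindi: "What is the capital of India?"
messages = [{"role": "user", "content": "भारत की राजधानी क्या है?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```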
#TechWithMC | Sarvam-M: Inside India’s 'sovereign AI model' and the debate it sparked @brokebiker with more details👇 https://t.co/KhT9WjbdmY #AI #LLM #Startups #ArtificialIntelligence
When @SarvamAI, one of the first startups selected under the @OfficialINDIAai mission, unveiled Sarvam-M, a 24B-parameter hybrid LLM supporting 10 Indian languages, it aimed to be a landmark in sovereign AI. Instead, it stirred a storm. With just 334 downloads in two days, critics like https://t.co/w159xyWAzB