
Mistral AI announced a new open-source model, Mistral 7B v0.2 Base, at the SHACK15sf venue during its hackathon. The model brings notable architectural changes over its predecessor, Mistral 7B v0.1: a 32k context window, a RoPE theta of 1e6, and the removal of sliding-window attention; it is the raw pretrained model behind Mistral-7B-Instruct-v0.2. The announcement came amid a series of AI-focused events, including collaborations and sponsorships by companies such as SnowflakeDB, Microsoft, and AirbyteHQ. The hackathon, co-hosted by Cerebral Valley and attracting 400 AI enthusiasts, also featured a pre-party at Data Council Austin and GreptileAI energy drinks. The release is accompanied by fine-tuning instructions and a new fine-tuning repo.
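For concreteness, the announced architecture deltas can be expressed with Hugging Face's MistralConfig. This is a minimal sketch: only the three values called out in the announcement (32k context, RoPE theta of 1e6, no sliding window) come from the source; the v0.1-style values and all other fields are library defaults or assumptions.

```python
from transformers import MistralConfig

# v0.1-style settings (shown for contrast; values are assumptions, not from the announcement)
v01_style = MistralConfig(
    max_position_embeddings=32768,
    rope_theta=10_000.0,      # original RoPE base
    sliding_window=4096,      # v0.1 used sliding-window attention
)

# v0.2-style settings, matching the three changes listed in the announcement
v02_style = MistralConfig(
    max_position_embeddings=32768,  # 32k context window
    rope_theta=1_000_000.0,         # RoPE theta = 1e6
    sliding_window=None,            # sliding window removed
)

print(v02_style.rope_theta, v02_style.sliding_window)  # 1000000.0 None
```

Raising the RoPE base and dropping the sliding window means attention spans the full 32k context directly rather than a 4k local window, which is what makes the longer context usable out of the box.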



New open-source release for the Mistral AI Hackathon, Mistral 7B v0.2 Base: https://t.co/DFfVNq0FyY
- 32k context window
- Rope Theta = 1e6
- No sliding window
This is the raw pretrained model behind Mistral-7B-Instruct-v0.2. Also, new fine-tuning repo: https://t.co/A6pwaRQUbk
New open-source @MistralAI release with fine-tuning instructions: https://t.co/IXkgFbBCvH
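For anyone wanting to try the base model before following the fine-tuning instructions linked above, the snippet below is a minimal loading-and-generation sketch using the Hugging Face transformers API. The repo id mistralai/Mistral-7B-v0.2 is an assumption used for illustration; check the links above for the official weights and the fine-tuning repo.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical Hugging Face repo id; the official download links are in the posts above.
model_id = "mistralai/Mistral-7B-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

# This is a raw pretrained (base) model, so prompt it for continuation, not chat.
prompt = "Mistral 7B v0.2 is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is the base model rather than Mistral-7B-Instruct-v0.2, it is the natural starting point for the fine-tuning workflow referenced in the release.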