ICYMI - @huggingface & @aiworld_eu launched a daily data viz project highlighting trends and 🔑 insights from the open-source AI community. Today, we present the most liked and downloaded models -- signalling a shift toward the tiny but mighty models. 👀 🤏 📈 Space is linked… https://t.co/cNiDIXnUtJ
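If you want to pull the same "most liked / most downloaded" numbers yourself, here is a minimal sketch using the huggingface_hub client. The sort keys and limit are standard list_models parameters; the exact fields returned per model can vary, so treat the printed attributes as a best-effort view.

```python
from huggingface_hub import list_models

# Top 10 models on the Hub by all-time downloads (sorted server-side).
print("Most downloaded:")
for m in list_models(sort="downloads", direction=-1, limit=10):
    # downloads/likes may be None for some entries depending on the API response
    print(f"  {m.id}: {m.downloads} downloads, {m.likes} likes")

# Top 10 models by number of likes.
print("Most liked:")
for m in list_models(sort="likes", direction=-1, limit=10):
    print(f"  {m.id}: {m.likes} likes")
```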
Perfect for re:Invent! Want to deploy @Alibaba_Qwen QwQ-32B, the best open reasoning model that rivals @OpenAI o1 and @AnthropicAI Claude 3.5 Sonnet? 🤔 Excited to share a new guide on deploying QwQ-32B on @awscloud SageMaker using the @huggingface LLM Deep Learning… https://t.co/OVnIfPjKy8
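The linked guide has the full walkthrough; the sketch below only shows the general shape of a deployment with the SageMaker Python SDK and the Hugging Face LLM Deep Learning Container (TGI). The model id, GPU count, instance type, token limits, and prompt are illustrative assumptions, not values taken from the guide.

```python
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()  # requires a SageMaker execution role

# Resolve the Hugging Face LLM DLC (TGI) image; omitting `version` lets the
# SDK pick its default supported container version.
image_uri = get_huggingface_llm_image_uri("huggingface")

model = HuggingFaceModel(
    role=role,
    image_uri=image_uri,
    env={
        "HF_MODEL_ID": "Qwen/QwQ-32B-Preview",  # assumed Hub repo id
        "SM_NUM_GPUS": "4",                     # shard across 4 GPUs
        "MAX_INPUT_TOKENS": "16000",
        "MAX_TOTAL_TOKENS": "32768",            # QwQ's 32K context window
    },
)

# Instance type is illustrative; size it to your quota and latency/cost needs.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",
    container_startup_health_check_timeout=900,
)

response = predictor.predict({
    "inputs": "How many r's are in the word 'strawberry'? Think step by step.",
    "parameters": {"max_new_tokens": 512, "temperature": 0.6},
})
print(response)
```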
Amazing models of the week:
• Alibaba's QwQ-32B
• OLMo 2 by @allen_ai
• ShowUI by Show Lab, @NUSingapore, @Microsoft
• MultiFoley by @AdobeResearch
• INTELLECT-1 by @PrimeIntellect
🧵 https://t.co/LLGyOhmN7K
Alibaba has introduced QwQ, an open-source AI model with 32 billion parameters designed to compete with OpenAI's o1 in reasoning capabilities. The model supports a 32K-token context window, making it usable for long prompts and extended chains of reasoning. Alongside QwQ, the open-source community saw several other notable releases: SmolVLM, a compact vision-language model from Hugging Face; ShowUI-2B, a model for GUI/web automation agents; OLMo 2 from Allen AI; MultiFoley from Adobe Research; and INTELLECT-1 from Prime Intellect. The trend points to growing interest in smaller, more efficient models, a shift also reflected in the new daily data visualization project from Hugging Face and AI World EU that tracks the community's most liked and downloaded models.
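For local experimentation outside SageMaker, here is a hedged sketch of running QwQ with transformers chat templating. The model id and prompt are assumptions, and in practice a 32B model needs multiple GPUs or quantization to fit in memory.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B-Preview"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native dtype (e.g. bf16)
    device_map="auto",    # spread layers across available GPUs
)

messages = [
    {"role": "user", "content": "Is 9.11 larger than 9.9? Think step by step."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=1024)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```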