
On July 18, multiple AI research teams released new language models on Hugging Face. Vaishaal's team introduced the DCLM models, which they claim are the best-performing truly open-source models, with open data, open weights, and open training code. Salesforce AI Research (SFResearch) and other contributors launched the "Tiny Giant" xLAM-1B-fc and xLAM-7B-fc models, small agentic models with mobile-ready quantized versions, released alongside tutorials, scripts, and additional resources to facilitate their use. Salesforce frames the xLAM release as part of a strategic focus on compact, powerful language models intended to make AI more accessible and impactful.
Today's release of xLAM-1B on HuggingFace exemplifies our strategic focus on compact, powerful language models. Another step forward in our mission to make AI more accessible and impactful for all. https://t.co/0aedzv0EhO
We just released our "Tiny Giant" models on Huggingface. Feel free to give them a try! #LLM #AI #MachineLearning HuggingFace: https://t.co/PEnPfAj4nK GitHub: https://t.co/mnA8fg7LUr Dataset: https://t.co/ZG5AKQFWnS Paper: https://t.co/uXMcjyM59W https://t.co/DAKpBLbknV
Exciting news! Our "Tiny Giant" xLAM-1B-fc is now available on Hugging Face! Dive in here: https://t.co/bXFu2nx373, our suite of agentic models (xLAM-1B-fc & xLAM-7B-fc), including mobile-ready versions, tutorials, and scripts. #LAM #LLM #AI #MachineLearning #HuggingFace https://t.co/LG4dpI6sQx