Hugging Face has announced the release of Sentence Transformers v3.3.0, a significant update to its natural language processing (NLP) library. The new version focuses on making embedding models cheaper to train and faster to evaluate. Notably, it supports training and loading LoRA adapters for embedding and ranking models, which is expected to benefit domain-specific retrieval-augmented generation (RAG) applications. The update also adds full support for NanoBEIR, a collection of smaller versions of the BEIR datasets designed to speed up the evaluation of embedding models, cutting evaluation time from days to minutes while remaining representative of full BEIR results. Early testers report NDCG@10 gains of 0.66% and 0.90% simply from prepending prompts such as "query: " and "document: " when training embedding models. Overall, the update is seen as a major leap forward for Hugging Face's NLP tools.
Sentence Transformers v3.3.0 now allows you to train embedding models with prompts. In my quick tests, this improves my NDCG@10 by 0.66% and 0.90% - just by prepending "query: " and "document: " in front of your queries & docs. Info in 🧵 https://t.co/guGD07uWL2
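For readers who want to try this, the sketch below shows roughly how prompt-based training is wired up in v3.3.0 via the `prompts` argument of `SentenceTransformerTrainingArguments`; the base model, dataset, and column-to-prompt mapping are illustrative assumptions rather than the exact configuration behind the numbers in the tweet.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Illustrative base model and dataset; swap in your own.
model = SentenceTransformer("microsoft/mpnet-base")
train_dataset = load_dataset("sentence-transformers/natural-questions", split="train")

# New in v3.3.0: map dataset columns to prompts that are prepended during training.
args = SentenceTransformerTrainingArguments(
    output_dir="models/mpnet-base-nq-prompts",
    num_train_epochs=1,
    per_device_train_batch_size=32,
    prompts={
        "query": "query: ",      # prepended to every value in the "query" column
        "answer": "document: ",  # prepended to every value in the "answer" column
    },
)

loss = MultipleNegativesRankingLoss(model)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```

At inference time the same prompts should then be passed to `model.encode` (for example via its `prompt` argument) so that queries and documents are embedded the same way they were seen during training.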
The latest release of Sentence Transformers adds full support to NanoBEIR, our collection of smaller versions of the BEIR collection! We created it as a tool to speed up the evaluation of embedding models, dropping evaluation time from days to minutes while keeping a good… https://t.co/SDfYQGGDz4
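In practice, running NanoBEIR amounts to instantiating the new `NanoBEIREvaluator` and calling it on a model, along the lines of the sketch below; the model and the chosen subset of datasets are placeholders, and omitting `dataset_names` runs the full NanoBEIR collection.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

# Illustrative model; any SentenceTransformer embedding model can be evaluated.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Evaluate on a subset of the NanoBEIR datasets to keep the run short.
evaluator = NanoBEIREvaluator(dataset_names=["QuoraRetrieval", "MSMARCO"])
results = evaluator(model)

# Results include per-dataset and aggregated retrieval metrics such as NDCG@10.
print(evaluator.primary_metric)
print(results[evaluator.primary_metric])
```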
TIL: With the release of `sentence-transformers==3.3.0`, you can now train and load LoRA adapters for embedding or ranking models! 👀 This looks very promising and awesome for domain-specific RAG applications! 🚀 https://t.co/7vjo0MU0UD
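The sketch below illustrates roughly how the PEFT integration in v3.3.0 can be used to attach a LoRA adapter to an embedding model; the base model and the LoRA hyperparameters are illustrative placeholders, not a recommended configuration.

```python
from peft import LoraConfig, TaskType
from sentence_transformers import SentenceTransformer

# Illustrative base model; LoRA hyperparameters below are placeholder values.
model = SentenceTransformer("all-MiniLM-L6-v2")

peft_config = LoraConfig(
    task_type=TaskType.FEATURE_EXTRACTION,
    inference_mode=False,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
)

# Attach the LoRA adapter so that only the small adapter weights are trained.
model.add_adapter(peft_config)

# The model can now be finetuned as usual (e.g. with SentenceTransformerTrainer),
# and saving it keeps only the lightweight adapter weights. Loading a model that
# was saved with an adapter works transparently via SentenceTransformer(...).
```

Because only the adapter parameters are updated, this is attractive for domain-specific RAG setups where many small, task-specific adapters can be kept around instead of full model copies.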