
MAP-Neo, a fully open-source and transparent bilingual large language model (LLM) series, has been released by researchers from M-A-P, the University of Waterloo, and Wuhan AI Research. The series scales to 7 billion parameters trained on 4.5 trillion tokens and is designed to close the gap with closed-source models. The release includes a detailed 49-page paper covering the tokenizer, data preprocessing, model architecture, training, and fine-tuning. MAP-Neo surpasses LLaMA2 while slightly trailing Mistral. The model and its associated resources, including the training script, data, and checkpoints, are available on the Hugging Face Hub. The release was published on May 31, 2024.
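Since the checkpoints are on the Hugging Face Hub, a minimal sketch of loading one with the `transformers` library follows. The repo id `m-a-p/neo_7b` is an assumption about where the 7B checkpoint lives; check the M-A-P organization page for the exact name.

```python
# Minimal sketch: load a MAP-Neo checkpoint from the Hugging Face Hub.
# The repo id below is an assumption; verify it on the M-A-P org page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "m-a-p/neo_7b"  # assumed Hub repo id for the 7B checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Generate a short completion to sanity-check the download.
inputs = tokenizer(
    "MAP-Neo is a bilingual language model that", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```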

As of June 1st, @HuggingFace now boasts over 200,000 public AI demos, known as "Spaces"! 🤗🎉: https://t.co/WZy0Hl2tJu 'Where should I start?' Each week, 8 Spaces are featured as the Spaces of the Week. Here are the 950+ featured Spaces since October 2021: 🔎 https://t.co/LaEnDn6Q8O https://t.co/tlQENVIEpW
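For the 'Where should I start?' question, Spaces can also be browsed programmatically with the `huggingface_hub` client. A minimal sketch follows, using like count as a rough proxy for popularity; the exact sort keys supported may vary by library version.

```python
# Minimal sketch: list popular public Hugging Face Spaces.
# Sorting by likes is one reasonable way to surface well-known demos.
from huggingface_hub import list_spaces

# Fetch the ten most-liked Spaces (limit keeps the call fast).
for space in list_spaces(sort="likes", direction=-1, limit=10):
    print(f"{space.id}: {space.likes} likes")
```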