
Shanghai AI Laboratory has officially released InternLM 2.5, a new series of open Large Language Models (LLMs), at the WAIC 2024 event. The series includes the 7B, 7B Chat, and 7B Chat-1M models, which feature tool-usage capabilities and, in the Chat-1M variant, a context window of up to 1 million tokens. The models are available on the Hugging Face Hub and are free for commercial use upon request. The laboratory also introduced InternLM-XComposer-2.5 (IXC-2.5), a versatile Large Vision Language Model (LVLM) that supports long-contextual input and output with 24K interleaved image-text contexts and offers GPT-4V-level capabilities with a 7B LLM backend. The previous version, InternLM 2.0, recently achieved the highest score among seven LLMs, including GPT-4o, in China's College Entrance Math Test.



Shanghai AI Laboratory officially released the new generation of its LLM InternLM 2.5 at the #WAIC2024 today. The earlier 2.0 version recently achieved the highest score at China’s College Entrance Math Test among seven LLMs, including ChatGPT-4o. https://t.co/y3WVmPOmkR
Thanks @_akhaliq for tweeting our work! 🚀 We have released InternLM-XComposer-2.5 (IXC-2.5), a versatile Large Vision Language Model (LVLM) supporting long-contextual input and output. 🌊 Support 24K interleaved image-text contexts 🛠️ Versatile applications: - 🎞️ Video… https://t.co/6iYcmldkob
InternLM-XComposer-2.5 A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output We present InternLM-XComposer-2.5 (IXC-2.5), a versatile large-vision language model that supports long-contextual input and output. IXC-2.5 excels in various text-image… https://t.co/PDFATUoeoT