
Magic AI has unveiled its latest model, LTM-2-Mini, featuring a groundbreaking 100 million token context window. That is an order of magnitude beyond the previous largest publicly discussed context window, Google DeepMind's 10 million tokens. The new model can process roughly 750 novels' worth of text, or 10 million lines of code, marking a substantial advance in AI's ability to handle context-heavy tasks such as code synthesis. The extended context window is expected to strengthen autonomous AI agents, letting them ingest and reason over large volumes of information in a single pass. Magic AI's achievement could change how AI models manage and interpret extensive datasets, paving the way for more sophisticated AI applications.


Ok so this is 🤯 From an #AI startup that many people have never heard of. 'LTM-2-Mini is our first model with a 100 million token context window. That’s 10 million lines of code, or 750 novels.' Full blog: https://t.co/FrNnrQGgEe It could remember every email I ever wrote.
Magic has trained their first model with a 100 million token context window. That’s 10 million lines of code, or 750 novels.🤯 📌 They said: "With context solved, we now focus on unbounded inference-time compute as the next (and potentially last) breakthrough we believe is… https://t.co/dw1xCUuyto
Understanding the 100 Million Token Window 🧠
- 1 token can be roughly thought of as one word or a small chunk of text.
- 100 million tokens are about 10 million lines of code or 750 novels.
To make it even more relatable:
- If we assume an average book has about 100,000 words… https://t.co/uMO56q3jJw https://t.co/0zIVDSc6Jf
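The back-of-envelope conversions in the tweet above can be sanity-checked with a quick sketch. The conversion factors here are rules of thumb, not Magic's published numbers: roughly 100,000 words per novel (as the tweet assumes) and about 1.33 tokens per English word, with ~10 tokens per line of code implied by the "100M tokens ≈ 10M lines" claim:

```python
# Back-of-envelope scale of a 100-million-token context window.
# Assumed conversion factors (rules of thumb, not official figures):
TOKENS_PER_WORD = 4 / 3    # ~1.33 tokens per English word
WORDS_PER_NOVEL = 100_000  # average book length assumed in the tweet
TOKENS_PER_LOC = 10        # implied by "100M tokens ~= 10M lines of code"

context_tokens = 100_000_000

novels = context_tokens / (WORDS_PER_NOVEL * TOKENS_PER_WORD)
lines_of_code = context_tokens / TOKENS_PER_LOC

print(f"~{novels:.0f} novels")                  # ~750 novels
print(f"~{lines_of_code:,.0f} lines of code")   # ~10,000,000 lines of code
```

Under these assumptions the arithmetic lands exactly on the figures quoted: 100M / (100,000 × 1.33) ≈ 750 novels, and 100M / 10 = 10 million lines of code.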