Microsoft rewrites the rules of the game: its ultra-lightweight AI already competes with DeepSeek and OpenAI using only your CPU https://t.co/VyPACgOHwW
Microsoft released bitnet.cpp: A blazing-fast open-source 1-bit LLM inference framework that runs directly on CPUs. You can now run 100B parameter models on local x86 CPU devices with up to 6x speed improvements and 82% less energy consumption. 100% Open Source https://t.co/joV5AY4ukN
Microsoft unveils BitNet b1.58 2B4T: an ultra-efficient AI that runs on CPU 👇https://t.co/BC73pgVuvS
Microsoft Research has introduced BitNet b1.58 2B4T, a compact '1-bit' language model designed to operate efficiently on standard CPUs. It is accompanied by bitnet.cpp, an open-source inference framework that enables fast inference of 1-bit large language models, with up to 100 billion parameters, on local x86 CPU devices. This approach offers up to six times faster processing and reduces energy consumption by as much as 82% compared to conventional full-precision inference. Microsoft's release positions its lightweight AI model as a competitive alternative to existing solutions like DeepSeek and OpenAI, emphasizing accessibility and efficiency without requiring specialized hardware.
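The "1-bit" (more precisely, 1.58-bit) efficiency comes from constraining each weight to the ternary set {-1, 0, +1}, so matrix multiplications reduce to additions and subtractions. The sketch below illustrates the absmean quantization scheme described in the BitNet b1.58 paper: scale the weight matrix by its mean absolute value, then round and clip each entry to ternary. This is a minimal NumPy illustration, not code from bitnet.cpp; the function name is ours.

```python
import numpy as np

def absmean_ternary_quantize(w, eps=1e-5):
    """Quantize a float weight matrix to ternary values {-1, 0, +1}.

    Follows the absmean scheme from the BitNet b1.58 paper:
    scale by the mean absolute value, then round and clip.
    Returns the ternary matrix and the scale used to dequantize.
    """
    scale = np.mean(np.abs(w)) + eps          # per-tensor scaling factor
    q = np.clip(np.round(w / scale), -1, 1)   # round to nearest, clip to ternary
    return q.astype(np.int8), scale

# Toy example: a 2x3 weight matrix.
w = np.array([[0.9, -0.04, 0.5],
              [-1.2, 0.02, -0.4]])
q, scale = absmean_ternary_quantize(w)
# q holds only -1, 0, and +1; w is approximated by q * scale,
# so inference needs no floating-point multiplications for the weights.
```

Because every weight is one of three values, a dot product becomes a signed sum of activations, which is what lets frameworks like bitnet.cpp run large models fast on commodity CPUs.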