
Nous Research has released a preliminary report on DisTrO (Distributed Training Over-the-Internet), a new system that dramatically reduces the communication required to train AI models. DisTrO enables decentralized training by cutting inter-GPU bandwidth needs by 1,000x to 10,000x: the report demonstrates an 857x reduction in pre-training, with potential for up to 3,000x in pre-training and 10,000x in fine-tuning. This advance targets the communication bottleneck in decentralized training, allowing large neural networks to be trained efficiently over slow internet connections. The project's open-source nature is expected to foster collaboration and innovation in AI, potentially democratizing access to training powerful AI models.
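To put those reduction figures in perspective, here is a rough back-of-the-envelope sketch. The model size (1.2B parameters) matches the pre-training run described in the report; the gradient precision and all-reduce cost factor are assumptions for illustration, not details from the report:

```python
# Back-of-the-envelope: per-step communication volume in data-parallel training.
# Assumptions (illustrative, not from the DisTrO report): fp16 gradients and a
# ring all-reduce, which moves roughly 2x the gradient size per rank per step.

PARAMS = 1.2e9          # model size used in the reported pre-training test
BYTES_PER_GRAD = 2      # fp16 gradient; an assumption for this sketch
RING_FACTOR = 2         # ring all-reduce sends ~2 * (N-1)/N bytes per rank

naive_bytes = PARAMS * BYTES_PER_GRAD * RING_FACTOR   # ~4.8 GB per step
distro_bytes = naive_bytes / 857                      # the reported 857x reduction

print(f"naive all-reduce per step : {naive_bytes / 1e9:.2f} GB")
print(f"with 857x reduction       : {distro_bytes / 1e6:.2f} MB")
# ~5.6 MB per step fits comfortably over an ordinary internet connection,
# which is what makes training "over slow internet connections" plausible.
```

A datacenter interconnect absorbs gigabytes per step easily; a consumer uplink does not, which is why a three-to-four order-of-magnitude reduction is the difference between decentralized training being impractical and feasible.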
‘This could change everything!’ Nous Research unveils new tool to train powerful AI models with 10,000x efficiency: Ultimately, the DisTrO method could open the door to many more people being able to train massively powerful AI models. https://t.co/7BPy96ySxm #AI #Business
We might be able to train large AI models in a fraction of the time—and cost. @NousResearch has developed an AI model training optimizer, DisTrO, which uses a decentralized system to distribute the load across multiple networks. Learn the details ⬇️ https://t.co/QwHTjGPCqt…