Nous Research has taken a significant step in decentralized artificial intelligence by pre-training a 15-billion-parameter language model with its DisTrO approach. The method enables distributed training over the internet, letting users around the world connect their computers for collaborative training. DisTrO reportedly reduces the communication required between GPUs by a factor of 10,000, a substantial leap for the field. The growing interest in decentralized training is seen as a pivotal development, one that could reduce the dominance of large tech companies in AI, and enthusiasts and experts across the AI community have highlighted it as a key innovation for the future.
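DisTrO's actual mechanism is not described in this summary. As a generic, hypothetical illustration of how inter-GPU communication can be cut by orders of magnitude, the sketch below uses top-k gradient compression (a well-known technique, not necessarily what Nous uses): instead of exchanging a full gradient tensor every step, each worker transmits only the indices and values of its largest-magnitude entries. All function names here are illustrative.

```python
import numpy as np

def topk_compress(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of a gradient.

    Returns (indices, values) -- the payload actually communicated
    instead of the full dense tensor.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def topk_decompress(idx: np.ndarray, vals: np.ndarray, shape):
    """Rebuild a sparse gradient from the transmitted (indices, values)."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

# Example: a 1M-parameter gradient, transmitting only the top 100 entries
rng = np.random.default_rng(0)
grad = rng.standard_normal(1_000_000)
idx, vals = topk_compress(grad, k=100)
restored = topk_decompress(idx, vals, grad.shape)

# Communication volume drops from 1,000,000 values to 100 -- a 10,000x
# reduction in what crosses the network each step.
ratio = grad.size // vals.size
print(f"compression ratio: {ratio}x")
```

In practice, schemes like this pair the sparse exchange with error feedback (accumulating the untransmitted residual locally) so the model still converges; this sketch shows only the bandwidth-reduction idea.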
it would appear ai decentralization is accelerating. this is early work, and we'll see how this particular approach pans out, but I'm really happy to see such strong, concrete steps from Nous. it's also amazing that they're already training a 15B param model with it. bullish https://t.co/SsQqYzbUiw
This is just so beautiful. People worldwide connect their computers to train a 15B parameter language model over the internet and live-stream the process. And Nous Research's DisTrO tech reduces the communication between GPUs by 10,000x https://t.co/XooWvIKjmH
Distributed training over the internet! https://t.co/qZh5dGrsuy