Nous Research has launched Psyche, a decentralized training network designed to democratize artificial intelligence development. Psyche coordinates the training of a 40-billion-parameter open-source large language model (LLM) across distributed, heterogeneous hardware worldwide, bypassing the need for massive centralized infrastructure. The network uses the Solana blockchain for secure on-chain coordination and incorporates DisTrO technology to overcome bandwidth bottlenecks between nodes. The initiative includes a $500,000 compute donation pool, which was filled in 44 minutes, supporting the largest decentralized pre-training run conducted over the internet. The model is being trained on over 20 trillion tokens and uses a multi-head latent attention (MLA) design to improve long-context efficiency. The effort represents a notable advance in community-owned global AI compute and open-source AI development.
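MLA improves long-context efficiency by projecting each token into a small shared latent vector and re-expanding it into per-head keys and values at attention time, so only the compact latent needs to be cached per token. The sketch below is a minimal, illustrative PyTorch version of that idea; the dimensions, layer names, and the omission of RoPE decoupling and causal masking are simplifying assumptions, not details of Psyche's actual 40B model.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimplifiedMLA(nn.Module):
    # Caches one small latent per token instead of full per-head K/V tensors,
    # which is how MLA shrinks the KV cache for long contexts.
    def __init__(self, d_model=1024, n_heads=8, d_latent=128):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.kv_down = nn.Linear(d_model, d_latent)   # compress token -> latent
        self.k_up = nn.Linear(d_latent, d_model)      # expand latent -> per-head keys
        self.v_up = nn.Linear(d_latent, d_model)      # expand latent -> per-head values
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x, latent_cache=None):
        B, T, _ = x.shape
        latent = self.kv_down(x)                      # (B, T, d_latent): only this is cached
        if latent_cache is not None:
            latent = torch.cat([latent_cache, latent], dim=1)
        S = latent.shape[1]
        q = self.q_proj(x).view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = self.k_up(latent).view(B, S, self.n_heads, self.d_head).transpose(1, 2)
        v = self.v_up(latent).view(B, S, self.n_heads, self.d_head).transpose(1, 2)
        out = F.scaled_dot_product_attention(q, k, v) # causal mask omitted for brevity
        out = out.transpose(1, 2).reshape(B, T, -1)
        return self.out_proj(out), latent             # the latent doubles as the KV cache

# Usage: x = torch.randn(2, 16, 1024); y, cache = SimplifiedMLA()(x)

The cache stored per token is d_latent wide rather than n_heads * d_head * 2, which is where the long-context savings come from.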
Web3 promised freedom. But much of it is still running on the same old infrastructure – centralized, surveilled, and controlled. ThreeFold flips the script: peer-to-peer, people-powered, truly free. The future won’t be leased. https://t.co/3Da8cbad3G
Decentralized Futures: Empowering AI, Blockchain, and Web3 Ecosystems through Scalable Compute and Tokenized Innovations https://t.co/52gEZjMnXX
Fine-tuning massive LLMs requires reliability at scale. The Lightning open source stack gives you just that. Check out Democratizing AI: Open-source Scalable LLM Training on GPU-based Supercomputers. ✅ 405B param MoE language model ✅ Fully finetuned on 32,768 GPUs on a https://t.co/MTaB47Idi2
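For readers who want a concrete starting point with the Lightning open-source stack, the snippet below is a minimal sketch of sharded fine-tuning using Lightning Fabric with FSDP; the toy model, random dataset, and device/node counts are placeholders, not the configuration behind the 405B-parameter run referenced above.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from lightning.fabric import Fabric

# Placeholder model and data standing in for a real LLM checkpoint and corpus.
model = nn.Sequential(nn.Linear(512, 2048), nn.GELU(), nn.Linear(2048, 512))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
dataset = TensorDataset(torch.randn(1024, 512), torch.randn(1024, 512))

# FSDP shards parameters, gradients, and optimizer state across all GPUs;
# on a cluster this script would be started by an external launcher (e.g. torchrun or SLURM).
fabric = Fabric(accelerator="cuda", devices=8, num_nodes=4,
                strategy="fsdp", precision="bf16-mixed")
fabric.launch()

model, optimizer = fabric.setup(model, optimizer)
loader = fabric.setup_dataloaders(DataLoader(dataset, batch_size=8))

for inputs, targets in loader:
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    fabric.backward(loss)   # handles mixed precision and sharded gradient sync
    optimizer.step()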