
xAI has made a notable contribution to the AI community by releasing the base model weights and network architecture of Grok-1, its 314 billion parameter Mixture-of-Experts (MoE) model, under the Apache 2.0 license. Grok-1 was trained from scratch by xAI and is a raw base model, not fine-tuned for any specific task, which leaves it open for researchers and developers to adapt. Its MoE architecture activates roughly 25% of the weights for any given token, and pretraining was completed in October 2023. The weights are available for download via a magnet link provided in the bio of the Grok Twitter account. Much of the open-source community has welcomed the release, arguing it could accelerate innovation and deepen understanding of large AI systems, while some critics counter that without the training data the model's utility may be limited. Even so, the release of Grok-1 is widely seen as a step toward making powerful AI models more accessible.
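To make the "25% of the weights" figure concrete: in an MoE layer, a small router scores each token against every expert, and only the top-scoring experts' feed-forward weights are evaluated for that token. The sketch below is illustrative only and is not taken from xAI's released code; it assumes a 2-of-8 expert configuration (consistent with the activation ratio cited above), uses toy dimensions, and all names (init_params, moe_layer, D_MODEL, etc.) are invented for the example.

```python
# Minimal sketch of top-k Mixture-of-Experts routing, showing how only a
# fraction of a model's weights is engaged per token. Assumptions (not from
# the article): 8 experts with 2 active per token; toy layer sizes.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # total experts in the MoE layer
TOP_K = 2         # experts evaluated per token (2 of 8 = 25%)
D_MODEL = 16      # toy hidden size
D_FF = 32         # toy feed-forward size

def init_params(key):
    k_router, k_w1, k_w2 = jax.random.split(key, 3)
    return {
        # Router: maps each token to one score per expert.
        "router": jax.random.normal(k_router, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # Per-expert feed-forward weights.
        "w1": jax.random.normal(k_w1, (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w2": jax.random.normal(k_w2, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    """x: [tokens, D_MODEL] -> [tokens, D_MODEL], touching only TOP_K experts per token."""
    logits = x @ params["router"]                     # [tokens, NUM_EXPERTS]
    top_vals, top_idx = jax.lax.top_k(logits, TOP_K)  # pick the best TOP_K experts per token
    gates = jax.nn.softmax(top_vals, axis=-1)         # renormalize over the selected experts

    def per_token(tok, idx, gate):
        # Gather and evaluate only the selected experts' weights.
        w1 = params["w1"][idx]                        # [TOP_K, D_MODEL, D_FF]
        w2 = params["w2"][idx]                        # [TOP_K, D_FF, D_MODEL]
        h = jax.nn.gelu(jnp.einsum("d,kdf->kf", tok, w1))
        out = jnp.einsum("kf,kfd->kd", h, w2)
        return jnp.einsum("k,kd->d", gate, out)       # gate-weighted sum of expert outputs

    return jax.vmap(per_token)(x, top_idx, gates)

params = init_params(jax.random.PRNGKey(0))
tokens = jax.random.normal(jax.random.PRNGKey(1), (4, D_MODEL))
print(moe_layer(params, tokens).shape)                # (4, 16)
```

At Grok-1's actual scale, the experts would be sharded across many accelerators and the routing fused into the training and inference stack; this toy version only illustrates the per-token expert selection that keeps most of the 314 billion parameters idle on any single token.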

"Grok’s release, then, marks not only a flash point in a battle between companies but also, perhaps, a turning point across the industry. OpenAI’s commitment to secrecy is starting to seem like an anachronism." New from @matteo_wong https://t.co/9qZkAYvvI0
🤖🇺🇸 Musk's xAI unveils Grok, an 'open-source' AI chatbot. But does 'open-sourcing' in AI really exist, or is it merely a marketing ploy? https://t.co/fmNnUDRFBh
Why Elon Musk’s AI company ‘open-sourcing’ Grok matters – and why it doesn’t: https://t.co/M3oAnojcui by TechCrunch #infosec #cybersecurity #technology #news