Elon Musk’s xAI has open sourced the base model weights and architecture of its Grok AI model, but not the training code. The company describes it on GitHub as a “314 billion parameter Mixture-of-Experts model.”
In a blog post, xAI said the model is not fine-tuned for any specific application, such as dialogue. The company noted that Grok-1 was trained on a “custom” stack, without giving further details.
Last week, Musk said on X that xAI would open source the Grok model this week. The company launched Grok last year as a chatbot available to Premium+ subscribers of the X social network.
Many well-known organizations have open sourced some of their AI models, including Meta with its Llama models, Mistral, TII with Falcon, and AI2. In February, Google also released two open models, Gemma 2B and Gemma 7B.
Some makers of AI-based tools are already talking about using Grok in their products. Perplexity CEO Aravind Srinivas posted on X that the company will fine-tune Grok for interactive search and make it available to Pro users.
Musk has been locked in a legal battle with OpenAI, suing the company earlier this month for what he calls a “betrayal” of its non-profit AI mission. Since then, he has repeatedly called out OpenAI and Sam Altman on X.