Musk Confirms Grok 2 Coming in August, Grok 3 by End of the Year
Elon’s xAI startup is close to launching its second-generation AI model
Elon Musk's xAI startup is set to launch the next generation of its Grok language model in August, with plans for further advancements as it aims to compete with OpenAI.
Musk confirmed in a post on X (formerly Twitter) that Grok 2, the startup's next large language model, will be released in a few weeks. Responding to a user’s question about training data, he described the model as a "giant improvement."
Both Musk and xAI have released little information on the model. In March, Musk said Grok 2 would exceed current generation AI models “on all metrics.”
Grok 2’s launch will be followed by Grok 3, which Musk said would drop around the end of the year.
Musk said xAI has been training Grok 3, a model on par with “or beyond” GPT-5, the as-yet-unreleased OpenAI model touted to be the next great leap in language models.
Training Grok 3 requires a cluster of 100,000 Nvidia H100 GPUs, with Musk saying the model will be “really something special.”
Musk said in a recent X Spaces conversation that as xAI scales with Grok 3’s massive size, it’s running into issues with access to data. He said his team is looking at adding synthetic data or data from videos to Grok 3’s training corpus.
The first version of Grok launched in November 2023 shortly after Musk founded xAI to compete with OpenAI. The startup has since raised $6 billion and is valued at $24 billion.
In April xAI launched Grok 1.5 with improved reasoning capabilities and the ability to handle larger text inputs thanks to an expanded context length.
xAI’s Grok line of models is more rebellious and satirical than those built by OpenAI, designed to generate responses with fewer filters.
Following the release of Grok 1.5, Musk and the xAI team have been working on Grok 2, relying on cloud services from Oracle and several of X’s data centers, as the startup doesn’t have its own dedicated training infrastructure.
Oracle founder and CTO Larry Ellison said in the company’s Q2 earnings call that xAI “want(s) a lot more GPUs than we gave them.”
“We gave them quite a few, but they wanted more, and we're in the process of getting them more,” Ellison said.
To expand xAI’s training efforts, Musk has brought in Nvidia, Dell and Supermicro to build the startup a “gigafactory of compute,” which he hopes will be the largest supercomputer in the world.