Musk's xAI Plans World's Largest Supercomputer in Memphis

Elon Musk's xAI plans to build a "gigafactory of compute" in Memphis, aiming to house the world's most powerful supercomputer

Ben Wodecki, Jr. Editor

June 10, 2024


Elon Musk's xAI plans to build the world's largest supercomputer in Memphis, Tennessee, to support its AI model training and inference efforts.

xAI currently relies on data centers from X (formerly Twitter) and cloud services from Oracle to train and improve its Grok foundation model.

The proposed project would see the construction of a dedicated "gigafactory of compute," giving the startup its own AI infrastructure center.

The project is still pending approval by the Memphis Shelby County Economic Development Growth Engine and other local authorities. 

“Memphis is a city of innovators, so it’s no surprise that it feels like home to those looking to change the world,” said Paul Young, the mayor of Memphis. “We had an ideal site, ripe for investment. And we had the power of our people who created new and innovative processes to keep up with the pace required to land this transformational project.”

The project would be the largest capital investment by a new-to-market company in the city’s history.

“The good-paying jobs, the cachet of hosting the world’s most powerful supercomputer and the significant additional revenues for Memphis Light, Gas and Water (MLGW) this project brings will help support our reliability and grid modernization efforts,” said Doug McGowen, MLGW’s CEO. “These are all wins for our community.”


xAI recently raised $6 billion to fuel its AI efforts, with the year-old company now valued at $24 billion. The startup said part of its new funds would be spent on building advanced infrastructure.

Musk founded the startup to rival OpenAI, creating Grok, an AI designed to answer questions with wit and a rebellious streak.

The model was built in just four months, in part thanks to Project IDE, the startup's development environment for prompt engineering, which enables engineers to refine the model's outputs.

xAI is not the only Musk company building a powerful supercomputer.

For years, Tesla has been quietly working on Dojo, a supercomputing project first showcased in 2021, offering only sporadic updates on its construction. The first cluster came online in July 2023, when neural network training commenced.

Tesla has been snapping up hardware to power its AI training efforts. During its April first-quarter earnings call, Musk said the company planned to acquire around 85,000 H100 GPUs from Nvidia.

However, some shipments of the Nvidia chips are being diverted from Tesla to xAI.

According to emails seen by CNBC, some 12,000 H100 GPUs originally slated for Tesla have been diverted to X, whose infrastructure xAI has been using to further refine Grok.


The title of world's most powerful supercomputer currently belongs to Frontier, which held onto first place in the most recent TOP500 list, published in May.

xAI and Tesla's supercomputing projects face tough competition for the title from powerful systems including Aurora, which came second on the list despite not yet being finished.


About the Author

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
