
Elon Musk wants to buy 300,000 Nvidia Blackwells to power new xAI ambitions

WHY THIS MATTERS IN BRIEF

As AI models get ever bigger and more energy-hungry, buyers are lining up around the block for Nvidia’s latest ultra-powerful GPUs.


Artificial Intelligence (AI) GPU development is running at breakneck speed, and Elon Musk wants to be at the forefront of the revolution. In an X post, the SpaceX and Tesla CEO revealed that he wants to buy 300,000 of Nvidia’s latest Blackwell B200 GPUs by next summer. The new GPUs will upgrade X’s existing AI GPU cluster, which currently consists of 100,000 previous-generation H100 GPUs.

100,000 H100 GPUs already represents an enormous amount of computing power, but Musk says that, given the pace of AI GPU development, it isn’t worth keeping X’s massive array of H100s around for long, mainly because of its huge energy consumption, which he puts at roughly 1 Gigawatt.

X uses the massive array of AI GPUs for Grok, an AI bot built on a homegrown Large Language Model (LLM) dubbed Grok-1, which is geared to give less straightforward, wittier, and more comedic answers than ChatGPT and Gemini. Basically, it is trying to take the “robot” out of the AI bot. Grok is available to X users right now; however, you’ll need to be an X Premium subscriber to gain access.

Musk’s logic has merit. The AI GPU development race is one of the most heated races the technology industry has seen in years, rivalling the CPU development wars of the 1990s and 2000s. Nvidia’s new Blackwell B200 is a massive upgrade over the H100, offering four times the training performance and up to 30 times the inference performance.

Technically, the B200 does consume more power per chip. However, the B200’s colossal performance improvements mean it runs significantly more efficiently than the H100. In Musk’s case, trading 100,000 H100s for three times as many GPUs that each draw more power is still a net win, thanks to the enormous additional AI performance on offer.
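To see why the trade works, here is a back-of-the-envelope sketch of the two clusters. All figures are illustrative assumptions, not confirmed specs: roughly 700 W per H100, roughly 1,000 W per B200, and Nvidia’s headline claim of about 4x training performance per GPU.

```python
# Rough comparison of the two clusters described above.
# Assumed figures: H100 board power ~700 W, B200 ~1,000 W,
# and a ~4x per-GPU training speedup for the B200 (Nvidia's claim).

H100_COUNT, B200_COUNT = 100_000, 300_000
H100_WATTS, B200_WATTS = 700, 1_000      # assumed per-GPU power draw
TRAIN_SPEEDUP = 4                        # assumed B200 speedup vs H100

# Aggregate training compute in "H100-equivalents" and power in megawatts
h100_train = H100_COUNT                  # 100,000 H100-equivalents
b200_train = B200_COUNT * TRAIN_SPEEDUP  # 1,200,000 H100-equivalents
h100_mw = H100_COUNT * H100_WATTS / 1e6  # 70 MW
b200_mw = B200_COUNT * B200_WATTS / 1e6  # 300 MW

gain = (b200_train / b200_mw) / (h100_train / h100_mw)
print(f"H100 cluster: {h100_train:,} equivalents at {h100_mw:.0f} MW")
print(f"B200 cluster: {b200_train:,} equivalents at {b200_mw:.0f} MW")
print(f"Training performance-per-watt gain: {gain:.1f}x")
```

Under these assumed numbers, the bigger B200 cluster draws more than four times the power but delivers twelve times the training compute, so performance-per-watt still improves by roughly 2.8x.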

It’ll be interesting to see when Musk actually gets his hands on all 300,000 B200 GPUs. If the H100 has taught us anything, it’s that demand for Nvidia’s AI GPUs always outstrips supply. We will probably see a repeat of 2023, when all the big AI customers, including X, Meta, Google, and Microsoft, fought to grab as many GPUs as Nvidia could pump out; expect the same scramble for B200s for at least the next several months.
