Elon Musk Forecasts Need for 100,000 Nvidia H100 GPUs to Train Grok 3 Model


Elon Musk, the CEO of Tesla and the founder of xAI, made bold predictions about the arrival of artificial general intelligence (AGI) and discussed the obstacles facing the AI sector. He foresees AGI potentially surpassing human intelligence as early as next year or by 2026, though reaching that milestone will require an enormous number of processors, which in turn demand substantial electricity, according to a Reuters report.


Elon Musk About Training Grok 3 Model

Elon Musk’s company xAI is currently training the second iteration of its Grok large language model and aims to complete this training phase by May. Training Grok version 2 required up to 20,000 Nvidia H100 GPUs, and Musk expects future versions to demand even more resources: around 100,000 Nvidia H100 chips to train the Grok 3 model.

Musk emphasized that the advancement of AI faces two primary impediments: a shortage of advanced processors such as Nvidia’s H100, since 100,000 units cannot be acquired quickly, and the availability of electricity.

For context, a single Nvidia H100 GPU consumes about 700 W at full load, so 100,000 of them would draw roughly 70 megawatts. Including the host servers and cooling, it is safe to assume that a data center with 100,000 Nvidia H100s would consume about 100 MW, comparable to the power usage of a small town.
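The arithmetic behind those figures can be sketched as follows. The 700 W per-GPU figure comes from the article; the 1.4x multiplier for servers and cooling is an illustrative assumption chosen to land near the stated ~100 MW total, not a measured overhead.

```python
# Back-of-the-envelope power estimate for a 100,000-GPU H100 cluster.
GPU_COUNT = 100_000
GPU_POWER_W = 700          # approximate H100 draw at full load (from the article)
OVERHEAD_FACTOR = 1.4      # assumed multiplier for host servers + cooling

gpu_power_mw = GPU_COUNT * GPU_POWER_W / 1_000_000   # watts -> megawatts
total_power_mw = gpu_power_mw * OVERHEAD_FACTOR

print(f"GPUs alone: {gpu_power_mw:.0f} MW")            # 70 MW
print(f"With servers and cooling: {total_power_mw:.0f} MW")  # 98 MW, i.e. ~100 MW
```

With a different overhead assumption the total shifts accordingly, but the order of magnitude (roughly a small town's electricity demand) is robust.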

Musk added that while the shortage of compute GPUs has been a massive limitation so far, the most limiting factor over the next one to two years will be access to electricity. This dual constraint makes it exceedingly challenging to scale AI to meet growing computational demands.

Despite these constraints, advances in compute and storage architecture should allow ever-larger language models to be trained in the coming years. At GTC 2024, Nvidia unveiled its Blackwell B200, a GPU architecture and accompanying platform designed to scale to language models with trillions of parameters, which will be critical to creating AGI.

In fact, Musk believes an artificial general intelligence markedly smarter than the smartest human could be created within the next one to two years. “If you say AGI is smarter than the smartest human, it’s probably next year or within two years,” Musk said on a podcast, a prediction that stokes both Terminator-style fears and hopes for a benign, Skynet-free AGI future.
