Elon Musk has reportedly purchased thousands of GPUs for Twitter's new AI project


Despite advocating for an industry-wide moratorium on AI training, Elon Musk is said to have launched a major artificial intelligence project within Twitter. According to Business Insider, the company has already purchased approximately 10,000 GPUs and hired DeepMind AI talent for the project, which involves a large language model (LLM).

Elon Musk’s AI project is still in its early stages.

According to the report, the purchase of a significant amount of additional computational power signals his commitment to advancing the project. The generative AI's exact purpose is unknown, but potential applications include improving search functionality and generating targeted advertising content.

[Image: Elon Musk. Credit: Tom's Hardware]

At this time, it is unknown what specific hardware Twitter purchased. Despite Twitter's ongoing financial problems, which Elon Musk has described as an 'unstable financial situation', the company has reportedly spent tens of millions of dollars on these compute GPUs. They will most likely be installed in one of Twitter's two remaining data centres, with Atlanta the most likely location. Notably, Musk shut down Twitter's primary data centre in Sacramento in late December, reducing the company's compute capacity.

Beyond purchasing GPU hardware, Twitter is also hiring engineers for its generative AI project. Earlier this year, the company recruited Igor Babuschkin and Manuel Kroiss from Alphabet's AI research subsidiary DeepMind. Since at least February, Elon Musk has been actively seeking AI talent to build a rival to OpenAI's ChatGPT.

OpenAI trained its ChatGPT bot on Nvidia's A100 GPUs and continues to run it on those machines. Nvidia has since released the A100's successor, the H100 compute GPU, which is several times faster at roughly the same power draw. Twitter's AI project will most likely use Nvidia's Hopper H100 or similar hardware, though this is speculation. Given that the company has yet to decide what its AI project will be used for, estimating how many Hopper GPUs it would require is difficult.
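To see why the GPU count is hard to pin down, here is a back-of-envelope sketch using the common approximation that training a transformer costs about 6 × parameters × tokens in floating-point operations. Every number below (model size, token count, per-GPU throughput, utilization, training window) is an illustrative assumption, not anything reported about Twitter's project:

```python
import math

def gpus_needed(params: float, tokens: float,
                gpu_flops: float = 1e15,   # assumed peak per-GPU throughput (order of an H100)
                utilization: float = 0.4,  # assumed real-world efficiency of a large cluster
                days: float = 30.0) -> int:
    """Rough GPU count for one training run.

    Uses the approximation: training FLOPs ~= 6 * params * tokens.
    All defaults are illustrative assumptions for a sketch, nothing more.
    """
    total_flops = 6 * params * tokens
    effective_flops_per_gpu = gpu_flops * utilization
    seconds = days * 24 * 3600
    return math.ceil(total_flops / (effective_flops_per_gpu * seconds))

# Hypothetical 70B-parameter model trained on 1.4T tokens in 30 days:
print(gpus_needed(70e9, 1.4e12))  # prints 568
```

Changing any assumption shifts the answer by a large factor, which is why 10,000 GPUs could be anywhere from generous to tight depending on what Twitter ultimately trains.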


