Microsoft Corp. is working with Advanced Micro Devices Inc. on the chipmaker's push into artificial intelligence processors, according to people with knowledge of the situation, part of a multipronged strategy to secure more of the highly sought-after components.
The companies are teaming up to offer an alternative to Nvidia Corp., which dominates the market for AI-capable graphics processing units, said the people, who asked not to be identified because the matter is private. The software giant is backing AMD's efforts with engineering resources and is also working with the chipmaker on Athena, a custom-built Microsoft chip for AI workloads, the people said.
AMD shares rose more than 6.5% on Thursday, while Microsoft gained about 1% and Nvidia fell 1.9%. Representatives for AMD declined to comment.
The rush to add AI processing power has intensified with the explosion of ChatGPT and other chatbot services built on the technology, leaving the necessary chips in high demand. Microsoft, a leader in cloud computing and an early mover in applying AI, has pledged to build such features into its entire software lineup and has invested $10 billion in OpenAI, the maker of ChatGPT.
The move also reflects Microsoft's deepening involvement in the chip industry. Over the past few years the company has built up a silicon division under former Intel Corp. executive Rani Borkar, and that unit now employs close to 1,000 people. The Information reported on the Athena AI chip last month.
Microsoft has spent about $2 billion on its chip efforts, and several hundred of those employees are working on the Athena project, one of the people said. The project doesn't signal a split with Nvidia, however: Microsoft plans to keep working closely with the company, whose chips are used to train and run AI systems.
Microsoft's partnership with OpenAI and its own slate of recently launched AI services require more computing power than the company anticipated when it ordered chips and built its data centres. Businesses have expressed interest in integrating OpenAI's ChatGPT service into their own products and internal programmes, and Microsoft has unveiled a chat-based Bing and new AI-enhanced Office tools.
The company is also updating older products such as GitHub's code-generation tool. All of those AI programmes run in Microsoft's Azure cloud and depend on the expensive, powerful processors Nvidia sells.
Developing a lineup that can compete with Nvidia's will be difficult. The company offers an integrated package of hardware and software, spanning chips, servers, networking equipment, and a programming language, that lets customers quickly scale up their AI capabilities.
That combination is one reason Nvidia rose to prominence. Still, Microsoft isn't the only company trying to build its own AI processors. Amazon, a rival in cloud computing, acquired Annapurna Labs in 2015 and has since developed two AI processors of its own. Alphabet Inc.'s Google also has its own AI training chip.