AMD Powers Oracle Cloud’s Latest AI Compute Supercluster with MI300X Accelerators


AMD (NASDAQ: AMD) announced today that Oracle Cloud Infrastructure (OCI) has selected AMD Instinct™ MI300X accelerators, paired with ROCm™ open software, to drive its newest OCI Compute Supercluster instance, the BM.GPU.MI300X.8.

AMD Instinct MI300X Accelerators Available on Oracle Cloud Infrastructure for Demanding AI Applications

Designed for demanding AI tasks like large language model (LLM) training and inference, the instance can scale up to 16,384 GPUs in a single cluster, connected by OCI’s ultrafast network fabric technology. This powerful infrastructure is already enabling companies like Fireworks AI to scale their AI operations.

“AMD Instinct MI300X and ROCm open software continue to prove themselves as leading solutions for AI workloads,” said Andrew Dieckmann, corporate VP and general manager of AMD’s Data Center GPU Business. “As AI adoption grows, this combination provides OCI customers with unmatched performance and flexibility.”

Donald Lu, senior VP of software development at Oracle Cloud Infrastructure, emphasized that the MI300X accelerators eliminate the overhead of virtualized computing, offering a cost-effective solution for companies looking to accelerate AI workloads.

Optimized for AI Training and Inference

The AMD Instinct MI300X has been extensively tested, proving its capability to handle latency-sensitive AI inference and large-batch training workloads. Companies like Fireworks AI, which deploys generative AI systems, are already leveraging the MI300X’s memory capacity and performance to scale their models and serve diverse industries.
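For readers curious how such workloads target the MI300X in practice, the sketch below is a minimal, illustrative example rather than Oracle’s or Fireworks AI’s actual deployment code. It assumes a ROCm build of PyTorch on an MI300X instance, where AMD GPUs are exposed through the standard torch.cuda API (HIP backend), and it uses a batched half-precision matrix multiply purely as a stand-in for an inference step.

# Minimal sketch: detect an AMD Instinct GPU under a ROCm build of PyTorch
# and run a small half-precision workload on it. Sizes are illustrative.
import torch

# ROCm builds of PyTorch expose AMD GPUs through the torch.cuda namespace.
if torch.cuda.is_available():
    device = torch.device("cuda")
    print(f"Running on: {torch.cuda.get_device_name(0)}")  # e.g. an MI300X on OCI
else:
    device = torch.device("cpu")
    print("No ROCm-visible GPU found; falling back to CPU")

# Placeholder workload standing in for an LLM inference step:
# a batched matrix multiply in float16.
x = torch.randn(8, 4096, 4096, dtype=torch.float16, device=device)
w = torch.randn(4096, 4096, dtype=torch.float16, device=device)
with torch.no_grad():
    y = x @ w
print(y.shape)

Because ROCm reuses the familiar CUDA-style device APIs in PyTorch, existing framework-level code typically runs on MI300X instances with little or no modification, which is part of the portability argument made for the platform.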
