
HBM3 memory will be the main driving force in next-gen GPUs

by Nivedita Bangari
August 3, 2023
in News, Technology

According to TrendForce, next-generation HBM3 and HBM3e memory will dominate the AI GPU sector, driven by a sharp surge in demand from companies looking to incorporate high-bandwidth DRAM. The current NVIDIA A100 and H100 AI GPUs are powered by HBM2e and HBM3 memory, which launched in 2018 and 2020, respectively. Several manufacturers, including Micron, SK Hynix, and Samsung, are swiftly setting up mass-production facilities for newer, faster HBM3 memory, and it won’t be long before it becomes the new standard.

One detail that often goes unnoticed is that HBM3 will not be a single specification. According to TrendForce, it will arrive in a range of tiers: lower-end HBM3 is expected to run at 5.6 to 6.4 Gbps per pin, with higher-end variants exceeding 8 Gbps.
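As a rough, back-of-the-envelope illustration (not from the article): assuming the usual 1024-bit interface per HBM stack, those per-pin rates translate into per-stack bandwidth roughly as sketched below.

```python
# Illustrative HBM bandwidth estimate; the article only quotes per-pin rates.
# The 1024-bit bus width per stack is an assumption, not stated in the article.

BUS_WIDTH_BITS = 1024  # assumed interface width per HBM stack

def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Approximate per-stack bandwidth in GB/s for a given per-pin data rate."""
    return pin_rate_gbps * bus_width_bits / 8  # convert bits to bytes

for rate in (5.6, 6.4, 8.0):  # per-pin rates quoted by TrendForce
    print(f"{rate} Gbps per pin -> ~{stack_bandwidth_gbs(rate):.0f} GB/s per stack")

# Approximate output:
# 5.6 Gbps per pin -> ~717 GB/s per stack
# 6.4 Gbps per pin -> ~819 GB/s per stack
# 8.0 Gbps per pin -> ~1024 GB/s per stack
```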


[Image: HBM3 | credit: wccftech]

Future AI GPUs, such as AMD’s Instinct MI300 accelerators and NVIDIA’s H100, are slated to use the next-generation HBM3 process, where SK Hynix holds an advantage: it has already reached the manufacturing stage and has received a sample request from NVIDIA.

Micron has also just disclosed plans for its future HBM4 memory design, but that is not due until 2026, so NVIDIA’s planned Blackwell GPUs, dubbed “GB100,” will most likely use the faster HBM3 variants when they arrive between 2024 and 2025. The memory, built on 5th-generation (10nm-class) process technology, is projected to enter mass production in the first half of 2024, with both Samsung and SK Hynix ramping up output.

Competitors such as Samsung and Micron are stepping on the accelerator, according to multiple sources. Samsung has reportedly proposed that NVIDIA handle both wafer and memory procurement through its subsidiaries, while Micron has reportedly partnered with TSMC to supply memory for NVIDIA’s AI GPUs.


Source

Tags: AMD, GPUs, HBM3 memory, NVIDIA