During its Memory Tech Day 2023, Samsung unveiled Shinebolt, its next-generation HBM3E DRAM, building on its experience commercializing the industry's first HBM2 in 2016 and developing the HBM market for high-performance computing (HPC). Samsung positions Shinebolt as the memory that will power next-generation AI applications, lowering total cost of ownership (TCO) and accelerating training and inference of AI models in the data center.
HBM3E delivers 9.8 gigabits per second (Gbps) per pin, allowing a stack to transfer data at rates greater than 1.2 terabytes per second (TBps). To enable taller stacks and better thermal behavior, Samsung has refined its non-conductive film (NCF) technology, eliminating gaps between chip layers and improving thermal conductivity.
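As a quick sanity check (not from Samsung's materials), the quoted per-stack figure follows directly from the per-pin rate and the 1024-bit interface that HBM generations have used per stack; the short Python sketch below assumes that bus width:

```python
# Back-of-the-envelope check of the quoted HBM3E bandwidth.
# Assumption: a 1024-bit (1024-pin) interface per HBM stack;
# the 9.8 Gbps per-pin rate is the figure quoted above.

def stack_bandwidth_tbps(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in terabytes per second for one stack."""
    total_gbits_per_s = pin_rate_gbps * bus_width_bits  # gigabits/s across all pins
    return total_gbits_per_s / 8 / 1000                 # bits -> bytes, GB -> TB

print(stack_bandwidth_tbps(9.8, 1024))  # ~1.25 TB/s, consistent with ">1.2 TBps"
```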
Samsung is now shipping Shinebolt samples to customers, while its 8-layer (8H) and 12-layer (12H) HBM3 products are already in mass production.
Other products shown at the event include the industry's highest-capacity 32 Gb DDR5 DRAM, the market's first 32 Gbps GDDR7, and the petabyte-scale PBSSD, which significantly expands storage capacity for server applications.
Samsung claims that GDDR7 memory will deliver 40% higher performance and consume 20% less power than the 24 Gbps, 16 Gb GDDR6 DRAM available today. The first devices will be rated at transfer speeds of up to 32 Gbps per pin, a 33% improvement over GDDR6, and will reach up to 1.5 TB/s of bandwidth on a 384-bit bus interface.
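The same arithmetic reconciles the GDDR7 numbers; this sketch uses only the figures quoted above (32 Gbps per pin, a 384-bit bus, and a 24 Gbps GDDR6 baseline), not any Samsung-published formula:

```python
# Sanity check of the quoted GDDR7 bandwidth and the ~33% per-pin gain.

def bus_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in gigabytes per second for a given bus."""
    return pin_rate_gbps * bus_width_bits / 8  # bits -> bytes

gddr7 = bus_bandwidth_gbs(32, 384)  # 1536 GB/s, i.e. ~1.5 TB/s
gddr6 = bus_bandwidth_gbs(24, 384)  # 1152 GB/s on the same bus width
print(gddr7, gddr6, (32 - 24) / 24)  # the per-pin step works out to ~33%
```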
That 20% efficiency gain is significant given how much power memory consumes on high-end GPUs. Samsung's GDDR7 DRAM is also reported to include technology tuned for high-speed workloads, along with a low-operating-voltage option designed for power-sensitive applications such as laptops.