During its GTC Spring 2023 keynote, Nvidia announced a new dual-GPU product, the H100 NVL. This isn't going to bring back SLI or multi-GPU gaming, and it won't rank among the best graphics cards for games; instead, it's aimed at the growing AI market. According to Nvidia's information and images, the H100 NVL (H100 NVLink) will have three NVLink connectors on the top, with the two adjacent cards slotting into separate PCIe slots.
Although the H100 PCIe already offered the same three NVLink connectors, the H100 NVL introduces some additional changes and will only be offered as a paired-card solution. It's an intriguing change of pace, with a focus on inference performance rather than training, perhaps to accommodate servers that don't support Nvidia's SXM option. There are some other obvious differences as well, and the NVLink connections should help make up for some of the bandwidth that NVSwitch provides on the SXM solutions.
Previous H100 SXM and PCIe solutions included 80GB of memory (HBM3 for SXM, HBM2e for PCIe), but the actual package includes six 16GB stacks, for 96GB in total. It's unclear whether one stack is completely disabled, or whether it's used for ECC or something else.
What we do know is that the Nvidia H100 NVL will have 94GB per GPU and a total of 188GB HBM3.
We assume the "missing" 2GB per GPU is for ECC or has something to do with yields, though the latter seems a little strange. Power is slightly higher than the H100 PCIe, at a configurable 350-400 watts per GPU, a 50W increase at the top end. Meanwhile, combined performance across the pair is effectively double that of a single H100 SXM: 134 teraflops of FP64, 1,979 teraflops of TF32, and 7,916 teraflops of FP8 (as well as 7,916 teraops of INT8).
Essentially, this appears to be the same core design as the H100 PCIe, which also supports NVLink, but with more GPU cores enabled and 17.5% more memory per GPU. Thanks to the switch to HBM3, memory bandwidth is also significantly higher than on the H100 PCIe: the H100 NVL delivers 3.9 TB/s per GPU and 7.8 TB/s combined.
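The memory and bandwidth figures above can be sanity-checked with a little arithmetic. The following sketch just replays the numbers reported by Nvidia (stack count, per-stack capacity, per-GPU bandwidth); it is illustrative, not an official spec sheet:

```python
# Sanity-check the H100 NVL memory and bandwidth figures quoted above.
STACKS_PER_GPU = 6          # HBM3 stacks physically on the package
GB_PER_STACK = 16           # capacity of each stack
USABLE_GB_PER_GPU = 94      # what Nvidia quotes for the H100 NVL
H100_PCIE_GB = 80           # earlier H100 PCIe capacity, for comparison
BANDWIDTH_PER_GPU_TBS = 3.9

physical_gb = STACKS_PER_GPU * GB_PER_STACK
print(physical_gb)                                   # 96 GB on the package
print(physical_gb - USABLE_GB_PER_GPU)               # 2 GB "missing" per GPU
print(2 * USABLE_GB_PER_GPU)                         # 188 GB across the pair
print(round(100 * (USABLE_GB_PER_GPU / H100_PCIE_GB - 1), 1))  # 17.5(% more than H100 PCIe)
print(round(2 * BANDWIDTH_PER_GPU_TBS, 1))           # 7.8 TB/s combined
```

Which is where the 2GB-per-GPU discrepancy and the 17.5% figure in the text come from.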
For partner and certified systems, Nvidia only supports two to four pairs of H100 NVL cards, since this is a dual-card solution and each card occupies two slots. Pricing remains to be seen, though a single H100 PCIe can occasionally be found for about $28,000, which gives some sense of what to expect.