Samsung Electronics on Wednesday introduced a high-bandwidth memory chip with built-in AI processing power. The new processing-in-memory (PIM) architecture integrates AI engines directly into the company’s HBM2 Aquabolt memory.
According to Samsung, the new chip, called HBM-PIM, delivers twice the AI system performance while consuming 70% less power than conventional HBM2.
The Korean giant says this is possible because its designers placed an AI engine inside each memory bank, boosting parallel processing and minimizing data movement. Samsung expects the new chip to improve efficiency in data centers, high-performance computing systems, and AI-enabled mobile applications.
To adopt HBM-PIM, customers need no hardware or software changes, since the chip uses the same HBM interface as the previous generation.
Although Samsung has announced HBM-PIM, the chip is still in testing with the company’s customers. A paper detailing the chip will be presented at the virtual International Solid-State Circuits Conference next week.