
Tuesday, October 19, 2021

SK Hynix Announces Development of HBM3 DRAM: Up To 24 GB Capacities, 12-Hi Stacks & 819 GB/s Bandwidth

SK Hynix First To Complete HBM3 Development: Up To 24 GB In 12-Hi Stack, 819 GB/s Bandwidth

SK Hynix has announced that it has become the first in the industry to develop the next-generation High Bandwidth Memory standard, HBM3.

According to SK Hynix, the company has successfully developed its HBM3 DRAM, the next generation of its High Bandwidth Memory products. The new memory standard not only improves bandwidth but also increases capacity by stacking multiple DRAM dies vertically.

SK Hynix began developing its HBM3 DRAM after starting mass production of HBM2E memory in July last year. The company announced today that its HBM3 DRAM will be available in two capacities: a 24 GB variant, which will be the industry's largest capacity for this type of DRAM, and a 16 GB variant. The 24 GB variant will feature a 12-Hi stack composed of 2 GB DRAM dies, while the 16 GB variant will use an 8-Hi stack. The company also notes that the height of the DRAM dies has been shrunk to roughly 30 micrometers (μm, 10⁻⁶ m).
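For anyone who wants to sanity-check the stack arithmetic, here is a minimal sketch in Python using the 2 GB per-die figure quoted above; the function name is just illustrative.

```python
# Back-of-the-envelope check of the announced HBM3 stack capacities,
# based on the 2 GB-per-DRAM-die figure quoted above.
DIE_CAPACITY_GB = 2  # capacity of a single DRAM die in the stack


def stack_capacity(num_dies: int, die_capacity_gb: int = DIE_CAPACITY_GB) -> int:
    """Total capacity of one HBM stack built from `num_dies` dies."""
    return num_dies * die_capacity_gb


print(stack_capacity(12))  # 12-Hi stack -> 24 GB
print(stack_capacity(8))   # 8-Hi stack  -> 16 GB
```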

“Since its launch of the world’s first HBM DRAM, SK hynix has succeeded in developing the industry’s first HBM3 after leading the HBM2E market,” said Seon-yong Cha, Executive Vice President in charge of the DRAM development. “We will continue our efforts to solidify our leadership in the premium memory market and help boost the values of our customers by providing products that are in line with the ESG management standards.”

via SK Hynix

As for performance, the SK Hynix HBM3 DRAM is expected to deliver 819 GB/s of bandwidth per stack, an improvement of roughly 78% over SK Hynix's HBM2E DRAM, which delivers 460 GB/s. A GPU like the NVIDIA A100, which features six HBM2E stacks, would be able to deliver close to 5 TB/s of bandwidth versus the 2.0 TB/s it currently delivers with the existing DRAM standard. Memory capacity using the 24 GB stacks should also theoretically reach up to 120 GB (with 5 of 6 stacks enabled due to yields) and 144 GB with all six stacks enabled.
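To make those figures concrete, here is a short Python sketch that reproduces the quoted bandwidth uplift and the theoretical totals for a six-stack configuration; the A100-style layout is used purely as an illustration, with all per-stack numbers taken from the article.

```python
# Reproduce the headline figures quoted above.
HBM3_BW_GBPS = 819      # per-stack bandwidth of SK Hynix HBM3
HBM2E_BW_GBPS = 460     # per-stack bandwidth of SK Hynix HBM2E
STACK_CAPACITY_GB = 24  # 12-Hi HBM3 stack

# Per-stack bandwidth improvement over HBM2E (~78%)
uplift = (HBM3_BW_GBPS / HBM2E_BW_GBPS - 1) * 100
print(f"Per-stack uplift: {uplift:.0f}%")

# A GPU with six HBM3 stacks (A100-style layout, illustrative)
stacks = 6
print(f"Aggregate bandwidth: {stacks * HBM3_BW_GBPS / 1000:.1f} TB/s")     # ~4.9 TB/s
print(f"Capacity, 5 of 6 stacks enabled: {5 * STACK_CAPACITY_GB} GB")      # 120 GB
print(f"Capacity, all 6 stacks enabled: {stacks * STACK_CAPACITY_GB} GB")  # 144 GB
```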

The new memory type is expected to be adopted by high-performance data centers and machine learning platforms in the coming year. Just recently, Synopsys also announced HBM3 IP and verification solutions aimed at accelerating multi-die designs.

The post SK Hynix Announces Development of HBM3 DRAM: Up To 24 GB Capacities, 12-Hi Stacks & 819 GB/s Bandwidth by Hassan Mujtaba appeared first on Wccftech.



source https://wccftech.com/sk-hynix-announces-development-of-hbm3-dram-up-to-24-gb-capacities-12-hi-stacks-819-gb-s-bandwidth/