Samsung Electronics develops the industry’s first ‘36GB HBM3E 12H’… “Optimal revolutionary product for AI services”

Samsung Electronics HBM3E 12H DRAM product image (Photo = Samsung Electronics)

Samsung Electronics has developed the industry's first 36-gigabyte (GB) HBM3E 12-layer stacked DRAM. With it, the company plans to take the lead in the high-capacity HBM market.

Samsung Electronics (CEOs Jong-hee Han and Kyeong-hyeon Gye) announced on the 27th, “We have developed the industry’s largest-capacity ‘36GB fifth-generation HBM3E 12H’ by stacking 24Gb DRAM chips up to 12 layers using through-silicon via (TSV) technology.”
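The 36GB figure follows directly from the die arithmetic; here is a minimal check, with the die count and density taken from the announcement:

```python
# Sanity check on the capacity figure: twelve 24-gigabit (Gb) DRAM dies
# per stack, as stated in the announcement; 8 bits per byte.
GBIT_PER_DIE = 24
LAYERS = 12

capacity_gb = GBIT_PER_DIE * LAYERS / 8
print(f"Per-stack capacity: {capacity_gb:.0f} GB")  # -> 36 GB
```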

TSV is a technology that stacks DRAM chips vertically by boring hundreds of microscopic holes through them and connecting the chips with electrodes that pass through those holes.

HBM3E 12H improves on its predecessor, the fourth-generation HBM3 8H, by more than 50%, providing a bandwidth of up to 1,280GB per second and a capacity of 36GB.
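The 1,280GB/s figure is consistent with HBM's standard 1,024-bit interface; the per-pin data rates below are illustrative assumptions, not numbers from the article:

```python
# Bandwidth = per-pin data rate (Gbps) x interface width (bits) / 8.
# The 1,024-bit width is standard for HBM; the pin speeds (10 Gbps for
# HBM3E, 6.4 Gbps for HBM3) are assumed values, not from the article.
INTERFACE_BITS = 1024

bw_hbm3e = 10.0 * INTERFACE_BITS / 8   # 1,280 GB/s, matching the article
bw_hbm3 = 6.4 * INTERFACE_BITS / 8     # 819.2 GB/s

uplift = (bw_hbm3e / bw_hbm3 - 1) * 100
print(f"HBM3E {bw_hbm3e:.0f} GB/s vs HBM3 {bw_hbm3:.1f} GB/s (+{uplift:.0f}%)")
```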

Although it is a 12-layer stacked product, ‘Advanced TC NCF’ (thermal compression non-conductive film) technology was applied so that it stands at the same height as an 8-layer product. Advanced TC NCF is advantageous for high-layer stacking because it minimizes the warpage that can occur as chips become thinner.

Samsung Electronics also progressively reduced the thickness of the NCF material, achieving the industry's smallest gap between chips at 7 micrometers (µm). The company said this improved vertical density by more than 20% compared with HBM3 8H.

The newly developed HBM3E 12H is expected to be an optimal solution for companies running AI platforms at a time when data processing volumes are growing rapidly with the advancement of artificial intelligence (AI) services.

In particular, the product's increased performance and capacity reduce the number of GPUs required, allowing companies to lower total cost of ownership (TCO); the ability to manage resources flexibly is also cited as an advantage.
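As a rough sketch of that GPU-count argument: when a workload is bound by memory capacity, larger HBM stacks mean fewer accelerators. Every figure below (model footprint, stacks per GPU, the 24GB comparison stack) is a hypothetical assumption, not data from the article:

```python
import math

MODEL_FOOTPRINT_GB = 1600   # hypothetical model + activation footprint
STACKS_PER_GPU = 6          # hypothetical HBM stacks per accelerator

def gpus_needed(stack_capacity_gb: float) -> int:
    """Accelerators required when the workload is capacity-bound."""
    per_gpu_gb = stack_capacity_gb * STACKS_PER_GPU
    return math.ceil(MODEL_FOOTPRINT_GB / per_gpu_gb)

print("24GB stacks:", gpus_needed(24))  # -> 12 GPUs
print("36GB stacks:", gpus_needed(36))  # -> 8 GPUs
```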

For example, applying HBM3E 12H to a server system is expected to improve AI training speed by an average of 34% compared with HBM3 8H and, in the case of inference, to serve up to 11.5 times more AI service users.

Samsung Electronics has begun supplying samples of the 12-layer HBM3E to customers and said it will begin mass production in the first half of this year.

“Samsung Electronics is working to develop revolutionary products that meet the high-capacity solution needs of customers providing AI services,” said Bae Yong-cheol, vice president of product planning in Samsung Electronics’ memory division. “We will lead and pioneer the HBM market.”

Meanwhile, SK Hynix, the current HBM leader, developed its 8-layer HBM3E product in August of last year and announced plans to begin mass production in the first half of this year. Third-ranked Micron has also announced plans to mass-produce HBM3E, so competition among the companies is expected to intensify.

Reporter Park Chan cpark@aitimes.com
