
Samsung Announced New AI Memory Chip With The “Highest Capacity To Date”

On Tuesday, Samsung Electronics announced what it described as the industry’s “highest-capacity to date” high-bandwidth memory (HBM) chip.

The HBM3E 12H “raises both performance and capacity by more than 50%,” according to the South Korean chip manufacturer.
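A quick back-of-envelope calculation shows where the capacity side of that figure comes from. The sketch below assumes 24-gigabit (3GB) DRAM dies per layer — an assumption consistent with the 36GB capacity cited for the 12-layer part later in this article, not a figure stated in the announcement — and compares an 8-layer stack with a 12-layer one.

```python
# Back-of-envelope HBM capacity comparison.
# Assumption: 24-gigabit (3 GB) DRAM dies per layer, consistent with the
# 36 GB figure cited for the 12-layer HBM3E later in this article.
GB_PER_DIE = 3

capacity_8h = 8 * GB_PER_DIE    # 8-layer stack  -> 24 GB
capacity_12h = 12 * GB_PER_DIE  # 12-layer stack -> 36 GB

gain = (capacity_12h / capacity_8h - 1) * 100
print(f"8H: {capacity_8h} GB, 12H: {capacity_12h} GB (+{gain:.0f}% capacity)")
```

Stacking four extra dies of the same density accounts for the capacity increase; the performance gain comes from the memory interface and cannot be derived from the die count alone.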

The new HBM3E 12H is designed to meet the growing demand for higher-capacity HBM from AI service providers, said Yongcheol Bae, executive vice president of memory product planning at Samsung Electronics.


“This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era,” Bae stated.

Samsung is the world’s leading maker of dynamic RAM (DRAM) chips, the memory used in smartphones and laptops. Generative AI models such as OpenAI’s ChatGPT require large numbers of high-performance memory chips; that memory lets the models remember details from past conversations and user preferences, helping them generate humanlike responses.

The AI boom is propelling chipmakers. U.S. chip designer Nvidia, whose graphics processing units are used by the thousands to train and run models such as ChatGPT, reported a 265% jump in fourth fiscal-quarter revenue.

On a call with analysts, Nvidia CEO Jensen Huang addressed concerns about whether the company can sustain that level of growth or sales throughout the year.


Samsung expects the HBM3E 12H to be an optimal memory solution for future systems that will need ever more memory as AI applications grow. The company said the chip’s higher performance and capacity will let customers manage their resources more flexibly and reduce the total cost of ownership of their data centres.

Samsung said it has begun sampling the HBM3E 12H to customers and plans to start mass production in the first half of 2024. One analyst predicted a “very strong” increase in Samsung’s profits.

SK Kim, executive director at Daiwa Securities, told CNBC: “I assume the news will be positive for Samsung’s share price.”

Last year, Samsung lagged behind SK Hynix in supplying HBM3 to Nvidia, and a day earlier Micron said it had begun mass-producing its 24GB 8-layer HBM3E. Kim expects Samsung’s higher-density 12-layer HBM3E product (36GB) to help it secure leadership in high-capacity HBM for Nvidia this year.

In September, the Korea Economic Daily, citing unnamed industry sources, reported that Samsung had signed an agreement to supply Nvidia with its HBM3 chips.

That report also said SK Hynix, South Korea’s second-largest memory chipmaker, was leading the high-performance memory chip race and had until then been the sole supplier of mass-produced HBM3 chips to Nvidia.


Samsung says its advanced thermal compression non-conductive film (TC NCF) lets the 12-layer HBM3E 12H meet current HBM package height requirements while standing no taller than 8-layer products. The result is a denser, higher-capacity chip in the same footprint, without sacrificing performance.

Samsung said it has continued to thin the NCF material, achieving the industry’s smallest gap between chips “at seven micrometres (µm)” and eliminating voids between layers. “When compared to its HBM3 8H product, these efforts result in an enhanced vertical density of over 20%,” the company said.
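To see why thinning the bonding film matters, consider the height budget. The sketch below assumes, per the packaging constraint described above, that a 12-layer stack must fit in the same overall height as an 8-layer one; the total height used is an illustrative placeholder, not a Samsung specification.

```python
# Illustrative height-budget arithmetic for fitting 12 dies into the height of 8.
# The total stack height below is a hypothetical placeholder, not a Samsung figure.
LAYERS_8H = 8
LAYERS_12H = 12
STACK_HEIGHT_UM = 720.0  # assumed fixed package height budget, in micrometres

pitch_8h = STACK_HEIGHT_UM / LAYERS_8H    # per-layer budget (die + film) in an 8H stack
pitch_12h = STACK_HEIGHT_UM / LAYERS_12H  # per-layer budget (die + film) in a 12H stack

shrink = (1 - pitch_12h / pitch_8h) * 100
print(f"Per-layer budget: 8H = {pitch_8h:.0f} um, 12H = {pitch_12h:.0f} um "
      f"({shrink:.0f}% thinner per layer)")
```

Whatever the real dimensions, each die-plus-film layer in a 12-high stack gets roughly a third less vertical room than in an 8-high stack of the same height, which is why Samsung emphasises the thinner NCF and the 7µm chip-to-chip gap.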

Editorial Staff
Editorial Staff at AI Surge is a dedicated team of experts led by Paul Robins, boasting a combined experience of over 7 years in Computer Science, AI, emerging technologies, and online publishing. Our commitment is to bring you authoritative insights into the forefront of artificial intelligence.