Tuesday, March 5, 2024

'Sold out': Samsung archrival sells out of precious HBM cargo but is mum on who the biggest client was — Nvidia and AMD can't get enough high bandwidth memory chips but is there someone else?

SK Hynix, a key Samsung competitor, says it has sold out its entire 2024 production of stacked high-bandwidth memory DRAMs. These chips are crucial for AI processors in data centers, but the company remains tight-lipped about its largest clients.

SK Hynix’s recently appointed vice president, Kitae Kim, who is also the head of HBM sales and marketing, confirmed the news in an interview posted on SK Hynix's website.

“Proactively securing customer purchase volumes and negotiating more favorable conditions for our high-quality products are the basics of semiconductor sales operations,” said Kim. “With excellent products in hand, it’s a matter of speed. Our planned production volume of HBM this year has already sold out. Although 2024 has just begun, we’ve already started preparing for 2025 to stay ahead of the market.”

'Highly sought after'

As EE News Europe points out, the scarcity of HBM3 and HBM3E format chips could potentially hinder the growth of both memory and logic sectors of the semiconductor industry this year.

“HBM is a revolutionary product which has challenged the notion that semiconductor memory is only one part of an overall system. In particular, SK Hynix’s HBM has outstanding competitiveness," Kim added. 

"Our advanced technology is highly sought after by global tech companies,” he added, leaving us wondering who his company's biggest clients might be. Nvidia and AMD are known to be voracious consumers of high bandwidth memory chips, but there are other players in the highly competitive AI market who might be keen to snap up HBM stock to avoid being left high and dry.

Interestingly, while SK Hynix can’t manufacture enough of its current HBM products to keep up with the high demand, its main rivals in this space, Samsung and Micron, are now focused on HBM3E. Micron has begun manufacturing "in volume" its 24GB 8H HBM3E, which will be used in Nvidia's latest H200 Tensor Core GPUs. At the same time, Samsung has begun sampling its 36GB HBM3E 12H to customers, and this might well be the memory used in Nvidia’s B100 Blackwell AI powerhouse, which is expected to arrive by the end of this year.

SK Hynix isn’t going to be left behind for long, however. It is expected to begin manufacturing its own 36GB HBM3E in the first half of this year, following an upgrade of its Wuxi plant in China.

from TechRadar - All the latest technology news https://ift.tt/a9Ocuy7

