Yanuki

Tech / Memory

NVIDIA Taps SK Hynix and Samsung for HBM4 Memory in "Vera Rubin" AI Systems

NVIDIA is set to utilize SK Hynix and Samsung HBM4 memory in its upcoming "Vera Rubin" AI systems, scheduled to ship in late summer. This decision comes as NVIDIA pushes the boundaries of memory bandwidth for AI, leaving Micron to supply LP...

Image via Investing.com

Key Insights

  • SK Hynix is slated to provide approximately 70% of the HBM4 supply for NVIDIA's VR200 NVL72 systems, with Samsung covering the remaining 30%.
  • Micron will supply LPDDR5X memory for the "Vera" CPUs, potentially up to 1.5 TB per CPU, compensating for its absence from the HBM4 supply chain.
  • NVIDIA's VR200 NVL72 system bandwidth has increased by nearly 70%, reaching 22 TB/s, driven by aggressive memory specification scaling.
  • NVIDIA is reportedly seeking faster HBM4 deliveries from Samsung amidst a global memory crunch.

In-Depth Analysis

NVIDIA's "Vera Rubin" AI systems, particularly the VR200 NVL72, represent a significant leap in computational power for next-generation AI models. The selection of SK Hynix and Samsung for HBM4 memory underscores their capabilities in meeting NVIDIA's stringent requirements for memory bandwidth. Micron, while not included in the HBM4 supply, remains a key player by providing LPDDR5X memory for the "Vera" CPUs, showcasing a diversified approach to memory solutions within the same ecosystem.

The increase in system bandwidth from an initial target of 13 TB/s to the current 22 TB/s highlights the rapid advancements in memory technology driven by AI demands. This also reflects NVIDIA's strategic push to optimize memory specifications, ensuring maximum performance for its AI platforms.
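The "nearly 70%" figure follows directly from the two bandwidth numbers quoted above. A quick sketch of the arithmetic (using only the 13 TB/s initial target and 22 TB/s current figure from the article):

```python
# Bandwidth figures for the VR200 NVL72, as quoted in the article (TB/s).
initial_tb_s = 13.0  # initial system bandwidth target
current_tb_s = 22.0  # current system bandwidth

# Percentage increase from the initial target to the current figure.
increase_pct = (current_tb_s - initial_tb_s) / initial_tb_s * 100
print(f"Bandwidth increase: {increase_pct:.1f}%")  # 69.2%, i.e. "nearly 70%"
```

This confirms the two figures in the article are internally consistent with the stated ~70% increase.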

*Actionable Takeaway:* Companies involved in AI development should closely monitor the evolving memory landscape to leverage the latest advancements in HBM and LPDDR technologies.


FAQ

Why did NVIDIA choose SK Hynix and Samsung over Micron for HBM4?

Micron's HBM4 reportedly did not qualify after the significant specification upgrade NVIDIA made to the VR200 NVL72, which demanded aggressive memory scaling.

What is the role of LPDDR5X memory in NVIDIA's Vera Rubin systems?

Micron will supply LPDDR5X memory for the "Vera" CPUs, each of which can be equipped with up to 1.5 TB of LPDDR5X, compensating for the share Micron lost in the HBM4 supply.

What is HBM4?

HBM4 is the sixth generation of High Bandwidth Memory, designed to provide significantly faster data access for demanding applications like AI and high-performance computing.

Takeaways

  • NVIDIA's selection of SK Hynix and Samsung for HBM4 in its Vera Rubin systems highlights the critical role of memory technology in AI advancements.
  • Micron's focus on LPDDR5X for Vera CPUs demonstrates a strategic diversification in the memory market.
  • The increasing demands for memory bandwidth in AI applications are driving rapid innovation and competition among memory manufacturers.

Discussion

Do you think this trend of increasing memory bandwidth demands will continue to drive innovation in the memory market? Share your thoughts below!



Disclaimer

This article was compiled by Yanuki using publicly available data and trending information. The content may summarize or reference third-party sources that have not been independently verified. While we aim to provide timely and accurate insights, the information presented may be incomplete or outdated.

All content is provided for general informational purposes only and does not constitute financial, legal, or professional advice. Yanuki makes no representations or warranties regarding the reliability or completeness of the information.

This article may include links to external sources for further context. These links are provided for convenience only and do not imply endorsement.

Always do your own research (DYOR) before making any decisions based on the information presented.