Supply chain challenges risk delaying Nvidia’s Rubin GPUs

Nvidia’s next-generation Rubin GPUs could end up shipping later and in lower volumes than expected due to supply chain issues, TrendForce warned on Wednesday.

TrendForce now expects Rubin to account for 22% of Nvidia’s high-end GPU shipments in 2026, down from its previous forecast of 29%.

TrendForce cited several factors contributing to the delays: the time needed to validate the new HBM4 memory used by the chips, challenges migrating to Nvidia’s faster ConnectX-9 network cards, higher overall system power consumption, and more demanding liquid-cooling requirements.

Shipments of Nvidia’s Hopper GPUs, including H200s destined for the Chinese market, are also expected to be lower than initially forecast due to ongoing geopolitical issues between the United States and China.

In December, the Trump administration said it would allow exceptions to previous U.S. export rules governing sales of high-end AI accelerators to China, with formal approval following in January. The move meant Nvidia could sell its older, but still powerful, H200 accelerators to Chinese customers for the first time. In exchange, Nvidia would hand over more than a quarter of the revenue from those sales to Uncle Sam.

Despite this, it took months to convince Beijing to sign off on the agreement. At GTC last month, CEO Jensen Huang revealed that Nvidia was restarting H200 production for the Chinese market and that it had purchase orders in progress.

TrendForce now expects Hopper accelerators to account for about 7% of Nvidia’s GPU shipments this year, down from its previous forecast of 10%.

While shipments of Rubin and Hopper parts are expected to fall short of initial projections, TrendForce says Blackwell GPUs, like the GB300 and B300, will likely fill the gap.

Analysts now predict that Blackwell shipments will account for 71% of Nvidia GPUs sold this year.

Finally, TrendForce is quite optimistic about the demand for Nvidia’s recently announced Groq LPUs, which we explored in depth here. These chips do not rely on conventional DRAM and are designed to work with GPUs like Rubin to accelerate the token-generating decoding phase of the inference pipeline.

However, due to their limited on-chip SRAM, large quantities are required for this purpose. Thus, TrendForce forecasts demand of the order of “several hundred thousand units” this year, and approximately double that in 2027.
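To see why "large quantities" follows from an SRAM-only design, here's a rough back-of-the-envelope sketch. The per-chip capacity and model size below are our assumptions for illustration (roughly 230 MB is Groq's published figure for its first-generation chip), not numbers from TrendForce:

```python
import math

# Back-of-the-envelope: why SRAM-only accelerators get deployed in bulk.
# Assumptions (ours, not TrendForce's): ~230 MB of on-chip SRAM per LPU
# (Groq's published first-gen figure) and a 70B-parameter model held
# entirely on-chip at one byte per weight (8-bit quantization).

SRAM_PER_CHIP_GB = 0.230   # assumed per-chip SRAM capacity
MODEL_SIZE_GB = 70 * 1     # 70B parameters x 1 byte per weight

chips_needed = math.ceil(MODEL_SIZE_GB / SRAM_PER_CHIP_GB)
print(f"~{chips_needed} chips just to hold the weights")
# → ~305 chips just to hold the weights
```

Multiply that across many concurrent model instances and deployments, and forecasts measured in the hundreds of thousands of units start to look less outlandish.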

On a related note, TrendForce also warned this week that consumer DRAM prices could rise another 45-50% in the second quarter. This is on top of the 75-80% price increase we saw in the first quarter.

Memory prices have risen sharply in recent months, and products such as DDR5 modules and SSDs now sell for more than triple what they cost at retail this time last year.
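For a sense of scale, successive quarterly rises compound multiplicatively. A quick sketch using only the ranges TrendForce reported:

```python
# Back-of-the-envelope: successive quarterly price increases compound
# multiplicatively. Figures are the ranges reported by TrendForce:
# a 75-80% rise in Q1 followed by a 45-50% rise in Q2.

def compound(*pct_increases):
    """Turn a series of percentage increases into one cumulative factor."""
    factor = 1.0
    for pct in pct_increases:
        factor *= 1 + pct / 100
    return factor

low = compound(75, 45)   # conservative ends of both ranges
high = compound(80, 50)  # aggressive ends of both ranges

print(f"Cumulative H1 multiplier: {low:.2f}x to {high:.2f}x")
# → Cumulative H1 multiplier: 2.54x to 2.70x
```

A roughly 2.5x to 2.7x jump in half a year goes some way toward explaining why retail prices have more than tripled year over year.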

As we have previously reported, the demand for AI infrastructure, combined with the highly cyclical nature of memory markets, is largely responsible for the sky-high prices.

We’ve reached out to Nvidia for comment on potential delays to its Rubin lineup; we’ll let you know if we have any news. ®