Nvidia's Data Centre Revenue Overtakes the Entire Rest of the Company — Again
Data centre revenue up 427% year-on-year
$22.6bn in a single quarter — more than 85% of total company revenue and well ahead of consensus estimates.
Blackwell Ultra already supply-constrained before launch
Mass production begins H2 2025 but orders already exceed available supply.
Nvidia claims 90%+ share of AI training workloads
AMD's MI300X is gaining ground, but Huang says inference growth will dwarf anything seen so far.
Nvidia reported quarterly earnings on Wednesday that once again defied the expectations of analysts who thought the AI infrastructure boom was beginning to plateau. Data centre revenue hit $22.6 billion for the quarter, up 427% year-on-year and accounting for more than 85% of total company revenue. The GPU maker's stock rose 8% in after-hours trading.
The headline number masks a compositional shift that has significant implications for competitors. A growing share of Nvidia's data centre revenue now comes not from individual H100 chips but from its NVL rack-scale systems — integrated computing units that bundle GPUs, networking and cooling into a single purchasable unit. Average selling prices for these systems run into the millions of dollars per rack.
Jensen Huang, who has delivered one of the great performances in recent corporate history simply by refusing to undersell his own products, used the earnings call to preview Blackwell Ultra, the next iteration of Nvidia's AI chip architecture. Mass production is scheduled to begin in the second half of 2025, and the company says it is already supply-constrained against existing orders.
Microsoft, Google, Amazon and Meta were identified as the four largest customers, collectively accounting for roughly 40% of data centre revenue. Each is simultaneously Nvidia's customer and a potential competitor — all four are developing their own AI accelerator chips. AMD's MI300X, the most credible third-party alternative, has gained traction at several hyperscalers, but Nvidia puts its own market share in training workloads at above 90%.
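To put that concentration in dollar terms, a rough sizing from the article's own figures — note the even four-way split below is a simplifying assumption for illustration, not a disclosed breakdown:

```python
# Rough sizing of the customer concentration described above.
# Inputs: $22.6bn data centre revenue; Microsoft, Google, Amazon and
# Meta together at roughly 40% of it (per the article).
data_centre_rev = 22.6   # $bn
big_four_share = 0.40    # approximate combined share of the four largest customers

big_four_rev = data_centre_rev * big_four_share
# Naive even split across the four; the actual per-customer mix is not disclosed.
avg_per_customer = big_four_rev / 4

print(f"Big-four spend this quarter: ~${big_four_rev:.1f}bn")
print(f"Naive per-customer average: ~${avg_per_customer:.2f}bn")
```

Roughly $9bn of quarterly spend from four companies, each of which is building chips intended to reduce that very dependency.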
The gaming division, which defined Nvidia for most of its history, posted $2.9 billion — respectable in isolation but now an afterthought. The automotive segment, long discussed as a future growth driver, contributed $329 million, its strongest quarter yet but still a rounding error against the data centre numbers.
A question that has hung over Nvidia for two years is whether demand is being pulled forward by hyperscaler capex cycles, creating a cliff when the current build-out phase ends. Huang's answer, repeated with characteristic confidence, is that inference — running AI models rather than training them — will require orders of magnitude more compute than training and has barely begun to scale.