The company generated $11 billion of revenue from Blackwell in the fourth quarter, which Nvidia described as the “fastest product ramp” in its history. “Demand for Blackwell is amazing,” Chief Executive Officer Jensen Huang said in a statement. The results helped send the shares up about 2.5% in late trading.
The outlook comes at a shaky time for the AI industry. Nvidia shares have dipped this year on concerns that data center operators will slow spending. Chinese startup DeepSeek also sparked fears that chatbots can be developed on the cheap, potentially reducing the need for Nvidia’s powerful AI chips.
Against that backdrop, Nvidia signaled that growth is still strong — even if it didn’t deliver the kind of blowout results that have become its hallmark.
Sales will be about $43 billion in the fiscal first quarter, which runs through April, Nvidia said in the statement. Analysts had estimated $42.3 billion on average, with some projections ranging as high as $48 billion. Gross profit margins will be slightly below analysts’ projections.
Though the company’s fiscal fourth-quarter sales topped analysts’ estimates, they did so by the smallest margin since February 2023. Earnings, meanwhile, beat projections by the narrowest margin since November 2022, according to data compiled by Bloomberg.
The stock had been down 2.2% this year, following stratospheric gains in 2023 and 2024 that turned Nvidia into the world’s most valuable chipmaker.
Nvidia has been the biggest beneficiary of a massive surge in AI spending, doubling its revenue in each of the past two years. Many of the largest technology companies are pouring tens of billions of dollars into data center hardware, and Nvidia is the dominant seller of processors that create and run AI software.
Along the way, Nvidia and its CEO have become synonymous with the AI revolution — and the biggest bellwether for how it’s progressing. Huang has spent much of the past two years crisscrossing the world as an evangelist for AI technology and believes it’s still in the early stages of spreading throughout the economy.
Sales in the fourth quarter, which ended Jan. 26, rose to $39.3 billion. That matched estimates, though some projections ranged as high as $42 billion. Underlining just how quickly the company has grown: Its latest quarterly sales were bigger than Nvidia’s annual revenue two years ago, when it totaled $27 billion.
Profit was 89 cents a share, excluding certain items. Wall Street was looking for 84 cents.
The data center unit, by far Nvidia’s biggest source of revenue, generated sales of $35.6 billion. That beat the average estimate of $34.1 billion. Gaming-related sales — once Nvidia’s core business — amounted to $2.5 billion. Analysts projected $3.02 billion on average. Automotive was $570 million.
Heading into the earnings report, analysts had expressed concern about near-term growth in Nvidia’s biggest business, which serves data center customers. The big question was whether supply constraints and a shift to the company’s latest design, Blackwell, would slow growth. The new technology is more sophisticated, bringing manufacturing challenges.
DeepSeek added to the worries after releasing a powerful AI model that it said required far fewer resources to create. The announcement in late January led to a widespread selloff in AI-related shares. Nvidia shed a staggering $589 billion of market value in one day of trading, a record for the markets.
But key Nvidia customers, such as Microsoft Corp., have maintained their capital expenditure plans, suggesting that the AI spending surge will remain strong.
Nvidia has missed analysts’ estimates for quarterly revenue only once in the past five years. And it has exceeded expectations by more than 10% in recent periods, creating a high bar for its performance.
Its data center division alone now has more revenue than rivals Intel Corp. and Advanced Micro Devices combined.
Nvidia made its name by selling graphics processors, but discovered that the technology also has applications for AI. Its chips power the training process, in which software models learn to recognize and respond to real-world inputs. Nvidia’s components are also used in systems that then run the software, a stage known as inference, and help power services such as ChatGPT.