Market Watch

Nvidia Data Center Revenue Soars 154% Amid AI Demand Surge


Nvidia has reported a 154% year-on-year increase in revenue for its data center business in its fiscal 2025 second quarter. The segment generated $26.3 billion, lifting total quarterly revenue to a record $30 billion, up 15% from the first quarter and 122% from the same period a year earlier.

Jensen Huang, founder and CEO of Nvidia, attributed this success to the widespread shift among data center operators towards accelerated computing. “Nvidia achieved record revenues as global data centers are in full throttle to modernise the entire computing stack with accelerated computing and generative AI,” he stated.

The financial results highlighted strong demand for Nvidia’s Hopper graphics processing unit (GPU) computing platform, which is being used extensively for training and inference of large language models, recommendation engines and generative AI (GenAI) applications. Cloud service providers accounted for roughly 45% of data center revenue, while consumer internet and enterprise companies made up about 50% and drove the sequential growth. Networking products also saw a 16% increase in revenue.

During an earnings call, Huang explained the need for a transition from general-purpose computing to accelerated computing. He emphasized that CPU scaling was slowing down while computing demand continued to grow significantly, potentially doubling each year. This imbalance necessitates a new approach to avoid escalating computing costs and energy consumption in data centers.

Over the past year, Huang has been advocating for the benefits of accelerated computing powered by data center GPUs. He pointed out, “It’s not unusual to see someone save 90% of their computing cost. And the reason for that is, of course, you just sped up an application 50 times, [so] you would expect the computing cost to decline quite significantly.”
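The arithmetic behind that claim is straightforward if computing cost is assumed to scale with runtime. The rough sketch below illustrates it; the 5x hourly price premium for accelerated hardware is a hypothetical assumption chosen for illustration, not a figure from Nvidia.

```python
# Illustrative back-of-envelope math (assumed figures, not Nvidia's):
# if cost scales with runtime, a large speedup can outweigh a higher
# hourly price for accelerated hardware.

def cost_saving(speedup: float, hw_price_premium: float = 5.0) -> float:
    """Fractional cost saving for a job sped up by `speedup` on hardware
    assumed to cost `hw_price_premium` times as much per hour."""
    baseline_cost = 1.0                             # normalized cost on CPUs
    accelerated_cost = hw_price_premium / speedup   # same job, shorter runtime
    return 1.0 - accelerated_cost / baseline_cost

# A 50x speedup on hardware assumed to cost 5x more per hour
print(f"{cost_saving(50):.0%} saved")  # -> 90% saved
```

Under those assumptions, a 50x speedup yields the roughly 90% cost reduction Huang describes; the actual saving depends on the real price gap between CPU and GPU capacity and on how fully the accelerators are utilized.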

Huang envisions a future where every single data center incorporates GPU technology. “The world builds about $1 trillion worth of data centers – $1 trillion worth of data centers in a few years will be all accelerated computing.”

The strategic direction aims to increase computational power in a sustainable manner while reducing costs, avoiding what Huang refers to as “computing inflation.” He also highlighted that liquid-cooled data centers can deliver three to five times the AI throughput of traditionally air-cooled facilities. According to Huang, “Liquid cooling is cheaper and allows you to have the benefit of this capability we call NVLink, which allows us to expand it to 72 Grace Blackwell packages, which has essentially 144 GPUs.”
