For the second time in three months, US graphics chip giant Nvidia reported blowout quarterly earnings on Aug 23. The company is thriving on demand for AI chips from buyers ranging from tech giants like Microsoft, Google and Amazon.com to start-ups and sovereign wealth funds in Saudi Arabia and the United Arab Emirates.
Everyone is trying to hoard AI chips, including to resell to smaller companies that desperately need them. Nvidia’s quarterly revenues of US$13.5 billion ($18.4 billion) were up 101% year-on-year and 88% over the previous quarter, while gross margins of 71.2% blew past consensus estimates of just 60%.
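A quick back-of-envelope check, using only the growth figures quoted above, shows what those percentages imply about Nvidia's prior-period revenues (the implied figures below are derived, not reported):

```python
# Sanity-check Nvidia's reported growth rates using the article's
# figures (US$ billions). The implied prior-period revenues are
# backed out from the stated growth percentages.
revenue = 13.5          # quarterly revenue reported Aug 23
yoy_growth = 1.01       # up 101% year-on-year
qoq_growth = 0.88       # up 88% over the previous quarter

year_ago_quarter = revenue / (1 + yoy_growth)
prior_quarter = revenue / (1 + qoq_growth)

print(f"Implied year-ago quarter: ~US${year_ago_quarter:.1f}b")  # ~US$6.7b
print(f"Implied prior quarter:    ~US${prior_quarter:.1f}b")     # ~US$7.2b
```

In other words, Nvidia added roughly US$6 billion of quarterly revenue in a single quarter.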
What’s driving the demand? Microsoft — which needs to support the ChatGPT infrastructure — has been ramping up generative AI co-pilots and grew its total capital expenditure by US$2.3 billion, or 35% sequentially, in the last quarter. Microsoft is now Nvidia’s largest customer, accounting for 22% of the chip giant’s revenues in the last quarter.
Other global hyperscalers — firms like Amazon Web Services, Google Cloud, Oracle and Meta Platforms, which offer large-scale cloud services and can rapidly scale their infrastructure to meet the growing demands of their customers and their own operations — together bought slightly more Nvidia graphics processing unit (GPU) chips than Microsoft last quarter.
For its part, Nvidia believes that over the long term, the total addressable market for AI will exceed US$600 billion: US$300 billion in chips and systems, US$150 billion in generative AI software, and US$150 billion in omniverse enterprise software that enables the development, deployment and management of advanced 3D applications.
Manuvir Das, Nvidia’s vice president of enterprise computing, told a Goldman Sachs Tech conference last week that “accelerated computing” is not just about the chips but the whole stack.
“If you think about the traditional computing systems, what has changed over the decades is simply the location — you’re in the cloud, you’re doing it on your phone, but it’s essentially the same style of computing” through a central processing unit, or CPU.
Increasingly, companies are using computing to do everything in the cloud. “More computing means you need more data centres, more energy, more horsepower, and it’s not sustainable.” What Nvidia is trying to do, Das argues, is do it better and more sustainably. He adds: “We’re saying that with accelerated computing, for the same footprint, we can do 10 times, 100 times the work, and that’s going to be the only way.” With its AI chipsets, Das said, Nvidia is helping companies “go digital and grow more efficiently in previously unimaginable ways.”
Right place, right time
Founded by chip design engineers Jensen Huang, Chris Malachowsky and Curtis Priem in 1993, Nvidia has repeatedly found itself in the right place at the right time over the past two decades. In 2000, Nvidia rightly bet that gaming chips would be the next big growth driver for the semiconductor industry. Its chips were in Microsoft’s earliest Xbox game consoles. About eight years ago, Nvidia rightly bet that crypto mining was a profitable niche for its GPUs. For the past five years, Nvidia has been investing to improve its GPUs for accelerated computing in artificial intelligence applications. Nvidia is an opportunistic chip firm that bet three times, and each time, its bet has paid off in a big way, notes Aswath Damodaran, a professor at New York University’s Stern School of Business. “First time around, you could say they were lucky, but by the time you see them doing it for the third time, you know it is by design,” he argues.
“Nvidia also had two near-death experiences since it listed in 1999, with its stock price plunging 80% in 2000 and 2001,” Damodaran notes. “But it bounced back, made new, bigger bets and eventually succeeded.” In 2018, Nvidia stock was down 40%, and between November 2021 and October 2022, the chip design firm’s shares plunged nearly 70%, only to nearly quintuple since then. Investors abandoned Nvidia, but its management knew their AI chips had incredible potential. It was only a matter of time before the world woke up to realise what Nvidia had in its hands.
Yet, the chip industry is notorious for its feast-to-famine cycles. Chip makers discover a great niche, sell a ton of chips and make tens of billions of dollars, only to see competitors flock in and make that segment of semiconductors a commodity. The next wave of chip firms is forced to move on to a new niche.
Twenty years ago, Japanese chip makers were feared in Silicon Valley. Today, Japan is a semiconductor minnow behind the US, China, Taiwan and South Korea. Intel was the world’s largest chipmaker from the early 1980s until a few years ago. Today, Nvidia is eight times Intel’s size by market value. Communications chip maker Broadcom, memory chip maker Samsung Electronics, chip foundry Taiwan Semiconductor Manufacturing Co (TSMC) and Nvidia’s closest rival Advanced Micro Devices (AMD) are all far bigger than Intel, which has been struggling for relevance.
Nvidia is thriving because it provides the tools for the AI boom. In 1999, at the height of the dot com boom, the company that reached the highest valuation wasn’t an internet firm like Yahoo or an e-commerce firm like Amazon.com but Cisco, which provided routers and networking gear for Internet service providers. It was a classic pick-and-shovel play.
When I talk to venture capitalists and tech entrepreneurs in the US, they often bring up the California gold rush that began in 1848 and lured over 300,000 people in search of gold. Most of the money wasn’t made by the miners who dug out the gold but by the merchants who sold the picks, shovels and pans used to dig the shiny metal out of the ground. Another big beneficiary was Levi Strauss, which sold jeans to the gold diggers.
Growth rate
Will the current AI spending spree end like the dot com bubble burst of 2000, when fibre optic firms like Global Crossing spent tens of billions building infrastructure to provide broadband Internet? “We don’t think so,” says Mark Lipacis, semiconductor analyst for Jefferies & Co in San Francisco.
“The growth rates of Nvidia today and the fibre optics firms in the late 1990s are similar, but what is different is the maturity of the business models,” he says. “In the late 1990s, competitive carriers put dark fibre in the ground and equipment in central offices before business models were proven: ‘build it, and they will come’. Today, many companies are generating a return using generative AI through product enhancements, productivity gains, or outright cost reductions,” he says. After the dot com bubble burst, emerging Internet firm Google bought much of that fibre, and the main beneficiary of the overbuilding was YouTube, which needed the bandwidth to serve millions of videos.
Unlike the dark fibre laid under the sea and underground in the late 1990s, much of which wasn’t used until a decade later, Nvidia and other AI chip makers intend to roll out newer, more powerful and sophisticated chips every year, which companies like Microsoft, Google, Amazon, Meta Platforms and others can use in their new generative AI applications.
You can’t leave an AI chip idle for 10 years and hope that someday, like fibre, it will be put to good use. So, as more sophisticated chips are rolled out, start-ups and tech giants will be forced to develop new use cases for them.
Chip users like Tesla, Google, Amazon and Microsoft, and chip makers AMD and Intel, are readying their own AI chips, though it might be at least 18 months before a viable alternative to Nvidia’s chips hits the market. By then, Nvidia would have rolled out two more iterations of its own AI chips.
As the world’s biggest user of AI chips, Microsoft has also emerged as a huge investor in AI chip start-ups that challenge Nvidia. Last week, the software powerhouse joined Singapore’s Temasek Holdings in a US$110 million funding round for d-Matrix, a Silicon Valley-based start-up that builds AI chips for data centres. A year ago, d-Matrix raised US$44 million.
Other companies are raising money at stratospheric valuations. Start-up cloud GPU provider CoreWeave, a large customer of Nvidia’s AI chips, is reportedly looking at a stake sale later this month that would value it at up to US$8 billion, up from the US$2.2 billion it was valued at in April. Privately held CoreWeave, which offers services to support high-performance computing, has gained significant traction amid the recent generative AI boom.
The key question for Nvidia, says Pierre Ferragu, tech hardware analyst for NewStreet Research, “is how much more GPU spending hyperscalers and other buyers can afford beyond 2024.”
Bubble bursting?
What of Nvidia’s runaway stock? Nvidia stock is up 230% this year, the chip index SOX is up 46.2% year-to-date, and the broader S&P 500 index is up 17% since January.
Are we in an AI bubble that is about to burst soon? Stacy Rasgon, semiconductor analyst for Sanford C. Bernstein & Co, sees Nvidia’s revenues surging to US$53.48 billion in the current fiscal year ending January 2024 from US$26.97 billion in the last fiscal year, accelerating to US$69.05 billion next year, and its annual free cash flow growing from US$3.8 billion last year to US$34.97 billion next year.
Rasgon has a 12-month price target of US$675 on the hot chip stock. Bernstein analysts are a bit stingy on the price target: at least one analyst, Rosenblatt Securities’ Hans Mosesmann, has a 12-month price target of US$1,100, or a 134% upside.
UBS chip analyst Timothy Arcuri forecasts Nvidia’s revenues to surge to US$97.4 billion in the next fiscal year and net earnings to top US$53.5 billion. For perspective, just three years ago, Nvidia had a mere US$16.6 billion in annual revenues and US$6.2 billion in net profits. “Everyone has been looking for ways to play AI that aren’t as expensive as Nvidia, given the run this year,” Rasgon notes.
Yet, the Bernstein analyst argues, buying Nvidia “itself remains the best way to accomplish that given the magnitude of earnings revisions.” Nvidia’s stock, he believes, “will still come out cheaper than it was” before its Aug 23 earnings announcement. That’s because analysts have been rushing to revise Nvidia’s earnings ever upward faster than the stock’s price appreciation itself in recent months.
Nvidia’s stock closed at US$470.61 on Sept 6, or just 31.6 times the current fiscal year’s consensus forecast earnings. In contrast, retailer Costco Wholesale Corp’s stock trades at 35.6 times estimated earnings and has net margins of 2.55%.
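A forward price-to-earnings multiple is simply the share price divided by consensus forward earnings per share, so the two figures quoted above let us back out what the market expects Nvidia to earn per share this fiscal year (a rough sketch; the EPS below is derived from the article's numbers, not a reported figure):

```python
# Back out the consensus forward EPS implied by the article's figures:
# forward P/E = price / forward EPS  =>  forward EPS = price / forward P/E
price = 470.61       # Nvidia close on Sept 6 (US$)
forward_pe = 31.6    # times current fiscal year's consensus earnings

implied_eps = price / forward_pe
print(f"Implied consensus EPS: ~US${implied_eps:.2f}")  # ~US$14.89
```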
Even its closest chip peer, Advanced Micro Devices (AMD), trades at 32.5 times forecast earnings. Did I mention that AMD has gross margins of around 45.5%, compared with Nvidia’s software-like margins of 71.5%?
By the way, Microsoft, the world’s largest software firm, reported gross margins of 68.9% in the last quarter. When a semiconductor hardware firm has better margins than the world’s top software firm, investors sit up and take notice.
Make no mistake: the AI revolution is almost here. After smartphones were launched in 2007, it took a few years before services like ride-hailing, food delivery and short-term home rentals like Airbnb emerged. After the advent of 5G in 2018, it took years before self-driving robotaxi services began in California. It will be a while before ubiquitous AI-linked services become apparent.
Assif Shameen is a technology and business writer based in North America