Nvidia's moat is its software and ecosystem

Assif Shameen • 10 min read
Competition is coming and Nvidia will increasingly have a smaller piece of an expanding pie. / Photo: Bloomberg

Global markets have lately revolved around just one name: Nvidia Corp. “As goes Nvidia, so go the markets” has become the common maxim. On Feb 22, the graphics processing chip behemoth reported its earnings for the fourth quarter ended January 2024. Another blowout report and a hefty forward guidance raise were not entirely unexpected. Indeed, the concern was whether it could live up to investors’ sky-high expectations. Nvidia delivered more than anyone was anticipating.

The market had not priced in the magnitude of the “beat and raise” by the firm, which has become the icon of the ongoing artificial intelligence, or AI, boom. Revenue was up 265% year-on-year to US$22.1 billion ($29.7 billion); sales to data centres, including chips used in generative-AI computing, surged a whopping 409% to US$18.4 billion; and adjusted earnings per share rose 486% to US$5.16. Gross margins topped 76%. Despite resetting expectations higher every quarter for a year now, Nvidia has found a way to beat them by a wide margin. For the current quarter, it reset the bar still higher, guiding earnings 10% above consensus estimates.

Nvidia’s stock surged over 16% the next day, and its market value crossed the US$2 trillion mark and kept soaring for days before taking a breather. Almost every investor seems to be chasing the hottest stock on earth. Even those who believe Nvidia shares have gotten ahead of themselves are trying to find ways to ride the AI wave by buying other AI-related stocks. “Accelerated computing and generative AI have hit the tipping point,” Jensen Huang, Nvidia’s CEO, said at the earnings call. “Demand is surging worldwide across companies, industries and nations.” Wait times are expected to be long when the chip firm introduces its new HGX H200 chip later this year. 

Detractors say Nvidia’s shares are way overvalued and that the AI bubble is about to pop. They compare the recent market rally to the dotcom mania of the late 1990s. Moreover, they allege that investors have thrown caution to the wind at a time when America is on the precipice of a recession and the US Federal Reserve is unlikely to cut interest rates anytime soon, which in turn will pull the rug out from under the market. 

Semiconductor chips are a highly cyclical business. Chipmakers swing from feast to famine and back again. While demand for Nvidia’s AI chips currently far exceeds supply, the concern is that eventually supply will catch up with demand, leading to falling chip prices, margin compression and lower profits at Nvidia.

Not the 2000 Tech Bubble
For their part, the bulls say the AI boom has only just begun. The dotcom boom started in 1995 but the bubble did not burst until March 2000. Parallels have been drawn between Nvidia and the darling of the dotcom era, networking gear maker Cisco Systems. Like Nvidia, Cisco was seen as providing “picks and shovels” for the internet boom. Cisco’s market capitalisation peaked at US$546 billion as its stock traded at 130 times forward earnings. As the bubble burgeoned, Cisco bought over 75 companies, most of them through share swaps using its overvalued stock as currency. 

Yet, Nvidia is no Cisco. It has made just one major acquisition — networking firm Mellanox Technologies — over the last decade. It paid cash rather than issuing a ton of shares. Indeed, Nvidia is one of the cheaper large-cap tech stocks despite having surged 71% over the past nine weeks and 253% over the past year. Even as its shares have soared, its earnings have risen faster. It trades at 33 times this year’s forecast earnings. In comparison, electric vehicle pioneer Tesla trades at 63.2 times estimated earnings, e-commerce firm Amazon.com trades at 43 times this year’s earnings, and software giant Microsoft trades at 34.1 times earnings. None of them can match Nvidia’s pace of growth or its huge gross margins. 

Any comparison with the dotcom bubble 24 years ago is wide of the mark. Just before the early 2000 tech wreck, the US benchmark S&P 500 index was trading at over 32 times forward earnings, and earnings estimates were generally flat or being revised downwards. The S&P 500 is now at just under 23 times this year’s earnings. Analysts are currently revising earnings upwards for US firms across a range of sectors. Moreover, just as the dotcom bubble was inflating in 1999, the Fed was busy raising interest rates. Now it is on the verge of cutting them.

Unlike other speculative tech booms, such as blockchain seven years ago or 3D printing 15 years ago, AI is the real thing. “When we had the internet bubble the first time around, that was hype,” Jamie Dimon, CEO of the world’s largest banking group JPMorgan Chase, said when asked about the growing AI frenzy. “This is not hype. It’s real.” Microsoft, Google, Meta Platforms and Apple are ordering AI chips and deploying them right away for training and inference. Meta has ordered 350,000 AI chips worth over US$10 billion from Nvidia.

Here are some of the ways AI is already helping companies. At Bank of New York Mellon, it takes over mundane or repetitive tasks: the bank’s research analysts, who used to get up at 4am to write reports, now wake up at 6am because generative AI gives them a rough draft to start with, along with the data they need to deliver their research to clients before they arrive at their desks at 8am. Consumer goods firm Procter & Gamble, which makes Crest toothpaste and Gillette razors, uses AI to optimise truck scheduling, minimise idle time for drivers and manage its supply chain more efficiently, resulting in US$200 million of savings. Logistics firm CH Robinson Worldwide is using generative AI to translate the structured and unstructured emails, PDFs and Excel files it receives from customers and vendors into actual orders on its system.

The key trend to watch is how quickly AI players move from training to inference. When OpenAI unveiled its chatbot ChatGPT in November 2022, AI chips were mostly being used to train models on data. Increasingly, a growing share of chips is being used for inference, a bigger and more lucrative market than training. Confused by all that jargon? Let me explain. In training, a model learns patterns and builds its intelligence from data. In inference, that newly acquired knowledge is applied to real-world scenarios to make predictions and decisions in the post-learning phase.
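To make the distinction concrete, here is a minimal sketch in Python, using the scikit-learn library purely as an illustration rather than anything Nvidia- or OpenAI-specific. The expensive fit step is training; the predict call on fresh data is inference, the part that can be sold repeatedly as a service.

```python
# A toy illustration of training versus inference (not an actual AI-chip workload).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Training: the model learns patterns from historical, labelled data.
# This is the heavy, upfront compute step that is a pure cost.
X_train = np.random.rand(1_000, 4)                 # 1,000 past examples, 4 features each
y_train = (X_train.sum(axis=1) > 2).astype(int)    # labels the model must learn to predict
model = LogisticRegression().fit(X_train, y_train)

# Inference: the trained model is applied to new, unseen inputs.
# This runs every time a customer uses the service, so it scales with demand.
X_new = np.random.rand(5, 4)
print(model.predict(X_new))
```

Training happens once, or periodically; inference happens every time the service is used, which is why it is the larger recurring market.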

Training is just an expense for the likes of Microsoft and Meta. Inference allows them to generate income by delivering a service. OpenAI’s new GPT-5, which has a deeper grasp of language’s context, subtleties and emotions, will allow Microsoft to recoup some of the billions it has invested in AI so far. Nvidia chips have a near monopoly in training. In inference, the company faces competition from rivals like Qualcomm and Intel. Nvidia now says up to 40% of new orders for its AI chips are for use in inference.

But the high cost of AI chips, Nvidia’s near monopoly and its huge gross margins are forcing larger users to look at alternatives and luring new competitors. Lisa Su, CEO of rival chipmaker Advanced Micro Devices, estimates that AI chip demand could balloon to US$400 billion by 2027. Her firm recently began ramping up its own MI300 AI chip but is unlikely to pose a serious challenge to Nvidia anytime soon. Intel and others are also readying competing chips. Large users like Microsoft and Google are designing their own AI chips to slash their reliance on Nvidia, as well as the huge premium they pay for its graphics chips.

Nvidia’s edge lies not just in its ability to design best-of-breed AI chips, but also in the software that runs those chips and the systems it builds for accelerated computing. Nvidia’s increasingly sticky ecosystem locks in customers by combining its chips, software tools and services. Its software is not easy to learn, and switching away from it can be a pain for companies.

Think of Nvidia’s ecosystem the way you think of iPhone maker Apple’s. Anyone can make a smartphone. Apple not only designs the iPhone but also makes the operating system, the apps, and an array of services spanning video games, music, movies and TV shows on Apple TV+, cloud storage, fitness and news. The integrated ecosystem makes the iPhone work seamlessly with Apple Watches, iPads, AirPods and MacBooks, as well as with those apps and services. Once you check into the Apple system, you will probably never want to leave.

“Computing eras have been dominated by ecosystems like IBM in mainframes or Apple in smartphones,” says Jefferies analyst Vedvati Shrotre. “Nvidia’s domination of the AI ecosystem has led to a virtuous circle of software developers and platform suppliers embracing it.”

Competitive moats
Nvidia’s software, combined with network effects, gives it a huge moat. Large users such as Google and Amazon may be able to design sophisticated AI chips themselves but will soon find that switching costs are prohibitive. Big Tech customers build on Nvidia’s proprietary AI software tools such as CUDA, a parallel computing platform that helps boost the capabilities of its graphics processing units, or GPUs. Though Microsoft and Google are formidable software players, Nvidia’s software edge cannot be easily overcome. Even if Microsoft designs its own chip, it would be too hard to switch to that chip, adopt another set of software and AI tools, and retrain a whole bunch of AI engineers to use them. Big Tech players will use their in-house AI chips mostly for workloads where they are not currently deploying Nvidia’s chips and software.
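To give a flavour of that lock-in, here is a minimal Python sketch that programs an Nvidia GPU through CUDA via the numba library. It is only an illustration of the kind of dependency involved, not Nvidia’s or any customer’s actual code: the tiny kernel below runs only on CUDA-capable Nvidia hardware, and a production codebase contains thousands of such kernels that would have to be rewritten and revalidated with different tools to run on AMD or in-house silicon.

```python
# A toy CUDA kernel written with Numba: code like this assumes an Nvidia GPU
# and the CUDA toolkit, which is where the switching cost comes from.
import numpy as np
from numba import cuda

@cuda.jit
def scale(out, x, factor):
    i = cuda.grid(1)              # global index of this GPU thread
    if i < x.size:
        out[i] = x[i] * factor    # each thread processes one element

x = np.arange(1_000_000, dtype=np.float32)

d_x = cuda.to_device(x)                  # copy input to GPU memory
d_out = cuda.device_array_like(d_x)      # allocate output on the GPU

threads_per_block = 256
blocks = (x.size + threads_per_block - 1) // threads_per_block
scale[blocks, threads_per_block](d_out, d_x, 2.0)   # launch on the CUDA device

out = d_out.copy_to_host()               # copy the result back to the CPU
print(out[:5])
```

Running the same workload on non-Nvidia hardware would require a different programming model and toolchain, which is why the moat is as much about software as about silicon.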

There are also compatibility issues, particularly if one part of an AI cloud server uses Nvidia chips and software tools and another uses AMD chips and perhaps Microsoft or Amazon software. Imagine the time, effort and money that would be spent to integrate all of that and make it work in harmony. Nvidia also benefits from network effects: the more companies and software engineers use its products, the better its products get. And the better Nvidia’s chips get, the harder it becomes for competitors to take a big chunk of market share.

That said, competition is coming, and Nvidia will increasingly have a smaller piece of an expanding pie. The chip giant booked US$18.4 billion in data centre revenue last quarter, an annual run rate of roughly US$74 billion. If the market for data-centre AI chips grows to US$400 billion by 2027, as industry insiders expect, and Nvidia’s share falls to 65% from over 90%, that would still mean US$260 billion in revenue for Nvidia. Even if its share of the pie falls to just half of the total market, it would be a near-tripling of its annual chip sales. Add in software and systems, and you begin to understand why Nvidia is no run-of-the-mill cyclical chip firm but one that has a huge opportunity ahead of it with its sticky ecosystem.
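For readers who want to check the arithmetic, a few lines of Python reproduce the scenarios above; the US$400 billion market size and the share figures are the assumptions already quoted in this article, not new forecasts.

```python
# Back-of-envelope check of the market-share scenarios described above.
quarterly_dc_revenue = 18.4                 # US$ billions, last quarter's data centre sales
annual_run_rate = quarterly_dc_revenue * 4  # roughly US$74 billion a year
market_2027 = 400                           # assumed data-centre AI chip market, US$ billions

for share in (0.90, 0.65, 0.50):
    revenue = market_2027 * share
    print(f"{share:.0%} share of US${market_2027}b -> US${revenue:.0f}b, "
          f"about {revenue / annual_run_rate:.1f}x today's run rate")
```

Even the most conservative scenario, half the market, works out to roughly 2.7 times the current run rate, which is the “near-tripling” referred to above.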

Assif Shameen is a technology and business writer based in North America
