The AI chip behind Nvidia’s supersonic stock rally

Ian King • 5 min read

Nvidia’s H100 processor enabled a new generation of AI tools / Photo: Bloomberg

When a new gadget sets the technology world alight, it’s usually a consumer product like a smartphone or a gaming console. This year, tech watchers are fixating on an obscure computer component that most people will never even see. The H100 processor has enabled a new generation of artificial intelligence tools that promise to transform entire industries, propelling its developer Nvidia Corp. past Apple Inc. to make it the world’s second-most valuable company. It’s shown investors that the buzz around generative AI is translating into real revenue, at least for Nvidia and its most essential suppliers. Demand for the H100 is so great that some customers are having to wait as long as six months to receive it. 

1. What is Nvidia’s H100 chip?
The H100, whose name is a nod to computer science pioneer Grace Hopper, is a beefier version of a graphics processing unit that normally lives in PCs and helps video gamers get the most realistic visual experience. It includes technology that turns clusters of Nvidia chips into single units that can process vast volumes of data and make computations at high speeds. That makes it a perfect fit for the power-intensive task of training the neural networks that underpin generative AI. The company, founded in 1993, pioneered this market with investments dating back almost two decades, when it bet that the ability to do work in parallel would one day make its chips valuable in applications outside of gaming. 

2. Why is the H100 so special?
Generative AI platforms learn to complete tasks such as translating text, summarizing reports and synthesizing images by ingesting vast quantities of preexisting material. The more they see, the better they become at things like recognizing human speech or writing job cover letters. They develop through trial and error, making billions of attempts to achieve proficiency and sucking up huge amounts of computing power along the way. Nvidia says the H100 is four times faster than the chip’s predecessor, the A100, at training so-called large language models, or LLMs, and is 30 times faster at replying to user prompts. Since releasing the H100 in 2023, Nvidia has announced versions that it says are even faster — the H200 and the Blackwell B100 and B200. For companies racing to train LLMs to perform new tasks, that growing performance edge can be critical. Many of Nvidia’s chips are seen as so key to developing AI that the US government has restricted the sale of the H200 and several less capable models to China.

3. How did Nvidia become a leader in AI?
The Santa Clara, California-based company is the world leader in graphics chips, the bits of a computer that generate the images you see on the screen. The most powerful of those are built with thousands of processing cores that perform multiple simultaneous threads of computation, modelling complex 3D renderings like shadows and reflections. Nvidia’s engineers realized in the early 2000s that they could retool these graphics accelerators for other applications by dividing tasks into smaller chunks and then working on them at the same time. AI researchers discovered that their work could finally be made practical by using this type of chip.

4. Does Nvidia have any real competitors?
Nvidia now controls about 92% of the market for data centre GPUs, according to market research firm IDC. Dominant cloud computing providers such as Amazon.com Inc.’s AWS, Alphabet Inc.’s Google Cloud and Microsoft Corp.’s Azure are trying to develop their own chips, as are Nvidia’s rivals Advanced Micro Devices Inc. and Intel Corp. Those efforts haven’t made much headway in the AI accelerator market so far, and Nvidia’s growing dominance has become a concern for industry regulators.

5. How does Nvidia stay ahead of its competitors?
Nvidia has updated its offerings, including software to support the hardware, at a pace that no other firm has yet been able to match. The company has also devised various cluster systems that help its customers buy H100s in bulk and deploy them quickly. Chips like Intel’s Xeon processors are capable of more complex data crunching, but they have fewer cores and are much slower at working through the mountains of information typically used to train AI software. 


6. How do AMD and Intel compare to Nvidia?
AMD, the second-largest maker of computer graphics chips, unveiled a version of its Instinct line last year aimed at the market that Nvidia’s products dominate. At the Computex show in Taiwan in early June, AMD Chief Executive Officer Lisa Su announced that an updated version of its MI300 AI processor would go on sale in the fourth quarter and said further products will follow in 2025 and 2026, underscoring the company’s commitment to the product line. Intel is now designing chips geared for AI workloads but acknowledged that, for now, demand for data centre graphics chips is growing faster than demand for the processor units that were traditionally its strength. Nvidia’s advantage isn’t just in the performance of its hardware. The company invented something called CUDA, a language for its graphics chips that allows them to be programmed for the type of work that underpins AI programs.
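To give a flavour of what CUDA programming looks like, here is a minimal sketch (not Nvidia’s or any AI lab’s actual code) of a kernel that splits an element-wise vector addition across thousands of GPU threads — the same divide-the-work-and-run-it-in-parallel idea described in section 3. The array size and names are purely illustrative.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one element of the output array,
// so the whole job runs as many tiny tasks in parallel.
__global__ void vectorAdd(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements (illustrative size)
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) arrays.
    float *h_a = new float[n], *h_b = new float[n], *h_out = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough thread blocks to cover every element.
    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(d_a, d_b, d_out, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", h_out[0]);     // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    delete[] h_a; delete[] h_b; delete[] h_out;
    return 0;
}
```

Training a neural network involves far more elaborate arithmetic than this, but the pattern is the same: break the work into independent pieces and let thousands of cores chew through them simultaneously.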

7. What is Nvidia planning on releasing next?
The most anticipated release is the Blackwell, and Nvidia has said it expects to get “a lot” of revenue from the new product series this year. Meanwhile, demand for the H series hardware continues to grow. Chief Executive Officer Jensen Huang has acted as an ambassador for the technology and sought to entice governments, as well as private enterprise, to buy early or risk being left behind by those who embrace AI. Nvidia also knows that once customers choose its technology for their generative AI projects, it’ll have a much easier time selling them upgrades than competitors hoping to draw users away. – Bloomberg Quicktake

 
