Ice cream and the cooling of the AI frenzy

Assif Shameen • 10 min read
Data centres are notorious energy guzzlers / Photo: Bloomberg

Ice cream is a terrible business. Just ask the frustrated CEOs of the world's biggest ice cream sellers: Switzerland's Nestle, which owns Movenpick and Magnolia and holds the North American rights to Haagen-Dazs; France's Danone; General Mills, which owns the rights to Haagen-Dazs in Europe, Asia and the Middle East; London-based Unilever, which owns Ben & Jerry's, Breyers and Klondike; and Inspire Brands, which owns Baskin-Robbins. They all see ice cream as an albatross around their necks.

Global annual sales of ice cream rose just 4% in US dollar terms last year, and only because prices went up; in volume terms, growth has been sliding for years. The biggest consumers of ice cream are not kids but overweight or obese adults who gorge on salty snacks, sugary drinks and ice cream. The bad news for the Nestles, Unilevers and Danones of the world is not that fewer people are licking ice cream cones but that GLP-1 obesity drugs are weighing on the business. Morgan Stanley, in a recent report, cites ice cream as the top item that users of obesity drugs cut from their grocery shopping. As the popularity of GLP-1 drugs soars, a meltdown is looming for ice cream.

Rising refrigeration costs
Ice cream makers could live with a 20% decline in volumes over the next few years if only they could keep raising prices. Here is the problem: ice cream is a low-margin business in which costs are rising faster than Unilever or Nestle can lift prices. The biggest cost is not ingredients but refrigeration, the expense of keeping the product cold, and rising energy costs squeeze ice cream makers' margins. When volumes were growing, refrigeration costs were not a big concern. Ice cream makers are also staring at the environmental impact of refrigeration: the cooling industry is hugely polluting, accounting for 10% of global carbon emissions, or three times the emissions produced by airlines and shipping combined. As temperatures rise with climate change, demand for cooling will only increase.

Not surprisingly, ice cream makers are looking to exit the business. In mid-March, London-based fast-moving consumer goods giant Unilever, which sells Dove soap, Knorr soups and condiments, Lipton tea and Hellmann's ketchup and mayonnaise, announced it would spin off the ice cream business it had painstakingly built over the past three decades through acquisitions. Ice cream has been weighing on Unilever's stock, which is down 26% since its August 2019 peak, while the benchmark S&P 500 index is up 78% over the same period.

Unilever's total global sales grew 7% last year, with volume growth of just 0.2%; essentially, almost all of its sales gains came from raising prices rather than selling more product. Its ice cream division, which accounts for just over 11% of total revenue, grew 2% last year even after raising prices by more than 10%. Refrigerating ice cream also made up the biggest slice of Unilever's global carbon footprint. Divesting the ice cream brands will help Unilever change the narrative: it can claim to be a growing company that does not rely solely on annual price increases, and can boast that it has drastically cut its carbon footprint.

Rising electricity and refrigeration costs remind me how the boom in artificial intelligence, or AI, is dramatically transforming the cooling industry. AI has been billed as being as transformative as the printing press, the steam engine, electricity, computing and the internet. Start-ups like OpenAI, alongside giants such as Google's owner Alphabet, software powerhouse Microsoft and social media supremo Meta Platforms, are developing large language models, deep learning algorithms that can recognise, summarise, translate, predict and generate content based on knowledge gained from massive datasets. To do that, they need a lot of computing power from AI chips like Nvidia's H100.

The training and inference of AI models is done in vast data centres around the world, with racks of servers running powerful AI chips. These data centres are climate-controlled environments that rely on air conditioning and airflow management to keep temperature and humidity within the tight ranges those servers require. Compute and cooling are the two most energy-intensive processes inside a data centre.

The International Energy Agency, or IEA, estimates that total electricity consumption by data centres around the world could rise to over 1,000 TWh, or terawatt-hours, by 2026 from 460 TWh in 2022. One TWh is one trillion watt-hours. To give an idea of what 1 TWh of electricity can do: it can cool 500,000 homes for a whole year, light over one million homes for a year or fully power over 70,000 homes for an entire year. Some estimates I have seen put total electricity consumption by AI data centres as high as 2,000 TWh by 2030. These are all early estimates, and I am sure they will be revised as the AI boom powers ahead.
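To show how such homes-per-terawatt-hour comparisons are derived, here is a minimal back-of-envelope sketch in Python. The per-household consumption figure is an assumption for illustration only, not an IEA number; a different assumption yields figures like the 70,000 homes per TWh cited above.

```python
# Back-of-envelope conversion of electricity (in TWh) into "homes powered for a year".
# Assumption (illustrative only): an average household uses ~11,000 kWh per year.
KWH_PER_TWH = 1_000_000_000          # 1 TWh = one trillion Wh = one billion kWh
HOUSEHOLD_KWH_PER_YEAR = 11_000      # assumed annual household consumption

def homes_powered_for_a_year(twh: float) -> float:
    """How many homes the given amount of electricity could fully power for a year."""
    return twh * KWH_PER_TWH / HOUSEHOLD_KWH_PER_YEAR

for twh in (1, 460, 1_000, 2_000):   # figures cited in the article
    print(f"{twh:>5} TWh ~ {homes_powered_for_a_year(twh):,.0f} homes for a year")
```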

Bank of America, in a recent report, estimates that power consumption by AI data centres will grow by 75% between 2023 and 2028 to over 870 TWh, equivalent to 2.8% of estimated total global power demand in 2028. The bank estimates that 10% of global incremental power demand over the next five years (30% in the US and 6% in China) will come from AI data centres. In the US, it foresees challenges as power supply and grids run short, but believes China may be better positioned to supply the power its own AI growth needs.
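As a rough check on what that projection implies, the 2023 baseline and annual growth rate can be backed out from the figures above. This is reader's arithmetic on the numbers as cited, not figures taken from the bank's report.

```python
# Rough arithmetic behind "75% growth between 2023 and 2028 to over 870 TWh":
# back out the implied 2023 baseline and the compound annual growth rate (CAGR).
twh_2028 = 870.0
cumulative_growth = 0.75   # 75% total growth over the five years 2023-2028
years = 5

twh_2023 = twh_2028 / (1 + cumulative_growth)       # implied baseline, ~497 TWh
cagr = (1 + cumulative_growth) ** (1 / years) - 1   # implied ~11.8% a year

print(f"Implied 2023 AI data centre consumption: {twh_2023:,.0f} TWh")
print(f"Implied annual growth rate 2023-2028:   {cagr:.1%}")
```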

The insatiable demand for power from AI data centres comes on the heels of the recent crypto boom. The global crypto mining industry is another huge energy user at a time when the world is grappling with climate change. The energy consumption of all crypto assets combined reached up to 0.9% of annual global electricity usage, or about 250 billion kilowatt-hours, last year. That is almost as much energy as all the world's data centres used before the switch to AI data centres began.

When you visit a large AI data centre in North America, or one of those that Nvidia or Microsoft will build in Southeast Asia over the next few years, the first thing you might notice is the sheer number of racks, the steel frameworks that house AI servers, cables and other equipment. The power needed to run generative AI creates a lot of heat.

Here is what data centres use all that energy for: Bank of America estimates up to 40% of the electricity goes to AI servers, while 20% goes to other associated IT equipment. Another 40% of the total power used in data centres over the next five years will be needed for cooling. AI models bring formidable computing capability, but at the price of higher electrical loads and heat output. More powerful AI chips will generate even more heat, which in turn requires even more electricity to cool the servers. Analysts expect liquid cooling to eventually replace air cooling in data centres, as it removes more heat and consumes up to 30% less electricity.
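A simple sketch of that power split, and of what a roughly 30% cut in cooling electricity would mean for a facility's total draw, using the 40/20/40 allocation cited above. The absolute megawatt figure is an assumption chosen purely for illustration.

```python
# Illustrative power budget using the split cited above:
# ~40% AI servers, ~20% other IT equipment, ~40% cooling.
FACILITY_MW = 100.0   # assumed total facility draw, purely for illustration
split = {"ai_servers": 0.40, "other_it": 0.20, "cooling": 0.40}

baseline = {name: FACILITY_MW * share for name, share in split.items()}
print("Baseline (air cooling), MW:", baseline)

# If liquid cooling cuts cooling electricity by ~30%, the facility total falls too.
with_liquid = dict(baseline, cooling=baseline["cooling"] * 0.70)
saving = sum(baseline.values()) - sum(with_liquid.values())
print(f"With liquid cooling: {sum(with_liquid.values()):.0f} MW "
      f"(saving {saving:.0f} MW, or {saving / FACILITY_MW:.0%} of the facility total)")
```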

The heat is on for data centres
Another way to deal with the rising heat in data centres is to manage increasing power density. For years, tech firms have been trying to cram more high-powered chips into the same amount of space. That means more power per rack and a heavier computing workload squeezed into less floor space. This higher power density, however, requires more powerful cooling solutions.

There are also small changes that can improve airflow management in data centres. Busways, the self-contained overhead power distribution systems that deliver power to server racks, can reduce cable density and promote airflow. Smart equipment can provide information on power consumption. Another solution that data centre experts tell me is gaining traction is rear-door cooling, which removes heat from the air as it leaves the back of the server rack.

One way to avoid overheating AI servers is to build modular data centres. Modular designs help equipment suppliers plan their supply chains and allow customers to ramp up quickly to meet new demand with more standardised data centre offerings. That will require greater collaboration across the data centre supply chain to develop industry standards for managing the higher power and rack density of AI servers.

Computational efficiency is another way to minimise energy consumption, and chip companies like Broadcom and Marvell Technology are working on the problem. But Morgan Stanley notes that the AI boom also provides huge opportunities for electrical equipment firms working on key solutions such as data centre infrastructure management software, connected equipment, racks, switchgear and, of course, cooling: the other picks and shovels of the AI revolution.

So where should investors look for opportunities to ride the AI boom, aside from Nvidia, Advanced Micro Devices (AMD) and Broadcom? So far, they have been chasing cloud infrastructure providers like Amazon.com, which owns AWS; software giant Microsoft, which owns Azure; and Google's owner Alphabet, whose Google Cloud is the distant third player in the arena.

One way to get exposure to the AI boom is through firms that provide the building blocks of data centres housing racks of powerful AI chips such as Nvidia's top-of-the-line H100, or its newest H200 and Blackwell chips, which will start shipping later this year.

An obvious beneficiary is Ohio-based Vertiv, a provider of critical infrastructure and services for data centres. Its offerings include alternating current (AC) and direct current (DC) power management products, switchgear and busbar products, thermal management products, integrated rack systems and management systems for monitoring and controlling digital infrastructure. On April 24, Vertiv announced that 1Q2024 earnings rose 73%. Its stock is up 86% since early January and a whopping 590% over the past year, compared with AI icon Nvidia, which is up 66% this year and 204% over the past year.

Another is Astera Labs, which is building the pipes for AI. Listed on the Nasdaq on March 20, Astera Labs is at the forefront of connectivity within data centres: essentially, it enables high-speed data transfer and expands overall system bandwidth within, and between, data centre compute platforms and AI chips. Astera priced its IPO at US$36 ($48.98) a share, and the stock surged to US$95. This past week it has been hovering around US$72, or twice its IPO price of a month ago.

Every tech boom has its darling that goes on to become a meme stock, and Super Micro Computer has that status in the AI era. Super Micro makes high-performance servers that use Nvidia's graphics chips to power generative AI in data centres. It buys Nvidia's graphics chips and designs and assembles customised servers for large customers like IBM, Hewlett Packard Enterprise and AT&T. That business model can be easily replicated, but Super Micro has long been a close Nvidia partner, which gives it access to chips that are hard to get hold of, while other large Nvidia customers often wait weeks or months for their quota. Many eventually turn to Super Micro to supply them with servers that come pre-installed with Nvidia GPUs. Super Micro's stock is up 165% this year and 820% since January last year, giving it a market capitalisation of US$44 billion. Analysts' estimates for its earnings this year are so high that, even after that run, the stock trades at just 27 times this year's expected earnings. It will take more than cutting-edge refrigeration technology to cool the AI frenzy.
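For context on that multiple, the earnings implied by the figures above can be backed out with a quick calculation. This is a reader's rough check based only on the market capitalisation and multiple cited, not an analyst estimate.

```python
# Reader's rough check: what earnings are implied by a US$44 billion market
# capitalisation trading at 27 times this year's expected earnings?
market_cap_usd_bn = 44.0   # Super Micro market capitalisation, US$ billion
forward_pe = 27.0          # price-to-earnings multiple on this year's estimates

implied_earnings_bn = market_cap_usd_bn / forward_pe
print(f"Implied earnings this year: about US${implied_earnings_bn:.1f} billion")
```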

Assif Shameen is a technology and business writer based in North America
