The Manila Times

AI boom catapults Nvidia into tech’s big league

- Richard Waters in San Diego; additional reporting by Michael Acton in San Francisco

TWO years ago, Nvidia made most of its money selling graphics cards. It was a household name only to the most dedicated PC gamers.

Today, the chip designer is one of the world’s most valuable companies, the chief beneficiary of an artificial intelligence boom that has transfixed Silicon Valley and Wall Street, and an avatar of the new tech economy that is expected to dominate the next decade.

Last month, it turned in the fourth quarter in a row of blowout earnings that have made it the unchallenged leader in the infrastructure on which the coming era of AI is being built.

Thanks to widespread use of its chips to train and run the large language models that underpin generative AI, Nvidia’s sales to data centre customers were five times higher than a year before, while its after-tax profits jumped from $1.4bn to more than $12bn — figures in excess of even the most optimistic forecasts.

The stunning results pushed Nvidia to an intraday valuation of $2tn, making it the third most valuable tech company after Apple and Microsoft. The jolt from its earnings also touched off a wider stock market rally, pushing the S&P 500 index above 5,000 to a new record high.

But even as Nvidia exhausted the superlatives on Wall Street, the risks continue to loom large. Chip companies are prone to deeply cyclical swings in demand as investment booms rise and fall, leaving them vulnerable to severe setbacks — a fact unlikely to be lost on more seasoned Nvidia investors, who lived through a 65 per cent collapse in its share price as recently as 2022.

A stampede by new investors into the stock has also left it increasingly vulnerable to even small perceived setbacks. However impressive its business performance, it has become a prisoner of the overwrought stock market expectations its success has fuelled, according to Pat Moorhead, a US chip analyst. “How do you keep crushing earnings [estimates] consistently, so many times? How do you keep that going?” he says.

The latest evidence of Nvidia’s dizzying rise has brought two overriding questions to the fore. One is whether the AI chip boom will turn out to be as large and durable as the tech and financial worlds have come to believe. The other is whether Nvidia, which dominates this market, can withstand the onslaught of competition that has been unleashed against it, including from some of its biggest customers.

On the first question, chip industry luminaries have been outdoing each other in recent weeks in their predictions for the demand that generative AI would bring.

Lisa Su, chief executive of chipmaker AMD, one of Nvidia’s most serious challengers, predicted late last year that the market for AI chips would reach $400bn annually by 2027 — more than double her previous estimate, and as much as the entire global chip market was worth as recently as 2019.

Nvidia chief executive Jensen Huang, never one to be upstaged when it comes to expansive predictions, has said for some time that $1tn worth of equipment in the world’s data centres needs to be overhauled to lay the ground for the new AI era. In recent weeks he has upped the ante, with a new prediction that the total value of all the gear in data centres will rise to $2tn in the next four or five years.

At almost the same moment this week that Nvidia revealed its earnings, Sam Altman, chief executive of OpenAI, whose ChatGPT system kick-started the AI boom, was adding his voice to the growing chorus calling for massive extra investment in all types of AI infrastructure, including chips. “I think everybody is underestimating the need for AI [computing resources],” Altman said at an Intel event. He is seeking deep-pocketed investors for his own venture to create AI chips.

At this early stage in the AI bonanza, the forecasts amount to little more than stabs in the dark as to the market’s ultimate size, according to Jim Tierney, a growth stock investor at AllianceBernstein. But the news from Nvidia this week helped to reassure Wall Street that demand looked set to remain strong at least throughout this year and well into 2025.

Nvidia’s forecast for the current quarter was well above expectations. It does not offer longer-term predictions, but the company’s executives said the supply of new products due out this year would be tight, even as demand for existing chips remained stretched — an apparent indication that supply was unlikely to catch up with demand this year.

In another sign that demand for Nvidia’s chips may be more durable than some had feared, Huang revealed that 40 per cent of the company’s data centre revenue in the latest quarter came from AI inference — that is, applying AI models to solve problems, rather than from the model training that has been the main source of Nvidia’s AI dominance.

This appeared to answer a perennial concern about the company, that its expensive chips would be less in demand for inference, a market that in the past has required less powerful processors. It also lessened worries about the impact on Nvidia from a potential slowdown in 2025 in AI training.

Yet the durability of the AI infrastructure boom still depends heavily on whether the ultimate customers for the technology — businesses that hope to use it to boost their revenues or increase their efficiency — get value for money.

Most companies have barely begun testing generative AI in their businesses or worked out how to deal with the particular problems the technology presents, such as its tendency to “hallucinate”, or produce inaccurate results. In the words of Stacy Rasgon, a chip analyst at Bernstein: “What if there’s no return on these assets?”

The tech world has had ample experience of what happens when expectations for a new technology outrun reality. In the early 2000s, Cisco briefly became the world’s most valuable company as demand surged for the fibre-optic equipment needed to handle internet traffic.

Yet the technology initially failed to produce the anticipated boom in business, leading to bankruptcies among telecoms companies and a collapse of nearly 90 per cent in Cisco’s share price.

Early in the generative AI boom, there are at least some signs that the course of tech history will be different this time. “The fibre was put in the ground [in the 1990s] before it was needed. That’s not the case with AI chips, they aren’t being put in a warehouse somewhere,” says Rasgon.

Instead, demand for chips to train and run large language models is running well ahead of supply, with many developers struggling to get access to the hardware they need.

Stock market valuations also look far less stretched than they did in the early days of the internet. At the peak of the internet boom, Cisco’s shares traded at about 200 times earnings. By contrast, Nvidia’s shares stood at around 32 times next year’s expected earnings before the company announced its results this week — a multiple that has remained broadly unchanged, as both its share price and many analysts’ earnings forecasts jumped in tandem.

At that level, Nvidia’s shares are well within their historic valuation range. Yet that may provide little protection from a severe valuation reset if the AI infrastructure boom dries up — or if competitors begin to catch up to Nvidia.

Challengers are beginning to emerge. The three big cloud companies — Microsoft, Amazon and Google — want to go from being Nvidia customers to competitors; all three have designed their own chips.

At the same time, chip industry rivals are belatedly starting to catch up with the performance of Nvidia’s most advanced chips. AMD has already released a next-generation chip generally agreed to outperform Nvidia’s equivalent.

The market is hungry for more competition, if only to put a cap on the premium prices Nvidia charges — a cushion that enabled it to boost its gross profit margin from 57 per cent to 73 per cent last year.

“The [giant cloud companies] want supply options,” says Moorhead, the chip analyst. “They’re going to throw some business to AMD and some to Intel. They want a three-horse race.”

Yet for now, with the AI boom in full swing, Nvidia’s position looks as robust as ever. AMD, for instance, has set a target of only $3.5bn in AI chip sales for this year. Even if it blows past that figure, says Rasgon, its AI business would still be a “rounding error” compared with Nvidia’s sales to data centre customers, which Wall Street expects to approach $100bn.

The in-house chips made by the big cloud companies, meanwhile, may cut into Nvidia’s business as they take on more of the work of training the companies’ own AI models. But they may have little broader impact, at least in the short term.

Many of the cloud companies’ customers continue to demand access to Nvidia’s chips. The many years of work that have gone into creating the tools and frameworks that make it easier to program Nvidia’s chips for specific tasks have created an inertia that will be hard to break.

Even if rivals manage to match the performance of some of Nvidia’s individual offerings, none has anything close to the range of chips, systems and software tools that Nvidia has built up. Its overall set of technologies amounts to a powerful platform for AI that many customers will hesitate to switch away from, says Moorhead.

So for 2024, at least, it looks like Nvidia will remain on top. But given the scale of the potential market for AI, as well as the wealth and ambition of the tech giants seeking to take their cut of it, the company will have to work hard to stay there.

Photo by JOEL SAGET / AFP: This photograph, taken in Paris on February 23, 2024, shows an Nvidia graphics processing unit (GPU).
