Is there any way that a company can still be considered a bargain if its share price has risen more than five-fold in little more than a year, putting it on the verge of becoming the world’s third most-valuable tech concern?
The extraordinary rise of Nvidia has set new standards in hyperbole on Wall Street, as analysts have raced each other to repeatedly ratchet up their earnings estimates and share price forecasts — including, in the last week, Morgan Stanley and Bank of America. Its shares are up another 45 per cent, or $500bn, since just the start of this year, putting it close to eclipsing Amazon and Alphabet.
Yet even after the latest run-up, Nvidia may still be reasonably valued. With earnings estimates rising almost as fast as the stock, the shares now trade at about 30 times this year’s expected profits, still modest relative to the company’s own historical average and decidedly cheap for a business in the midst of a two-year growth spurt expected to quadruple its revenue.
The massive swings in market sentiment around Nvidia are an object lesson in the difficulty of assessing the pay-off from a potentially massive new tech market — with the risk of underestimation being as big as that of overestimation.
It is far from clear yet, for instance, whether generative AI will be as transformative as its promoters claim.
Despite the viral success of ChatGPT, it does not look like it is becoming the sort of breakout consumer application capable of creating a massive new online market, like internet search or social networking. And in the business world, most companies are still at the earliest stages of assessing whether the technology will boost productivity or open up new markets. The giant wave of investment flooding into the field — and lifting Nvidia’s fortunes — is likely to dry up quickly if those returns don’t start to materialise this year.
Even if generative AI does live up to the hype, meanwhile, Nvidia may not be best-positioned for the pay-off. It has been in the sweet spot so far, thanks to the superior performance of its chips for training large language models such as OpenAI’s GPT-4. But it is still unclear what kind of silicon will be most in demand as the boom plays out. The need for continuous training and retraining of models may keep Nvidia’s chips in demand, but an explosion of smaller AI models, along with the work of running trained models to handle specific tasks, known as inference, could see much of the spending shift to other types of chip.
An inevitable slowdown from the current torrid pace is already on the cards, with Wall Street expecting Nvidia’s revenue growth rate to fall back to below 10 per cent in 2025. Even if this turns out to be just a pause as data centre customers digest two years of huge investment, red-hot growth stocks rarely manage that kind of abrupt slowdown smoothly.
For all that, Nvidia’s position at the centre of the AI computing revolution is hard to overstate. This time last year, it looked like a company with the right product at the right time, thanks to its recently launched H100 chip for training large language models. Twelve months on, far from being a one-chip wonder, it has raced to expand its range of chips, while also developing a full systems and software capability to become a much broader data centre technology vendor, one purpose-built for the new demands of AI.
It will have to move fast. Its biggest customers — the “hyperscale” cloud companies such as Microsoft and Amazon — have been rushing to develop rival chips of their own. With a gross margin of more than 70 per cent, and data centre sales forecast to top $80bn this year, Nvidia was always likely to face some fierce competition.
Hardware performance is only part of the equation, though. After years building the software tools that developers need to use its graphics processing units for a wide range of data-intensive tasks, Nvidia has a strong following in the tech world that will not be shaken in a hurry.
Making the computing power of its chips available as a “supercomputer in the cloud” should further strengthen its ties to some of the biggest users of AI. Along with an expanding range of software, this should bring it a bigger slice of the overall AI pie, even as its share of silicon sales falls.
Like all tech stocks that have been through a period of torrid growth, Nvidia faces bumpy times ahead. But for now, with the world racing to build a new AI computing infrastructure, it is still hard to stand in the way of the Nvidia juggernaut on Wall Street.