‘The world’s most valuable chipmaker’: How Nvidia saw a 250% surge in revenues amid AI stock rally | Explained News
Shares of Nvidia Corporation, the undisputed king of advanced chips that are driving artificial intelligence (AI) applications, surged the most in about nine months after the American chipmaker delivered yet another bumper earnings forecast.
Nasdaq-listed Nvidia’s stock surge added fresh momentum to what is being described as a ‘FOMO’ AI stock rally, which has already propelled the Santa Clara-based chipmaker to the title of the world’s most valuable chipmaker, eclipsing storied competitors such as Intel and AMD.
Nvidia’s revenues during the fourth quarter of its fiscal year (ended January 2024) topped $22 billion, an over 250% surge compared to the previous year, beating analysts’ consensus forecast of around $20 billion. Its gross margin – a key profitability metric – surged to well over 70% during the quarter.
“Accelerated computing and generative AI have hit the tipping point,” Chief Executive Officer Jensen Huang said in the statement. “Demand is surging worldwide across companies, industries and nations.”
According to Bloomberg data, Nvidia’s market capitalization has now increased by more than $600 billion this year — bringing its valuation to nearly $1.90 trillion. Before the results announcement, analysts at Goldman Sachs had pegged Nvidia as “the most important stock on planet Earth”, a reflection of the fact that the chipmaker currently controls over 80% of the AI chip market and is likely to remain in a dominant position well into the future.
The Nvidia rally seems to have lifted many boats, with shares of American ‘voice AI’ startup SoundHound and British chip designer Arm, two companies in which Nvidia recently disclosed investments, both surging nearly 5 per cent in Thursday’s (February 22) trade.
Competitor AMD also surged over 10 per cent, while AI-linked stock Palantir rose over 2.5 per cent. Other chipmakers in the AI race, such as Broadcom and Marvell Technology, surged too, in what analysts are describing as an AI FOMO, or ‘fear-of-missing-out’, rally.
GPU wave
Nvidia has a stranglehold over highly prized chips called graphics processing units, or GPUs, which crunch data for AI models. There are two reasons for the resulting shortage. One, the generative AI boom has caused demand for these specialised chips to skyrocket: GPUs have the computing power and operational efficiency to run the calculations that let companies building large language models (LLMs) — the technology behind tools such as ChatGPT or Bard — chomp through massive volumes of data.
Two, Nvidia’s virtual monopoly over GPUs has meant that the chipmaker is now swamped with orders that it is struggling to deliver.
Moody’s Investors Service, in its just-published 2024 AI outlook, has said “growing AI spending, model improvement and edge computing” will speed up AI adoption and that AI investment will rise as firms move “from exploration to deployment”. “A shortage of high-performance graphical processing units (GPUs), essential for most AI computing, will persist in 2024, but supply will improve gradually”, Moody’s noted.
Traditionally, the CPU, or central processing unit, has been the most important component in a computer or server, and Intel and AMD dominate that market. GPUs are relative newcomers to the computer hardware market; they were initially sold as cards that plugged into a personal computer’s motherboard to add computing power to an AMD or Intel CPU.
Nvidia’s main pitch over the years has been that graphics chips handle the kind of massively parallel computation needed for high-end gaming or animation workloads far better than standard processors. AI applications too require tremendous computing power, and their backend hardware has been getting progressively GPU-heavy.
Most advanced systems used for training generative AI tools now deploy as many as half a dozen GPUs for every CPU, upending the old equation in which GPUs were mere add-ons to CPUs. Nvidia dominates the global market for GPUs and is likely to maintain this lead for the foreseeable future.
AI wave, data centre demand
If Taiwan-based foundry specialist TSMC is the most important backend player in the semiconductor chips business, Nvidia (alongside Intel, AMD, Samsung and Qualcomm) is at the front end. Since 1999, when Nvidia first popularised the term GPU with its GeForce 256 processor, the company’s chips have been coveted for pushing the limits of what is possible in graphics. Nvidia GPU chips such as the new ‘RTX’ range are now at the forefront of the generative AI boom based on LLMs.
Nvidia’s data centre business recorded growth of over 10% during most of the last calendar year, versus flat growth for AMD’s data centre unit and a dip in Intel’s data centre business. In addition, the company’s GPUs command far higher per-unit prices than most CPUs, resulting in better margins.
According to Aravind Srinivas, founder and CEO of Perplexity AI, Nvidia is ahead in the race for AI chips because of its proprietary software that makes it easier to leverage all of the GPU hardware features for AI applications.
Nvidia also makes the systems that back the processors up, and the software that runs it all, making it a full-stack solutions company. (Incidentally, Nvidia has a stake in Perplexity, which is seen as a smaller but promising rival that could take on OpenAI and Google.)
In addition to manufacturing GPUs, Nvidia offers an application programming interface (API) called CUDA. An API is a set of defined instructions that enables different applications to communicate with each other; CUDA allows developers to write parallel programs that run on GPUs, and it is deployed in supercomputing sites around the world. Nvidia also has a foothold in the mobile computing market with its Tegra processors for smartphones and tablets, as well as products for vehicle navigation and entertainment systems.
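CUDA’s core abstraction is data parallelism: a small function (a “kernel”) runs simultaneously across thousands of GPU threads, each handling one element of the data. Real CUDA kernels are written in C/C++ and dispatched to the GPU; as a purely illustrative sketch (in Python, with a hypothetical `launch` helper standing in for a kernel launch), the model looks like this:

```python
# Illustrative sketch of the CUDA execution model, in plain Python.
# In real CUDA C/C++, `vector_add` would be a __global__ kernel, and the
# sequential loop below would be replaced by thousands of GPU threads
# running at once, each receiving its own index via blockIdx/threadIdx.

def vector_add(a, b, out, i):
    """The 'kernel': each thread computes exactly one output element."""
    out[i] = a[i] + b[i]

def launch(kernel, n, *args):
    """Stand-in for a kernel launch: invoke the kernel once per index.
    On a GPU, these n invocations would execute in parallel."""
    for i in range(n):
        kernel(*args, i)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * len(a)
launch(vector_add, len(a), a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

Because each output element is independent, the work scales across however many GPU threads are available — which is why LLM training, dominated by exactly this kind of elementwise and matrix arithmetic, runs so much faster on GPUs than on CPUs.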
Entry barriers
Nvidia’s resilience is a case study in a segment that has very high entry barriers and offers a phenomenal premium for specialisation. The global semiconductor chip industry is dominated by a handful of countries and companies. Taiwan and South Korea together account for about 80% of the global foundry base for chips.
Only one firm, the Netherlands-based ASML, produces the EUV (extreme ultraviolet) lithography machines without which it is not possible to make an advanced chip. Cambridge, UK-based chip designer Arm, which Nvidia once sought to acquire and in which it now holds a stake, is the world’s biggest supplier of chip design elements used in products from smartphones to gaming consoles.
It’s a nearly closed manufacturing ecosystem with very high entry barriers, as China’s SMIC, a national semiconductor champion that is now reportedly struggling to procure advanced chip-making equipment after a US-led blockade, is finding out. In this market, Nvidia, which comprehensively dominates the chips used for high-end graphics-based applications, has extended that dominance into multiple end-use sectors including gaming, crypto mining, and now AI.
Diversification and business risks
In an apparent bid to diversify risks and vertically integrate further, Nvidia said it is releasing a new tool that lets owners of its latest series of graphics cards run an AI-powered chatbot offline on a Windows PC. The tool, called ‘Chat with RTX’, will allow users to customise a generative AI model along the lines of OpenAI’s ChatGPT or Google’s Bard by linking it to files, documents, and notes that they can then query.
“Rather than searching through notes or saved content, users can simply type queries,” Nvidia said in a blog post last Tuesday.
Nvidia’s chatbot push comes at a time when OpenAI CEO Sam Altman is seeking trillions of dollars in investments to revamp the global semiconductor industry, The Wall Street Journal reported earlier this month. Altman, who’s behind the startup that launched ChatGPT, the fastest-growing consumer software application in history, has repeatedly voiced concern over the supply-and-demand problem with AI chips which, he has said, is limiting the growth of OpenAI.
Altman is now reportedly in talks with multiple investors for a project that could increase global chip-building capacity, according to the WSJ report. Altman is not alone. SoftBank Group Chief Executive Officer Masayoshi Son is also looking to raise up to $100 billion for a chip venture that would rival Nvidia, Bloomberg News reported on Friday, citing people with knowledge of the matter. The project, codenamed ‘Izanagi’, will likely supply semiconductors essential for AI, the report said.
These announcements are being seen as indicating the direction the generative AI discourse could take going forward. They also point to a gradual blurring of the lines between the two most consequential players in the AI story (Nvidia and OpenAI), and a likely intersection of interests between those who currently stand on opposite sides of the hardware-software divide.