Nvidia's stock surged toward a market cap of nearly $1 trillion in after-hours trading on Wednesday after the company reported a shockingly strong outlook, and chief executive Jensen Huang said the company was on track for a "record year."
Sales rose due to a surge in demand for graphics processing units (GPUs) made by Nvidia that power artificial intelligence applications from companies including Google, Microsoft and OpenAI.
Demand for artificial intelligence chips in data centers helped Nvidia achieve sales of $11 billion in the quarter, beating analysts’ forecast of $7.15 billion.
“The tipping point is generative AI,” Huang said in an interview with CNBC. “We knew CPU scaling had slowed down, we knew accelerated computing was the way forward, and then the killer app came along.”
Huang said Nvidia is seeing a clear shift in the way computers are built, one that could drive even more growth: components for data centers could become a $1 trillion market.
Historically, the most important component in a computer or server was the central processing unit, or CPU, a market dominated by Intel with AMD as its main competitor.
With the advent of artificial intelligence applications, graphics processing units (GPUs) have taken center stage: the most advanced systems pair as many as eight GPUs with a single CPU. Nvidia currently dominates the market for AI GPUs.
“The data centers of the past were mainly CPUs for file retrieval, and the future will be generating data,” Huang said. “Instead of retrieving data, you’re going to retrieve some data, but you have to use artificial intelligence to generate most of it.”
“So instead of millions of CPUs, you’ll have fewer CPUs, but they’ll be connected to millions of GPUs,” Huang continued.
For example, Nvidia's own DGX system, essentially an AI computer for training, uses eight of Nvidia's high-end H100 GPUs and only two CPUs.
Google's A3 supercomputer pairs eight H100 GPUs with a high-end Xeon processor made by Intel.
That is why Nvidia's data center business grew 14% in the first quarter, while AMD's data center business was flat and Intel's AI and data center business unit declined 39%.
Nvidia's GPUs also tend to be more expensive than many CPUs. Intel's latest Xeon CPU can list for as much as $17,000, while a single Nvidia H100 can sell for $40,000 on the secondary market.
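As a rough, illustrative back-of-envelope calculation using only the prices and chip counts quoted above (not actual negotiated system prices), the GPU portion of an eight-GPU, two-CPU node dwarfs the CPU portion:

```python
# Illustrative estimate based on the figures in this article only:
# a high-end Xeon at $17,000 list and an H100 at $40,000 secondary-market.
XEON_PRICE = 17_000
H100_PRICE = 40_000

# A DGX-style node pairs eight H100 GPUs with two CPUs.
gpu_cost = 8 * H100_PRICE   # 320,000
cpu_cost = 2 * XEON_PRICE   # 34,000
node_cost = gpu_cost + cpu_cost

print(f"GPUs:  ${gpu_cost:,}")   # GPUs:  $320,000
print(f"CPUs:  ${cpu_cost:,}")   # CPUs:  $34,000
print(f"Total: ${node_cost:,}")  # Total: $354,000
```

By this sketch, roughly 90% of the silicon cost of such a node goes to Nvidia's GPUs, which helps explain why the shift Huang describes matters so much to the company's revenue.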
As the market for artificial intelligence chips heats up, Nvidia will face tougher competition. AMD has a very competitive GPU business, especially in gaming, and Intel has its own line of GPUs. Startups are developing new chips specifically for AI, while companies focused on mobile devices, such as Qualcomm and Apple, keep pushing the technology so that one day it can run in your pocket rather than in a giant server farm. Google and Amazon are designing their own AI chips.
But Nvidia's high-end GPUs remain the chip of choice for companies building applications such as ChatGPT, which are expensive to train by crunching terabytes of data and also expensive to run later in a process called "inference," which uses the model to generate text or images or to make predictions.
Analysts say Nvidia remains ahead in AI chips because its proprietary software makes it easier to use all of the GPU hardware's capabilities for AI applications.
Huang said on Wednesday that the company’s software cannot be easily replicated.
"You have to design all the software, all the libraries and all the algorithms, integrate them into the framework and optimize them, and optimize them for the architecture, not just a chip, but the architecture of the entire data center," Huang said on a conference call with analysts.