ChatGPT to Need More Than 30,000 Nvidia GPUs


Friday, March 3, 2023

Artificial intelligence (AI) will be one of Nvidia's biggest income generators, according to the latest TrendForce projection. The research firm estimates that OpenAI's ChatGPT will eventually need over 30,000 Nvidia graphics cards. Thankfully, gamers have nothing to be concerned about: ChatGPT won't touch the best graphics cards for gaming but will instead tap into Nvidia's compute accelerators, such as the A100.

Nvidia has always had a knack for sniffing out gold rushes. The chipmaker was at the forefront of the cryptocurrency boom, pulling in record-breaking revenues from miners. Nvidia once again finds itself on the front lines of what appears to be the next big thing: AI. And the AI boom is already here, as exemplified by all the AI-powered text and image generators that have emerged over the last several months.

Using the A100 (Ampere) accelerator for reference, TrendForce estimates that ChatGPT required around 20,000 units to process training data. However, that number should increase significantly, potentially to over 30,000 units, as OpenAI continues to deploy ChatGPT and the company's Generative Pre-trained Transformer (GPT) model commercially. The A100 costs between $10,000 and $15,000, depending on the configuration and form factor. Therefore, at the very least, Nvidia is looking at $300 million in revenue. The figure may end up slightly lower, since Nvidia will likely give OpenAI a volume discount.
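As a quick back-of-the-envelope sketch of that estimate (the 30,000-unit count and the $10,000 to $15,000 per-card range are TrendForce's projections, not confirmed order figures), the math works out as follows:

    # Hypothetical revenue estimate based on TrendForce's projected figures.
    units = 30_000                          # projected A100 count for ChatGPT
    price_low, price_high = 10_000, 15_000  # quoted per-card price range (USD)

    low = units * price_low                 # $300,000,000
    high = units * price_high               # $450,000,000
    print(f"Estimated revenue: ${low:,} to ${high:,}")

The $300 million cited in the article corresponds to the low end of that range, before any volume discount.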

Nvidia also sells the A100 as part of the DGX A100 system, which packs eight accelerators and sells for a whopping $199,000, or roughly $24,900 per A100, with the premium covering the supporting CPUs, storage, and networking. Given the scale of OpenAI's operation, the company will likely purchase A100s individually and stack them into clusters. The DGX A100, on the other hand, is an attractive option for smaller businesses that want to dip their toes into AI.

While the A100 is excellent for AI, Nvidia has already started shipping the H100 (Hopper), the direct replacement for the A100. On paper, the H100 delivers up to three times higher performance than its predecessor. Furthermore, according to Nvidia, the H100 scales even better than the A100 and offers up to nine times higher throughput in AI training. The H100 has a significantly higher price tag, though, as listings have shown that the Hopper accelerator costs over $32,000.
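Taken at face value, those numbers suggest the H100 may actually be cheaper per unit of performance despite its higher sticker price. A minimal sketch, assuming the on-paper 3x training-performance figure holds and using the midpoint of the A100's quoted price range:

    # Hypothetical price-per-performance comparison using the article's figures.
    a100_price = 12_500   # assumed midpoint of the $10,000-$15,000 A100 range (USD)
    h100_price = 32_000   # H100 price reported from retail listings (USD)
    h100_speedup = 3.0    # "up to three times higher performance" on paper

    print(f"A100: ${a100_price / 1.0:,.0f} per A100-equivalent of performance")
    print(f"H100: ${h100_price / h100_speedup:,.0f} per A100-equivalent of performance")

Under those assumptions, the H100 comes out to roughly $10,700 per A100-equivalent of performance versus $12,500 for the A100 itself.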

Nvidia's latest earnings report revealed that the company's data center business, which includes AI accelerators, grew 11% year over year and raked in over $3.6 billion in sales during the quarter. Those numbers will likely skyrocket as big players like Microsoft get into the game. Microsoft is in the process of integrating ChatGPT into Bing and Edge, and considering the size of the potential user base (basically everyone running Windows), it may have to spend billions to scale in the coming months and years.

Nvidia isn't the only option on the AI market: Intel and AMD offer rival AI accelerators, and companies like Google and Amazon have their own AI silicon. During the cryptocurrency bonanza, miners bought every graphics card in sight, contributing to the graphics card shortage. We don't expect another shortage, but GeForce gaming graphics card supply could be affected if Nvidia suddenly decides to prioritize AI accelerator production over its mainstream offerings. Only time will tell.

By: DocMemory
Copyright © 2023 CST, Inc. All Rights Reserved
