ChatGPT's growing demand for GPUs is set to directly benefit NVIDIA, one of the largest players in the AI market, reports TrendForce.
Previous estimates had put the number of GPUs powering the GPT model at around 10,000-20,000, but the new model is expected to require far more. According to the report, roughly 30,000 GPUs are expected for the latest model powering ChatGPT, and demand may climb even higher.
In the case of the Generative Pre-Trained Transformer (GPT) that underlies ChatGPT, the number of training parameters used in the development of this autoregressive language model rose from around 120 million in 2018 to almost 180 billion in 2020. According to TrendForce’s estimation, the number of GPUs that the GPT model needed to process training data in 2020 came to around 20,000. Going forward, the number of GPUs that will be needed for the commercialization of the GPT model (or ChatGPT) is projected to reach above 30,000.
Note that these estimations use NVIDIA’s A100 as the basis for calculations. Hence, with generative AI becoming a trend, demand is expected to rise significantly for GPUs and thereby benefit the participants in the related supply chain. NVIDIA, for instance, will probably gain the most from the development of generative AI. Its DGX A100, which is a universal system for AI-related workloads, delivers 5 petaFLOPS and has nearly become the top choice for big data analysis and AI acceleration.
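For rough scale, the 30,000-GPU projection can be translated into DGX A100 terms. This is a back-of-the-envelope sketch, not TrendForce's own methodology; the only inputs are the figures quoted above plus the fact that each DGX A100 system houses eight A100 GPUs:

```python
# Rough scale estimate (not TrendForce's methodology): convert the projected
# A100 count into DGX A100 systems and aggregate AI compute.
GPUS_PROJECTED = 30_000      # TrendForce's projected A100 count
GPUS_PER_DGX = 8             # each DGX A100 system houses eight A100 GPUs
PFLOPS_PER_DGX = 5           # quoted AI performance per DGX A100 system

dgx_systems = GPUS_PROJECTED // GPUS_PER_DGX
total_pflops = dgx_systems * PFLOPS_PER_DGX

print(f"{dgx_systems} DGX A100 systems")        # 3750
print(f"{total_pflops} petaFLOPS aggregate")    # 18750
```

In other words, the projection corresponds to on the order of a few thousand DGX A100 systems, which illustrates why the report expects the whole supply chain around NVIDIA to benefit.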
via TrendForce
The research firm notes that demand for AI GPUs is expected to exceed 30,000 units. That estimate is based on the A100, one of the fastest AI chips around, with up to 5 petaFLOPS of AI performance; the actual number could end up higher or lower.
Read more on wccftech.com