Demand for Nvidia graphics cards is outstripping supply, largely due to the success of ChatGPT, which is pushing tech giants to invest in AI accelerators for ChatGPT-style tools.
ChatGPT seems to be changing the market, and Nvidia is one of the main beneficiaries
ChatGPT and other text, image, and video generation tools rely heavily on AI processing power, and Nvidia is a major player, with graphics cards that accelerate exactly these workloads. Growing demand will strain GPU production capacity, which in turn could affect Nvidia's ability to manufacture GeForce GPUs for the gaming market.
According to FierceElectronics, ChatGPT (OpenAI's beta) was trained on 10,000 Nvidia GPUs, but since its public launch demand has grown and requires more processing power. OpenAI recently announced its $20-per-month Plus plan to ensure an always-available service, even at peak times. ChatGPT currently has 25,000 GPUs deployed.
“It is possible that ChatGPT or other deep learning models may be trained or run on third-party GPUs in the future. Today, however, Nvidia GPUs are widely used in the deep learning community because of their high performance and support for CUDA. CUDA is a parallel computing platform and programming model that enables efficient computations on Nvidia GPUs. Many deep learning libraries and frameworks such as TensorFlow and PyTorch are CUDA compatible and optimized for these GPUs.”
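The CUDA support the quote describes surfaces directly in those frameworks. As a minimal sketch (assuming PyTorch is installed), this is how code detects an Nvidia GPU and falls back to the CPU when none is present:

```python
import torch  # PyTorch, one of the CUDA-optimized frameworks mentioned above

# torch.cuda.is_available() reports whether a CUDA-capable Nvidia GPU
# (with a working driver) is present; otherwise we fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Tensors (and models) are moved to the selected device the same way,
# so the same script runs on an A100, a GeForce card, or a plain CPU.
x = torch.ones(2, 2, device=device)
total = x.sum().item()
print(device, total)
```

This device-agnostic pattern is why demand concentrates on Nvidia hardware: the CUDA path is the one these frameworks are optimized for, while other accelerators need separate backends.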
We know that other giants such as Microsoft and Google are also planning to integrate ChatGPT-like technology into their search engines, so demand for graphics accelerators will only increase in 2023.
Forbes estimates that Google would need 512,820 A100 HGX servers, with a total of 4,102,568 A100 GPUs, to apply this technology to every search query. That would cost Google an investment of around $100 billion.
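As a rough sanity check on those figures, assuming the standard 8 A100 GPUs per HGX server (the exact totals in the Forbes model may be rounded slightly differently):

```python
# Back-of-envelope check of the Forbes estimate. The 8-GPU figure is the
# standard Nvidia HGX A100 configuration; small rounding differences from
# the article's exact GPU total are expected.
servers = 512_820
gpus_per_server = 8
total_gpus = servers * gpus_per_server
print(f"{total_gpus:,} GPUs")  # on the order of 4.1 million, as cited

investment = 100e9  # the $100 billion figure cited above
cost_per_server = investment / servers
print(f"${cost_per_server:,.0f} per server")
```

The implied cost works out to roughly $195,000 per 8-GPU server, which gives a sense of why the totals escalate so quickly at search-engine scale.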
Nvidia’s coffers will swell as a result, although this could also limit its capacity to produce GPUs for games, since the green team’s real business is no longer there but in AI GPUs such as the A100 and H100. We will keep you updated on this.