Nvidia to sell 550,000 H100 GPUs for AI in 2023

Sales of Nvidia H100 GPUs for accelerating artificial intelligence workloads have exploded in mid-2023.

According to a report by the Financial Times, Nvidia could sell around 550,000 H100 accelerators over the course of 2023. Most of the buyers are American companies looking to jump on the generative AI bandwagon with servers for artificial intelligence and high-performance computing (HPC).

The Financial Times cites contacts at TSMC and Nvidia itself, who claim the green company will easily surpass half a million units of the H100, its most powerful accelerator card for AI and HPC tasks.

The Nvidia H100 GPU features 80GB of HBM2e memory and 14,592 CUDA cores, delivering up to 26 TFLOPS of FP64 and 1,513 TFLOPS of FP16 performance. It is not the only model, however: there is also the H100 SXM with 80GB of HBM3, which includes 16,896 CUDA cores, 34 TFLOPS of FP64, and 1,979 TFLOPS of FP16. That model is sold only to server manufacturers such as Foxconn and Quanta, or supplied inside servers that Nvidia sells directly.

Nvidia is set to keep gaining ground in artificial intelligence, which is why it is about to launch the GH200 Grace Hopper Superchip, combining a 72-core Grace CPU with an 80 GB HBM3E H100 compute GPU.

Each H100 sells for about $30,000 in the US, so Nvidia is making an absolute fortune. The total is hard to estimate, since Nvidia does not disclose prices for the H100 SXM, H100 NVL, and GH200 Grace Hopper, but taking the H100 figure alone, the revenue would be $16.5 billion. Remember that Nvidia also continues to sell other accelerators, such as the A100, plus the China-exclusive models, the A800 and H800, so revenue in this segment should be much higher. We will keep you posted.
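The $16.5 billion figure is simple back-of-the-envelope arithmetic from the numbers quoted above; a minimal sketch, assuming the reported 550,000 units and the roughly $30,000 US street price (not official Nvidia figures):

```python
# Back-of-the-envelope H100 revenue estimate from the figures in the report.
# Assumptions: 550,000 units at ~$30,000 each; excludes H100 SXM, H100 NVL,
# GH200, and the China-specific A800/H800 models, for which no price is given.
units = 550_000
price_usd = 30_000

revenue = units * price_usd
print(f"Estimated H100 revenue: ${revenue / 1e9:.1f} billion")  # → $16.5 billion
```

Since the SXM, NVL, and GH200 prices are undisclosed, this is a lower bound on Nvidia's accelerator revenue for the year.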