Nvidia is working on an AI GPU with 144GB of HBM3E memory

  • August 7, 2024

According to TrendForce, Nvidia is working on the next-generation B100 and B200 GPUs based on the Blackwell architecture. The new GPUs are expected to become available in the second half of this year and will be used by CSP cloud customers, i.e. customers who rely on cloud service providers (CSPs) for their cloud computing needs. The company will also offer an optimized variant, the B200A, for enterprise OEM customers that need advanced AI capabilities.


TSMC’s CoWoS-L packaging, used by the B200 series, reportedly remains limited in capacity, whereas the B200A is said to use the simpler CoWoS-S packaging. With CoWoS-L supply constrained, Nvidia is reportedly counting on the B200A to keep meeting demand from cloud service providers.

Technical specifications of the B200A:

Unfortunately, the exact specifications of the B200A are not yet known. For now, the only confirmed detail is that the HBM3E memory capacity has been reduced from 192GB to 144GB. The number of HBM stacks has also reportedly been halved from eight to four, which means the capacity of a single stack has increased from 24GB to 36GB.
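
As a quick sanity check of those reported figures, here is a minimal sketch that simply assumes the total capacity is split evenly across the stacks (none of the numbers come from an official specification):

    # Per-stack HBM3E capacity implied by the reported totals,
    # assuming the capacity is divided evenly across the stacks.
    def per_stack_gb(total_gb: int, stacks: int) -> float:
        return total_gb / stacks

    print(per_stack_gb(192, 8))  # B200 as reported: 24.0 GB per stack
    print(per_stack_gb(144, 4))  # B200A as reported: 36.0 GB per stack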

The B200A will have lower power consumption than the B200 and will not require liquid cooling, which should make it easier to cool the new GPUs with air. The B200A is expected to be delivered to OEMs in the second quarter of next year.

Supply chain research suggests that the bulk of Nvidia’s high-end GPU shipments in 2024 will still be based on the Hopper platform, with the H100 and H200 serving the North American market and the H20 serving the Chinese market. The B200A, which is set to ship in Q2 2025 or later, is not expected to interfere with the H200.

Source: Port Altele
