Nvidia is working on a new variant of the A100 GPU with 80GB of HBM2e memory and a PCIe 4.0 interface. Currently, the PCIe variant of this accelerator only comes with 40GB of video memory.
The A100 PCIe with 80GB of HBM2e memory recently appeared on Nvidia's data center website, without an official announcement. Nvidia released an 80GB variant of the A100 GPU earlier, but it was delivered only as an SXM4 module, which is mounted directly on the motherboard. The company had not previously offered the 80GB A100 as a PCIe interface card.
Nvidia has yet to announce a release date for the new PCIe variant, but according to VideoCardz, which cites anonymous sources, the GPU should be available next week. The new 80GB variant will have 2TB/s of memory bandwidth, just like the 80GB SXM4 module.
The Nvidia A100 is a data center GPU based on the Ampere architecture, which is also used in the company's GeForce RTX 30 graphics cards. The chip has a surface area of 826mm² and consists of 54 billion transistors. The A100 features 6,912 CUDA cores. The structure of those cores differs from that of the CUDA cores in Nvidia's recent GeForce consumer graphics cards, making these core counts not directly comparable to those of the company's RTX 30 graphics cards.
|Nvidia A100 specifications|A100 PCIe|A100 SXM4|
|---|---|---|
|Memory|40GB / 80GB|40GB / 80GB|
|Memory bandwidth|40GB: 1555GB/s, 80GB: 2039GB/s?|40GB: 1555GB/s, 80GB: 2039GB/s|
|TDP|40GB: 250W, 80GB: not yet announced|40GB: 400W, 80GB: 400W|