Nvidia, one of the world's biggest GPU companies, made a series of announcements at the Computex show in Taipei. The company plans to further boost the performance of its products to deliver more AI horsepower for data intelligence workloads. Paresh Kharya, senior director of product management at Nvidia, also underlined the current trend of AI being infused into every customer engagement, adding that data centers are transforming into AI factories.
144 Armv9 cores
For those AI factories, Nvidia is counting on the Grace chip, which was first announced last year. Grace is a CPU (not a GPU) that offers up to 144 Armv9 cores in a two-chip configuration connected with Nvidia's NVLink interconnect technology. Another configuration pairs a Grace CPU with an Nvidia Hopper GPU. These chips are expected to become available in the first half of 2023.
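As a rough illustration of what that core count means in practice, here is a minimal Python sketch for inspecting a node's CPU and GPU inventory. It assumes a Linux system with the NVIDIA driver stack installed; the 2 × 72 per-die split is an assumption inferred from the 144-core total, and os.cpu_count() and nvidia-smi are standard tools rather than anything Grace-specific.

```python
import os
import shutil
import subprocess

# Assumption (not from the article): 144 total cores split as 2 dies x 72 cores,
# connected over NVLink. Actual per-die topology may differ.
CORES_PER_GRACE_DIE = 72
GRACE_SUPERCHIP_CORES = 2 * CORES_PER_GRACE_DIE  # 144 cores advertised

print(f"Advertised Grace two-chip core count: {GRACE_SUPERCHIP_CORES}")
print(f"CPU cores visible to this OS:         {os.cpu_count()}")

# On a node that also carries Nvidia GPUs (e.g. a Grace + Hopper setup),
# nvidia-smi can list the attached devices if the driver stack is present.
if shutil.which("nvidia-smi"):
    subprocess.run(["nvidia-smi", "-L"], check=False)
```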
Beyond the upcoming Grace chips, Nvidia shared details about its two-rack-unit (2U) server reference designs. CGX, one of the 2U designs, focuses on cloud graphics and gaming; it combines a Grace chip, Nvidia A16 GPUs, and BlueField-3 data processing units. Another design, OVX, targets AI digital twins and Omniverse workloads; it uses the same hardware as CGX, except that suitable GPUs can be selected in place of the A16s. The HGX Grace and HGX Grace Hopper 2U designs, in turn, focus on AI training and inference.
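To keep the lineup straight, here is a small Python sketch that restates the 2U reference designs described above as a plain data structure. The entries simply mirror the article's descriptions and are not an official Nvidia specification.

```python
# Illustrative summary of the 2U reference designs mentioned above
# (restated from the article, not an official Nvidia spec sheet).
reference_designs = {
    "CGX": {
        "focus": "cloud graphics and gaming",
        "components": ["Grace CPU", "Nvidia A16 GPUs", "BlueField-3 DPUs"],
    },
    "OVX": {
        "focus": "AI digital twins and Omniverse workloads",
        "components": ["Grace CPU", "OEM-selected GPUs", "BlueField-3 DPUs"],
    },
    "HGX Grace / HGX Grace Hopper": {
        "focus": "AI training and inference",
        "components": ["Grace CPU", "Nvidia Hopper GPU (Grace Hopper variant)"],
    },
}

for name, spec in reference_designs.items():
    print(f"{name}: {spec['focus']} ({', '.join(spec['components'])})")
```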
The company will also offer liquid-cooled GPU products: in the third quarter of this year, Nvidia plans to ship A100 GPUs with direct-to-chip liquid cooling.