
DGX SuperPOD With
DGX GB200 Systems

Introducing DGX SuperPOD With DGX GB200 Systems

NVIDIA

NVIDIA DGX SuperPOD™ with DGX GB200 systems is purpose-built for training and inferencing trillion-parameter generative AI models. Each liquid-cooled rack features 36 NVIDIA GB200 Grace Blackwell Superchips (36 NVIDIA Grace CPUs and 72 Blackwell GPUs) connected as one with NVIDIA NVLink. Multiple racks connect with NVIDIA Quantum InfiniBand to scale up to tens of thousands of GB200 Superchips.
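To make the per-rack and cluster-scale numbers above concrete, here is a minimal Python sketch (a hypothetical helper, not part of any NVIDIA software) that tallies Grace CPUs and Blackwell GPUs from the figures quoted in this overview: 36 GB200 Superchips per rack, each pairing one Grace CPU with two Blackwell GPUs.

```python
# Back-of-the-envelope accounting for a DGX GB200 SuperPOD, using only the
# figures quoted above. Names and function are illustrative, not an NVIDIA API.
GRACE_CPUS_PER_SUPERCHIP = 1      # each GB200 Superchip pairs 1 Grace CPU...
BLACKWELL_GPUS_PER_SUPERCHIP = 2  # ...with 2 Blackwell GPUs
SUPERCHIPS_PER_RACK = 36          # per liquid-cooled rack

def superpod_totals(num_racks: int) -> dict:
    """Return Superchip, CPU, and GPU counts for a SuperPOD of `num_racks` racks."""
    superchips = num_racks * SUPERCHIPS_PER_RACK
    return {
        "superchips": superchips,
        "grace_cpus": superchips * GRACE_CPUS_PER_SUPERCHIP,
        "blackwell_gpus": superchips * BLACKWELL_GPUS_PER_SUPERCHIP,
    }

# One rack: 36 Superchips -> 36 Grace CPUs and 72 Blackwell GPUs, as stated above.
print(superpod_totals(1))
# Scaling out over InfiniBand, e.g. 300 racks -> 10,800 Superchips.
print(superpod_totals(300))
```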



Highly efficient, liquid-cooled, rack-scale design built with NVIDIA GB200 Grace Blackwell Superchips

36 NVIDIA Grace CPUs and 72 NVIDIA Blackwell GPUs per rack, connected via fifth-generation NVLink

Scale to tens of thousands of GB200 Superchips with Quantum-2 InfiniBand

Intelligent, full-stack resilience for constant uptime

Integrated hardware and NVIDIA AI software

Built, cabled, and factory tested before delivery and installation

Optional 576-GPU NVLink configuration for memory-limited workloads

It is a turnkey solution covering everything needed for end-to-end AI workloads, from high-performance infrastructure to the MLOps software required for deployment.
