DGX Station



NVIDIA DGX Station A100 delivers 2.5 petaFLOPS of AI performance and is designed to run demanding compute workloads simultaneously, including training, inference, and data analytics. It brings data center technology to the desk without additional IT infrastructure, complicated installation, or data center power and cooling: a simple plug-and-go approach.

DGX Station A100 is the world's only workstation with Multi-Instance GPU (MIG) capability, offering a multi-user experience by providing up to 28 separate GPU instances to individual users. Four fully interconnected NVIDIA A100 Tensor Core GPUs with up to 320 GB of total GPU memory deliver data center performance, yet the system plugs into a standard power outlet and is up and running in minutes.
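The 28-instance figure follows from the A100's MIG architecture, which allows each GPU to be partitioned into at most seven isolated instances. A minimal sketch of the arithmetic:

```python
# Sketch of the MIG math behind the "up to 28 GPU instances" figure.
# Each NVIDIA A100 can be partitioned into at most 7 MIG instances,
# and the DGX Station A100 contains 4 A100 GPUs.
MIG_INSTANCES_PER_A100 = 7
GPUS_IN_DGX_STATION = 4

max_instances = MIG_INSTANCES_PER_A100 * GPUS_IN_DGX_STATION
print(max_instances)  # 28 independent GPU instances at maximum partitioning
```

In practice the number of usable instances depends on which MIG profiles are chosen, since larger profiles consume more of each GPU's memory and compute slices.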

SYSTEM SPECIFICATIONS

                              NVIDIA DGX Station A100 320GB   NVIDIA DGX Station A100 160GB
GPU Memory                    320 GB total                    160 GB total
GPUs                          4x NVIDIA A100 80 GB            4x NVIDIA A100 40 GB
Performance                   2.5 petaFLOPS AI; 5 petaOPS INT8
System Power Usage            1.5 kW at 100–120 VAC
CPU                           Single AMD EPYC 7742, 64 cores, 2.25 GHz (base) to 3.4 GHz (max boost)
System Memory                 512 GB DDR4
Networking                    Dual-port 10GBASE-T Ethernet LAN;
                              single-port 1GBASE-T Ethernet BMC management port
Storage                       OS: 1x 1.92 TB NVMe drive;
                              internal storage: 7.68 TB U.2 NVMe drive
Display                       4 GB GPU memory, 4x Mini DisplayPort
Acoustics                     <37 dB
Software                      Ubuntu Linux OS
System Weight                 91.0 lb (43.1 kg)
System Dimensions             Height: 25.1 in (639 mm)
                              Width: 10.1 in (256 mm)
                              Length: 20.4 in (518 mm)
Operating Temperature Range   5–35 °C (41–95 °F)