▲Supermicro CEO Charles Liang shares the keynote stage with NVIDIA CEO Jensen Huang at Computex 2024.
Blackwell-based liquid-cooled AI SuperCluster unveiled
Supermicro unveiled its lineup of liquid-cooled Superclusters optimized for NVIDIA AI enterprise platforms at Computex 2024.
In his keynote, Charles Liang, founder and CEO of Supermicro, introduced the liquid-cooled SuperCluster lineup and highlighted the benefits that liquid cooling provides.
At Computex 2024, Supermicro announced upcoming NVIDIA Blackwell GPU-based systems, including 10U air-cooled and 4U liquid-cooled NVIDIA HGX B200-based systems.
Supermicro will also offer an 8U air-cooled NVIDIA HGX B100 system, the NVIDIA GB200 NVL72 rack with 72 GPUs interconnected via NVIDIA NVLink switches, and the newly announced NVIDIA MGX systems supporting NVIDIA H200 NVL PCIe GPUs.
Supermicro's current AI SuperCluster offerings comprise three products: the 4U liquid-cooled NVIDIA HGX H100/H200 SuperCluster, the 8U air-cooled NVIDIA HGX H100/H200 SuperCluster, and the 1U air-cooled NVIDIA MGX GH200 SuperCluster.
In addition, Supermicro previewed three next-generation SuperCluster products as its upcoming lineup: the liquid-cooled NVIDIA HGX B200 SuperCluster, the air-cooled NVIDIA HGX B100/B200 SuperCluster, and the liquid-cooled NVIDIA GB200 NVL72 or NVL36 SuperCluster.
Supermicro described its SuperCluster solutions as optimized for LLM training, deep learning, and high-volume, large-batch-size inference.
Supermicro emphasized that, as demand grows to simplify AI infrastructure and make it accessible cost-effectively, its cloud-based AI SuperClusters bridge the gap between the instant accessibility of the cloud and the portability of on-premises deployment, enabling AI projects to move seamlessly from pilot to production.
As the industry rapidly experiments with generative AI use cases, Supermicro explained that its collaboration with NVIDIA demonstrates a flexible transition from AI application pilots to production deployments and large-scale data center AI.