
Chenbro launches NVIDIA MGX server chassis solutions for empowering AI and data center development


With the rapid growth of AI, high-performance computing (HPC), and cloud computing, data centers face increasing challenges regarding performance demands, deployment flexibility, and energy efficiency. To address these issues, NVIDIA has introduced the MGX modular server architecture, which offers unparalleled flexibility and scalability, providing a revolutionary solution for modern data centers and redefining the future of computing.

The MGX modular server architecture, developed by NVIDIA, is designed to accelerate AI and HPC advancements. Centered on GPUs, MGX adopts standardized hardware specifications and flexible configurations. It supports NVIDIA Grace CPUs, x86 architecture, and the latest GPUs, such as the H200 and B200, while integrating advanced networking capabilities with BlueField DPUs. MGX is compatible with 19-inch standard racks, supporting 1U, 2U, and 4U servers, and is further enhanced with compatibility for OCP standards. Beyond supporting multiple generations of GPUs, CPUs, and DPUs, MGX's modular pool includes I/O modules, PCIe cards, add-in cards, and storage modules, enabling over 160 system configurations that meet diverse data center needs, shorten development cycles, and reduce costs.
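As a rough illustration of how such a modular pool multiplies into a large number of system builds, the sketch below enumerates combinations of hypothetical module options; the module names and counts are assumptions for illustration only, not NVIDIA's published MGX catalog.

```python
from itertools import product

# Illustrative (hypothetical) module options for an MGX-style modular pool.
# The real MGX catalog differs; these names and counts are assumptions.
module_options = {
    "chassis": ["1U", "2U", "4U"],
    "cpu":     ["Grace", "x86"],
    "gpu":     ["H200", "B200", "PCIe GPU"],
    "dpu":     ["BlueField-3", "none"],
    "cooling": ["air", "liquid"],
    "storage": ["NVMe bay", "E1.S bay"],
}

# Every combination of one choice per module slot is a candidate system build.
configs = list(product(*module_options.values()))
print(f"{len(configs)} candidate configurations")  # 3*2*3*2*2*2 = 144 in this sketch

# In practice, compatibility rules prune the space (e.g. liquid cooling only in
# certain chassis), which is why vendors quote a validated configuration count.
```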

The modular and highly flexible design of the MGX architecture helps enterprises deploy HPC solutions swiftly, offering diverse scenario-based solutions for cloud service providers (CSPs) and enterprise data centers. First, in AI and generative AI, MGX excels in deep learning, leveraging modular design and multi-GPU parallel computing to accelerate AI model training. It supports applications such as speech recognition, image processing, and large language models (LLMs). Second, in high-performance computing, MGX's multi-GPU parallel processing capabilities empower scientific research, deep learning training, and autonomous driving applications, marking a significant breakthrough in the computing revolution. Lastly, MGX's modular design enables CSPs to deploy cloud servers rapidly in cloud and edge computing applications. Supporting multiple generations of CPUs, GPUs, and DPUs, it delivers comprehensive solutions for cloud services. Moreover, MGX's high performance and compact design for edge computing meet low-latency demands, such as real-time image analysis and decision-making in smart cities and autonomous driving applications.

As an NVIDIA partner, Chenbro is dedicated to promoting the adoption and expansion of the MGX architecture by offering versatile server chassis solutions. Chenbro collaborates with system integrators and server brands to develop customized solutions, ranging from open chassis to JDM/ODM and OEM plus services. Whether for standardized deployments or fully customized server designs, Chenbro ensures each customer receives solutions tailored to market demands, supporting diverse deployments in AI, HPC, and big data.

1U/2U Compute Trays Supporting GB200 NVL72/NVL36 Liquid-Cooled Racks (Single Rack Version)

Chenbro provides 1U and 2U Compute Trays compatible with the GB200 NVL72 and NVL36 server racks, high-density rack-scale solutions designed by NVIDIA. The 1U configuration supports two compute boards per tray, each carrying one Grace CPU and two Blackwell GPUs, a combination known as the GB200 Superchip. The MGX standard rack houses 18 such compute trays, providing 36 Grace CPUs and 72 Blackwell GPUs. In the 2U configuration, nine compute trays per rack provide 18 Grace CPUs and 36 Blackwell GPUs.
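The CPU and GPU totals follow directly from the per-tray layout described above; a minimal sketch of that arithmetic, assuming two GB200 Superchips per tray:

```python
def rack_totals(trays_per_rack: int, superchips_per_tray: int = 2) -> dict:
    """Grace CPU and Blackwell GPU counts for a GB200 rack.

    Assumes each GB200 Superchip pairs one Grace CPU with two Blackwell GPUs,
    and two Superchips (compute boards) per tray, as described above.
    """
    superchips = trays_per_rack * superchips_per_tray
    return {"grace_cpus": superchips, "blackwell_gpus": superchips * 2}

print(rack_totals(18))  # NVL72 with 1U trays: {'grace_cpus': 36, 'blackwell_gpus': 72}
print(rack_totals(9))   # NVL36 with 2U trays: {'grace_cpus': 18, 'blackwell_gpus': 36}
```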

The GB200 NVL72 and NVL36 utilize a "Blind Mate Liquid Cooling Manifold Design" for efficient cooling, ensuring stable operation under prolonged high workloads. Additionally, NVLink technology achieves data transfer speeds of up to 1,800 GB/s, significantly enhancing data processing efficiency. This makes the racks ideal for AI training, cloud computing, and large-scale data processing scenarios, providing robust support for computationally intensive applications such as speech recognition, natural language processing (NLP), and AI inference. With its modular design and exceptional performance density, this rack solution helps enterprises establish the next generation of AI factories.
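To put the 1,800 GB/s figure in perspective, here is a back-of-the-envelope sketch of how quickly a large set of model weights could move at that rate; the model size used is an assumed example, not a measured result.

```python
# Back-of-the-envelope transfer time over NVLink at the quoted 1,800 GB/s.
# The 70B-parameter FP16 model is an illustrative assumption, not a benchmark.
nvlink_bandwidth_gb_s = 1_800                          # GB/s, as quoted above
model_params = 70e9                                    # hypothetical 70B-parameter model
bytes_per_param = 2                                    # FP16 weights
model_size_gb = model_params * bytes_per_param / 1e9   # 140 GB

seconds = model_size_gb / nvlink_bandwidth_gb_s
print(f"~{model_size_gb:.0f} GB of weights moved in ~{seconds * 1000:.0f} ms")  # ~78 ms
```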

2U MGX Chassis: Flexible Configuration and Future-Proof Compatibility for Enterprise-grade Server Chassis Solutions

Focused on AI server applications, Chenbro's 2U enterprise-grade server chassis solutions are built on the MGX architecture. Their modular design and future-proof expandability support GPU, DPU, and CPU upgrades, allowing subsystems to be reused across applications and generations. Designed to fit standard EIA 19-inch racks, the 2U MGX chassis supports traditional PCIe GPU cards in air-cooled configurations that accommodate up to four GPU cards. With modular bays of varying sizes, the chassis enables users to customize accelerated computing servers to meet specific application needs.
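As a sketch of how such a modular 2U build might be described and checked in software, the snippet below models a configuration with an assumed limit of four PCIe GPU cards; the field names and rules are illustrative, not a Chenbro or NVIDIA specification.

```python
from dataclasses import dataclass, field

@dataclass
class MGX2UBuild:
    """Illustrative model of a 2U MGX chassis build; constraints are assumed."""
    gpu_cards: int = 0                        # traditional PCIe GPU cards
    cooling: str = "air"                      # the 2U solution above is air-cooled
    bays: list = field(default_factory=list)  # modular bay fittings, e.g. "NVMe"

    def validate(self) -> None:
        if not 0 <= self.gpu_cards <= 4:
            raise ValueError("this sketch assumes at most four PCIe GPU cards")
        if self.cooling != "air":
            raise ValueError("this sketch assumes an air-cooled 2U build")

build = MGX2UBuild(gpu_cards=4, bays=["NVMe", "NVMe"])
build.validate()  # passes; an out-of-range build raises ValueError
```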

With MGX's flexibility and scalability, Chenbro collaborates with system integrators and server brands to develop customized solutions for AI training, large dataset processing, and other high-performance applications. These tailored AI server solutions meet diverse cross-industry AI requirements.

4U MGX Air-Cooled and Liquid-Cooled Chassis Solutions to Meet Future Enterprise Data Center Needs

Developed in collaboration with NVIDIA, Chenbro's 4U MGX air-cooled chassis solution is designed for AI training and HPC applications. It supports up to eight double-width GPGPU cards or NVIDIA H200 GPUs and features five front-mounted and five mid-mounted 80x80 fan brackets for efficient cooling. The air-cooled 4U MGX is compatible with standard EIA 19-inch racks.

The 4U MGX liquid-cooled chassis, on the other hand, supports up to 16 liquid-cooled single-slot GPUs. Utilizing liquid cooling manifold technology, it distributes coolant efficiently to GPU assemblies, motherboards, and switchboards, ensuring superior thermal management and energy efficiency for high-density, long-duration operations. The liquid-cooled 4U MGX chassis must operate within MGX standard racks. Its design suits enterprise scenarios requiring scientific computation, big data processing, AI training, and HPC.
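For a rough sense of how the two 4U options compare in GPU density, the sketch below assumes a standard rack with about 40U usable for compute chassis; this is an illustrative assumption, since real deployments are bounded by power and cooling as much as by rack space.

```python
# Rough GPU-density comparison of the two 4U MGX options described above.
# Usable rack height is an assumption; real deployments also reserve space
# (and, more importantly, power and cooling budget) for switches, PDUs, CDUs.
USABLE_RACK_UNITS = 40
CHASSIS_UNITS = 4

def gpus_per_rack(gpus_per_chassis: int) -> int:
    return (USABLE_RACK_UNITS // CHASSIS_UNITS) * gpus_per_chassis

print("air-cooled 4U MGX   :", gpus_per_rack(8))    # 10 chassis x 8  = 80 GPUs
print("liquid-cooled 4U MGX:", gpus_per_rack(16))   # 10 chassis x 16 = 160 GPUs
```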

Unlike the high-density GPU solutions of the NVL72 and NVL36 racks, the 4U MGX adopts a traditional x86 CPU architecture from Intel and AMD, targeting enterprise users rather than large CSP scenarios. Both air-cooled and liquid-cooled 4U MGX solutions give system integrators greater flexibility to design proprietary MGX-compliant motherboards. Chenbro works with clients to deliver tailored server chassis solutions that meet enterprise demands flexibly and efficiently.

NVIDIA MGX Partner: Chenbro Driving the Evolution of AI and Data Centers

The NVIDIA MGX modular server architecture, with its exceptional flexibility, performance, and broad applicability, has become a pivotal milestone in the evolution of data center technology. As a partner, Chenbro actively engages in chassis design and production to promote the adoption and realization of this architecture, offering faster and more flexible server solutions. From GB200 NVL72/NVL36-compatible 1U/2U Compute Trays to 2U and 4U MGX chassis solutions, Chenbro's standard products and customized solutions not only meet current market demands but also lay a solid foundation for future AI server applications.

Looking ahead, the MGX architecture will continue to lead advancements in data center technology. With the rapid development of AI, 5G, and edge computing, its application range will expand further, driving diverse data center solutions. Chenbro is building an ecosystem around the architecture and will continue collaborating with NVIDIA and global clients to introduce next-generation AI server products. For more information about MGX products, please get in touch with Chenbro's sales representatives or visit the official website.
