Wiwynn announced a major development in AI data centers at the OCP Global Summit 2024.
- Wiwynn has introduced a cutting-edge suite of AI solutions featuring the NVIDIA GB200 Grace Blackwell Superchip.
- New energy-efficient cooling technologies, including dielectric direct-to-chip liquid cooling, were displayed.
- Wiwynn showcased a powerful rack-level AI system built through partnerships and advanced technological integration.
- The innovations promise significant performance enhancement and energy savings for data centers globally.
Wiwynn made a significant announcement at the OCP Global Summit 2024, introducing a comprehensive suite of AI data center solutions. Central to these innovations is the NVIDIA GB200 Grace Blackwell Superchip, marking a new frontier in AI capabilities for hyperscale data centers.
At the summit, Wiwynn presented state-of-the-art liquid cooling technologies designed to enhance energy efficiency. The inclusion of dielectric direct-to-chip liquid cooling, capable of handling up to 2.5 kW thermal design power (TDP), represents a leap in sustainable data center operations.
Moreover, Wiwynn has developed a liquid-cooled, rack-level AI system in close collaboration with Wistron Corporation. This system includes the NVIDIA GB200 NVL72 platform, notably enhancing training and inference capabilities for complex AI models while significantly reducing total cost of ownership.
Wiwynn is also making strides in AI acceleration with the NVIDIA GB200 NVL72, which connects 72 Blackwell GPUs and 36 Grace CPUs via NVLink™ and NVLink Switch technologies, promising robust AI performance within a single rack.
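To put the scale of that rack in perspective, a rough upper-bound thermal budget can be sketched from the figures above: 72 GPUs plus 36 CPUs, each capped at the 2.5 kW per-chip ceiling that the dielectric direct-to-chip cooling is described as handling. This is an illustrative back-of-the-envelope calculation only; actual per-chip TDPs vary by SKU and will typically sit below that ceiling, and the function name is a placeholder, not a Wiwynn or NVIDIA API.

```python
def rack_thermal_budget_kw(num_gpus: int = 72,
                           num_cpus: int = 36,
                           per_chip_tdp_kw: float = 2.5) -> float:
    """Upper-bound rack cooling load, assuming every chip draws the
    full 2.5 kW ceiling the direct-to-chip cooling is rated for."""
    return (num_gpus + num_cpus) * per_chip_tdp_kw

# Worst-case rack load under these assumptions:
print(f"{rack_thermal_budget_kw():.1f} kW")  # 270.0 kW
```

A figure in the hundreds of kilowatts per rack illustrates why liquid cooling, rather than air, is the centerpiece of these designs.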
Furthermore, Wiwynn unveiled updated AI solutions based on the NVIDIA HGX™ platform, offering options like the GS1400A server and the compact GS1300N, both designed for optimal computing and heat dissipation through Wiwynn’s DLC technology.
These innovations by Wiwynn signal a transformative advance in AI and sustainability for data centers worldwide.