As AI evolves at record speed, the need for high-performance data centres is surging. But keeping these massive systems cool and efficient has become a growing challenge. Enter Nvidia, which is transforming data centre infrastructure with its latest liquid cooling technology—a solution that saves both money and water.
Nvidia’s GB200 and GB300 NVL72 systems, showcased at GTC 2025, offer a smarter way to handle the massive heat generated by modern AI workloads. In fact, the company says these liquid-cooled setups are 300 times more water-efficient than traditional air cooling. That’s a game-changer for hyperscale data centres, which now face power demands as high as 135 kilowatts (kW) per rack, up from just 20 kW only a few years ago.
For data centres running at 50 megawatts (MW), Nvidia claims its liquid cooling solution can reduce costs by up to $4 million per year. These savings come not just from lower energy use, but also from reduced dependence on mechanical chillers and less water waste.
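For a rough sense of where savings on that scale can come from, the back-of-envelope sketch below compares a chiller-based plant with a warm-water liquid-cooled one at a 50 MW IT load. The overhead fractions and the electricity price are illustrative assumptions, not figures published by Nvidia.

```python
# Rough, illustrative estimate of annual cooling-energy savings at a 50 MW
# facility. The overhead fractions and the electricity price are assumptions
# made for this sketch, not figures published by Nvidia.

IT_LOAD_MW = 50.0                  # facility IT load, from the article
HOURS_PER_YEAR = 8760

AIR_COOLED_OVERHEAD = 0.30         # assumed cooling power per watt of IT load, chiller plant
LIQUID_COOLED_OVERHEAD = 0.10      # assumed overhead, warm-water liquid cooling + dry coolers
ELECTRICITY_USD_PER_KWH = 0.05     # assumed industrial electricity rate

saved_mw = IT_LOAD_MW * (AIR_COOLED_OVERHEAD - LIQUID_COOLED_OVERHEAD)
saved_kwh = saved_mw * 1000 * HOURS_PER_YEAR
savings_usd = saved_kwh * ELECTRICITY_USD_PER_KWH

print(f"Cooling power avoided: {saved_mw:.1f} MW")
print(f"Annual energy saved:   {saved_kwh / 1e6:.1f} GWh")
print(f"Annual cost savings:   ${savings_usd / 1e6:.2f}M")
```

With these assumed numbers the savings land in the low millions of dollars per year, the same order as the figure Nvidia cites; the exact outcome depends heavily on local climate, electricity prices, and plant design.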
How Nvidia’s Liquid Cooling Changes the Game
To keep AI systems running, data centres must manage enormous amounts of heat. Traditional cooling methods are starting to fall short, especially as compute density rises. Nvidia’s direct-to-chip liquid cooling is designed to solve this problem by targeting heat at the source.
With its GB200 NVL72 system, Nvidia offers:
- 40x higher revenue potential
- 30x greater throughput
- 25x better energy efficiency
- 300x better water efficiency than air-cooled setups
This is possible because the liquid cooling loop carries heat away from the chips far more efficiently than air, even when the facility supplies relatively warm water. That reduces the need for energy-hungry chillers and delivers effective cooling with a smaller environmental footprint.
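The physics behind this is the sensible-heat balance Q = ṁ · c_p · ΔT: per unit volume, water holds roughly 3,500 times more heat than air for the same temperature rise, so a modest water flow replaces an enormous volume of airflow. The sketch below works this out for a 135 kW rack; the temperature rises are assumed values for illustration, not Nvidia figures.

```python
# Minimal sketch of the sensible-heat balance Q = m_dot * c_p * dT for a
# 135 kW rack (the rack load comes from the article). The temperature rises
# (dT) below are assumptions chosen for illustration.

RACK_LOAD_W = 135_000          # heat to remove, W

# Water loop (direct-to-chip liquid cooling)
CP_WATER = 4186.0              # specific heat of water, J/(kg*K)
DT_WATER = 10.0                # K, assumed coolant temperature rise
water_kg_s = RACK_LOAD_W / (CP_WATER * DT_WATER)   # 1 kg of water ~= 1 litre

# Air cooling, for comparison
CP_AIR = 1005.0                # specific heat of air, J/(kg*K)
RHO_AIR = 1.2                  # air density, kg/m^3
DT_AIR = 15.0                  # K, assumed air temperature rise
air_kg_s = RACK_LOAD_W / (CP_AIR * DT_AIR)
air_m3_s = air_kg_s / RHO_AIR

print(f"Water flow needed: ~{water_kg_s:.1f} L/s")
print(f"Airflow needed:    ~{air_m3_s:.1f} m^3/s")
```

Because so little liquid flow is needed, the supply water can be comparatively warm (warm-water designs commonly run in the 30 to 45°C range) and still hold chip temperatures in spec, which is what lets facilities lean on dry coolers or cooling towers instead of chillers.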
Nvidia identifies four main types of heat rejection used in modern data centres:
- Mechanical chillers: These systems use vapor compression to cool water. While reliable, they consume large amounts of energy, increasing both costs and carbon emissions.
- Evaporative cooling: These designs cool through water evaporation and can be more energy-efficient than chillers. However, they consume vast amounts of water, on the order of millions of gallons per MW annually (see the rough check after this list), making them a poor fit for water-scarce regions and less effective in humid climates.
- Dry coolers: These systems reject heat into ambient air without using water. Though eco-friendly, their effectiveness drops in hot weather unless paired with hardware that tolerates higher operating temperatures.
- Pumped refrigerants: These setups pump liquid refrigerant through the loop instead of relying on compressors. They’re well suited to edge deployments and areas with water limits, offering both water and power savings, provided refrigerant use is properly managed.
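As a rough check on the evaporative-cooling water figure above: if every watt of heat were rejected purely by evaporating water, the latent heat of vaporisation alone fixes the consumption. Real cooling towers also lose water to blowdown and drift, so this idealised estimate is a lower bound.

```python
# Sanity check of the "millions of gallons per MW annually" figure for
# evaporative cooling, assuming all heat leaves as latent heat of evaporation
# (an idealised assumption; real towers also lose water to blowdown and drift).

HEAT_MW = 1.0
SECONDS_PER_YEAR = 3.1536e7
LATENT_HEAT_J_PER_KG = 2.43e6      # latent heat of vaporisation of water near 30 C
LITRES_PER_US_GALLON = 3.785

heat_joules = HEAT_MW * 1e6 * SECONDS_PER_YEAR
water_litres = heat_joules / LATENT_HEAT_J_PER_KG   # 1 kg of water ~= 1 litre
water_gallons = water_litres / LITRES_PER_US_GALLON

print(f"Water evaporated per MW-year: ~{water_gallons / 1e6:.1f} million gallons")
```

That works out to roughly 3 to 4 million gallons per MW per year, consistent with the order of magnitude quoted above.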
Nvidia’s liquid cooling delivers the best of all worlds—high efficiency, low water use, and reduced energy costs. This makes it a strong choice for AI-driven data centres under pressure to meet performance goals while cutting environmental impact.
Nvidia Leads AI Infrastructure Innovation
At GTC 2025, Nvidia’s CEO Jensen Huang introduced a full suite of technologies built to support the AI revolution. “AI has made a giant leap,” he said. “Reasoning and agentic AI demand orders of magnitude more computing performance.” To meet that demand, Nvidia is aligning its chips, cooling systems, and data centre designs into one powerful platform.
Its industry partnerships are key to this transformation:
- Vertiv developed a reference architecture that cuts energy use by 25%, reduces rack space by 75%, and lowers the power footprint by 30%.
- Schneider Electric supports racks up to 132 kW using liquid cooling, improving both scalability and performance.
- CoolIT Systems offers high-density coolant distribution units that deliver 2 MW of cooling at just a 5°C approach temperature, a strong match for Nvidia’s new GB300 systems (see the sketch after this list).
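The approach temperature is the gap between the coolant a CDU supplies to the racks and the facility water it receives, so a small approach lets the CDU ride on relatively warm facility water while still meeting chip temperature limits. The sketch below shows what a 2 MW unit at a 5°C approach implies; the facility supply temperature and the secondary-loop temperature rise are assumed values, not CoolIT specifications.

```python
# Sketch of what a 2 MW coolant distribution unit (CDU) at a 5 C approach
# temperature implies. The facility-water supply temperature and the
# secondary-loop temperature rise are assumptions, not CoolIT specifications.

CDU_CAPACITY_W = 2_000_000
APPROACH_K = 5.0               # coolant supply minus facility supply, from the article
FACILITY_SUPPLY_C = 40.0       # assumed warm-water facility loop temperature
CP_WATER = 4186.0              # specific heat of water, J/(kg*K)
LOOP_DT_K = 10.0               # assumed secondary-loop temperature rise

coolant_supply_c = FACILITY_SUPPLY_C + APPROACH_K
flow_kg_s = CDU_CAPACITY_W / (CP_WATER * LOOP_DT_K)  # 1 kg of water ~= 1 litre

print(f"Coolant supplied to racks at ~{coolant_supply_c:.0f} C")
print(f"Secondary-loop flow for 2 MW: ~{flow_kg_s:.0f} L/s")
```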
Beyond these hardware solutions, Nvidia is helping shape the future through the COOLERCHIPS programme, a U.S. Department of Energy-backed initiative. It focuses on modular, liquid-cooled data centres that could slash costs by 5% and improve efficiency by 20% over older systems.
And Nvidia isn’t stopping there. The company is shifting AI supercomputer manufacturing to the U.S. with help from partners like TSMC and Foxconn. This move strengthens supply chains and pushes innovation even further, ensuring Nvidia stays ahead in the AI race.
As demand grows, Nvidia’s liquid cooling tech offers a path forward—one that balances high performance with sustainability. By reducing water and power use, these systems pave the way for a new generation of AI-ready data centres built for both speed and resilience.