Rack densities, AI workloads and energy costs keep climbing. Air cooling works well in some situations, but high-performance computing equipment requires liquid cooling. Data centers typically offer a mix of cooling technologies. Here’s where things stand with liquid cooling.
According to Zayo, data center network bandwidth purchasing surged by 330% in 2024, driven largely by AI workloads in hyperscale data centers. Meeting the data-transfer capacity that AI needs means leveraging hybrid IT infrastructure that delivers low-latency performance while balancing power efficiency, geographic flexibility and seamless connectivity.
Liquid cooling technology isn’t new, but it’s quickly becoming a go-to heat management solution. High-density, compute-intensive workloads such as AI model training and inference throw off a lot of heat that needs to be managed efficiently. And this often requires liquid, not air, cooling.
Our previous blog on liquid cooling cites a 2024 State of the Data Center report that estimated average rack density at 12 kW per rack. Things have changed quickly in a year! A June 2025 Forbes article notes that servers using GPUs (along with CPUs and DPUs optimized for AI model training) can consume more than 20 times the power of standard Intel-based CPU cloud servers – and output 20 times more heat per server.
The latest NVIDIA-based GPU servers require 132 kW of power per rack. The next generation, expected in under a year, will require 240 kW per rack, according to Steven Carlini, Vice President of Innovation and Data Center, Energy Management Business Unit at Schneider Electric, writing in Forbes.2
Grand View Research forecasts growth in the U.S. data center liquid cooling market at a CAGR of 21.6% from 2025 to 2030.1
We’re seeing a ripple effect from higher densities, and data centers are adjusting their cooling strategies accordingly. For good reason. Cooling is a significant factor in a data center’s operating costs, accounting for approximately 40% of a data center’s energy consumption.3 While liquid cooling can help drive greater overall efficiency, keep in mind that air cooling will always be part of the picture, because ancillary equipment (UPSs, switchgear, PDUs, etc.) will always need to be cooled.
Unlike air cooling, which relies on air conditioners or air handlers to circulate chilled air, liquid cooling solutions use liquid to absorb and move heat away from computing equipment.
Commonly used fluids include water, glycol mixes and dielectric fluid. The choice of liquid depends on several factors, including the environment to be cooled, the heat load and the liquid cooling technology – more on this below. In brief: water offers high heat capacity and thermal conductivity but is electrically conductive; glycol mixes add freeze protection at some cost to heat capacity; and dielectric fluids are electrically non-conductive, so they can safely contact electronics directly.
Liquid cooling is typically implemented as a recirculating system or an immersion system. Recirculating systems work well for low-to-medium rack density, and immersion systems for high rack density. Recirculating systems such as direct-to-chip (DTC) cooling and rear door heat exchangers (RDHX) are closed-loop solutions in which the fluid absorbs heat and moves it away from IT equipment.
In the DTC method, highly conductive metal plates, known as cold plates, are attached to CPUs and GPUs. In single-phase DTC solutions, the coolant (water or another liquid) flows through the plates, absorbs heat and moves it to a heat exchanger or cooling distribution unit, where the heat is released and the coolant recirculates. In two-phase solutions, the coolant in the cold plates heats up until it vaporizes; the vapor carries the heat to a condenser, where it turns back into liquid and recirculates.
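The single-phase loop described above follows a simple steady-state heat balance, Q = ṁ × cp × ΔT: the hotter the load and the smaller the allowed coolant temperature rise, the more flow you need. Here is a minimal sketch; the rack power and temperature rise are illustrative assumptions, not vendor specifications.

```python
# Estimate the coolant mass flow rate needed to carry away a rack's heat load
# using the steady-state heat balance Q = m_dot * cp * delta_T.
# All numbers below are illustrative assumptions, not vendor specs.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_kg_s(heat_load_w: float, delta_t_k: float, cp: float = CP_WATER) -> float:
    """Mass flow (kg/s) needed to absorb heat_load_w with a delta_t_k coolant temperature rise."""
    return heat_load_w / (cp * delta_t_k)

# Example: a 132 kW rack cooled with a 10 K coolant temperature rise.
flow = required_flow_kg_s(132_000, 10)
print(f"{flow:.2f} kg/s")  # roughly 3.15 kg/s (about 3.15 L/s for water)
```

The same formula explains why water is attractive as a coolant: its high specific heat keeps the required flow rate modest even at triple-digit-kW rack loads.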
Immersion cooling is just what it sounds like: servers are immersed in special tanks of non-conductive fluid (dielectric fluid is common), which absorbs heat. The fluid can be single-phase, which retains its liquid form and is pumped to a heat exchanger, or two-phase, which turns into vapor, condenses and returns to the tank. Single-phase systems handle lower rack densities, while two-phase systems are used when rack density reaches 50 kW or more. Servers are pulled out of the tanks for maintenance. As an aside, immersion cooling is popular in the Asia-Pacific region but not yet prevalent in the U.S.
Rear door heat exchangers work like radiators mounted on the back of a server rack. The hot air from the servers passes through the RDHX, in which a chilled coolant (usually water) absorbs and removes heat, enabling cooled air to leave the servers. The cooled air that exits into the room helps lower the use of air cooling. The warmed coolant is pumped to a cooling system like a chiller, and the cycle begins again. Exchangers are fairly easy to install, so they can be added incrementally.
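The heat an RDHX removes from the air stream can be approximated with the air-side energy balance, Q = airflow × air density × cp × temperature drop. A minimal sketch with illustrative figures (the airflow and temperature drop below are invented for the example, not product data):

```python
# Approximate air-side heat pickup through a rear door heat exchanger:
# Q = volumetric_flow * air_density * cp_air * temperature_drop.
# Figures below are illustrative assumptions, not product data.

AIR_DENSITY = 1.2  # kg/m^3 at roughly room conditions
CP_AIR = 1005.0    # J/(kg*K)

def rdhx_heat_removed_w(airflow_m3_s: float, delta_t_k: float) -> float:
    """Heat (W) removed when server exhaust air is cooled by delta_t_k across the door."""
    return airflow_m3_s * AIR_DENSITY * CP_AIR * delta_t_k

# Example: 2 m^3/s of exhaust air cooled by 12 K across the door.
print(f"{rdhx_heat_removed_w(2.0, 12.0) / 1000:.1f} kW")  # about 28.9 kW
```

Because air carries far less heat per unit volume than water, moving the same load with air alone takes enormous airflow, which is exactly why the water loop behind the door does the heavy lifting.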
Air cooling and liquid cooling together can deliver the best of both worlds. The hybrid approach is popular because data centers can apply the most efficient technology to each rack's density and heat load: air-cooled systems are designed for rack densities up to approximately 15 kW, while liquid cooling systems are effective for rack densities up to 200+ kW.
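The density ranges mentioned in this article (air up to roughly 15 kW, two-phase immersion at 50 kW and above) can be sketched as a simple selection rule. The exact cutoffs are judgment calls that vary by facility and vendor; the thresholds below merely restate the rough figures discussed here.

```python
# Sketch of a hybrid cooling selection rule based on rack density (kW).
# Thresholds follow the rough figures in this article; real designs
# depend on facility constraints and vendor guidance.

def cooling_strategy(rack_kw: float) -> str:
    if rack_kw <= 15:
        return "air"                   # traditional air cooling
    elif rack_kw < 50:
        return "liquid (DTC or RDHX)"  # recirculating closed-loop systems
    else:
        return "liquid (immersion)"    # two-phase immersion for 50 kW+

for kw in (10, 30, 132):
    print(kw, "kW ->", cooling_strategy(kw))
```

In practice a single data hall runs several of these regimes side by side, which is the essence of the hybrid approach.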
In cooling solutions that use sensors, AI can analyze telemetry in real time to improve outcomes.
AI’s monitoring capabilities and alerts augment the redundancies built into cooling systems. Even a minor glitch can cause a waterfall of problems such as overheating, mechanical failure, data loss, energy usage spikes, unexpected repair bills and more. Experts design and build cooling systems with these issues in mind, but AI on the job can boost confidence.
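As a toy illustration of the kind of real-time monitoring described above, a rolling-baseline check can flag coolant temperature readings that drift sharply from recent history. The sensor values and threshold here are invented for the example; production systems use far more sophisticated models.

```python
# Toy monitoring sketch: flag coolant temperature readings that deviate
# sharply from a rolling baseline. Readings and threshold are invented
# for illustration only.
from collections import deque
from statistics import mean

def detect_spikes(readings, window=5, threshold_c=2.0):
    """Return indices of readings exceeding the rolling mean by more than threshold_c."""
    baseline = deque(maxlen=window)
    alerts = []
    for i, temp in enumerate(readings):
        if len(baseline) == window and temp - mean(baseline) > threshold_c:
            alerts.append(i)
        baseline.append(temp)
    return alerts

# Example: steady supply temperatures with one sudden excursion.
temps = [18.0, 18.1, 17.9, 18.2, 18.0, 18.1, 23.5, 18.2]
print(detect_spikes(temps))  # [6]
```

Catching the excursion at index 6 early is what turns a potential overheating cascade into a routine maintenance ticket.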
High-performance computing workloads like AI models drive higher densities, which create more heat – and densities are moving in only one direction: up. Air cooling remains a key part of data center cooling, but as rack densities rise, so will the use of liquid cooling. When you evaluate colocation data centers, ask about current and future cooling roadmaps so you understand the provider's strategy and capabilities.
Know More
To learn more, visit our Knowledge Base, which includes informative videos, infographics, articles and more covering energy, economic impact and tech innovation in the data center industry.
You can also watch a video about liquid cooling on the CoreSite YouTube channel.
When you are ready to talk to a CoreSite representative, contact us. We would welcome the opportunity to discuss our approach and experience implementing liquid cooling for high-density workloads.
References
1. Data Center Liquid Cooling Market Summary, Grand View Research
2. Why Liquid Cooling for AI Data Centers Is Harder Than It Looks, Forbes
3. Chill Factor: NVIDIA Blackwell Platform Boosts Water Efficiency by Over 300x, NVIDIA
4. Direct-To-Chip Liquid Cooling, HDR
5. Comparing Data Center Cooling Options, Lubrizol