
Liquid Cooling is Coming to a Data Center Near You

By Steve Altizer, President – Compu Dynamics, LLC

The future is finally here. Applications such as AI are changing the game when it comes to servers and the power they consume. With the increased power density, racks are generating more heat and putting greater pressure on cooling systems to ensure efficient and effective operations.  

What our industry is saying

Recently I attended the DICE conference and, no kidding, less than two minutes after picking up my name tag I heard the phrase “liquid cooling.” Over the next day and a half, those two words came up about every minute or two from other attendees and presenters.

Our industry has been patiently waiting for years to see a noticeable adoption of liquid cooling. That time may have come. We seem to be close to the precipice of a major shift in server, and thus, data center, cooling.

The shift to liquid cooling is happening faster than a Tesla Plaid hits 60 mph, and it all appears to be due to AI. Experts are saying that AI workloads will make the Cloud look like a 1980s-era Intel 286 PC. “AI will have 10x the impact of the Cloud on data centers,” one colocation operator said recently. Wait, what?

Air cooling systems have met their match

Since the birth of the commercial internet, air cooling has been the standard. While technologies evolve, the basic concept of air cooling has remained the same. That is until now. This new paradigm shift, caused by AI, has the potential to put air cooling in the Smithsonian.

AI is raising power consumption from the 200-400 W/SF we see today to upwards of 2,000 W/SF. Air cooling has kept up with technological advances to this point, but rejecting the heat from a room full of 50 kW-100 kW IT cabinets is too heavy a lift. Fortunately for us, an eager group of liquid cooling pioneers has been waiting patiently for this very moment.
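The jump from today's W/SF figures to tomorrow's is really just per-rack power spread over floor area. A quick back-of-the-envelope sketch, assuming a hypothetical ~25 square feet of gross floor area per cabinet (footprint plus a share of aisle space):

```python
# Back-of-the-envelope: rack power to floor power density.
# The 25 SF per cabinet figure is an illustrative assumption,
# not a design standard.

SQ_FT_PER_CABINET = 25

def floor_density_w_per_sf(rack_kw: float,
                           sf_per_cabinet: float = SQ_FT_PER_CABINET) -> float:
    """Convert per-rack power (kW) to floor power density (W/SF)."""
    return rack_kw * 1000 / sf_per_cabinet

for rack_kw in (5, 10, 50, 100):
    print(f"{rack_kw:>3} kW rack -> {floor_density_w_per_sf(rack_kw):>6.0f} W/SF")
```

Under that assumption, 5-10 kW racks land right in the familiar 200-400 W/SF range, while a 50 kW AI cabinet pushes 2,000 W/SF.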

What is liquid cooling?

Liquid cooling uses a liquid, such as water, mineral oil, or a refrigerant, rather than air alone, to remove heat from data centers. Some of these technologies bring the cooling medium closer to the heat source, requiring less, if any, fan power. A liquid like water can absorb roughly 3,000 times as much heat per unit volume as air and requires less energy to move it, allowing increased data center densities.
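That "3,000 times" figure comes from comparing volumetric heat capacity, i.e., density times specific heat. A quick sanity check with approximate room-temperature property values:

```python
# Volumetric heat capacity: water vs. air (approximate values at ~25 C).
# density (kg/m^3) * specific heat (J/(kg K)) = heat absorbed per m^3
# per degree of temperature rise.

water = 997 * 4186      # ~4.2e6 J/(m^3 K)
air = 1.184 * 1005      # ~1.2e3 J/(m^3 K)

ratio = water / air
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

The ratio works out to roughly 3,500x, which is why a small pipe of water can do the work of a very large volume of moving air.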

At present, there are three categories of liquid cooling:

Direct-to-chip cooling, sometimes referred to as direct-to-plate cooling, integrates the cooling system directly into the computer’s chassis. The cool liquid is piped to cold plates which sit directly atop components such as CPUs, GPUs, or memory cards. Small tubes carry the cool liquid to each plate, where the liquid draws off the heat from the underlying components. The liquid is then circulated to a heat rejection device or heat exchanger. After the heat is removed from the fluid, it is then circulated back to the cold plates.
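The sizing math behind such a loop is the basic heat balance Q = m·cp·ΔT: the flow rate needed to carry a heat load at a chosen coolant temperature rise. A sketch with illustrative numbers (not any vendor's spec):

```python
# Coolant flow needed to absorb a heat load, from Q = m_dot * cp * dT.
# The 50 kW load and 10 K temperature rise are illustrative assumptions.

CP_WATER = 4186      # specific heat of water, J/(kg K)
RHO_WATER = 997      # density of water, kg/m^3

def flow_lpm(heat_w: float, delta_t_k: float) -> float:
    """Liters per minute of water needed to absorb heat_w watts
    while warming by delta_t_k kelvin."""
    mass_flow = heat_w / (CP_WATER * delta_t_k)   # kg/s
    return mass_flow / RHO_WATER * 1000 * 60      # L/min

# A 50 kW cabinet with a 10 K coolant rise:
print(f"{flow_lpm(50_000, 10):.0f} L/min")
```

Roughly 70 liters per minute handles an entire 50 kW cabinet, a flow you can push through small tubing.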

Rear-door heat exchangers apply a similar concept at the rack level. A heat exchanger is mounted on the back of the rack in place of its back door. Server fans blow the warm air through the cooling coil, which absorbs the heat. Alternatively, some “active” rear doors are equipped with fans that draw air through the door’s cooling coil. The liquid circulates through a closed loop that carries the heat to a heat exchanger or out of the building.
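The same kind of heat-balance arithmetic, applied to air instead of water, shows how much airflow a rack's fans have to push through a rear-door coil, and why pure air cooling runs out of headroom at high densities. Figures here are illustrative assumptions:

```python
# Airflow needed to remove a heat load, from Q = V_dot * rho * cp * dT.
# The 50 kW load and 15 K air temperature rise are illustrative assumptions.

RHO_AIR = 1.184      # density of air, kg/m^3 at ~25 C
CP_AIR = 1005        # specific heat of air, J/(kg K)

def airflow_cfm(heat_w: float, delta_t_k: float) -> float:
    """Cubic feet per minute of air needed to absorb heat_w watts
    with a delta_t_k temperature rise across the rack."""
    vol_flow = heat_w / (RHO_AIR * CP_AIR * delta_t_k)   # m^3/s
    return vol_flow * 35.3147 * 60                       # CFM

# A 50 kW cabinet with a 15 K air-side rise:
print(f"{airflow_cfm(50_000, 15):.0f} CFM")
```

Call it roughly 6,000 CFM for one cabinet, versus tens of liters per minute of water for the same load; that contrast is the whole case for liquid.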

Immersion cooling, the newest technology, submerges (“immerses”) internal server components in a nonconductive dielectric fluid. The components and fluid are encased in a sealed, tub-like container to prevent leakage. Heat from the IT components transfers directly to the coolant, a process that requires less energy than other approaches.

Where do we go from here?

Here’s a big question facing the industry: can current colocation facility designs adapt to such a significant shift in power density?

The answer is both maybe and maybe not. Consider this: when a legacy data hall is occupied by a few hundred 17.2 kW cabinets, you can almost play a pick-up game of football in the empty space. AI cabinets are likely to run even hotter, in the 50-100 kW per rack range. Today’s colocation providers are probably looking at every potential liquid cooling solution to accommodate tomorrow’s AI-oriented customers. If they don’t, the big GPU-centric users may be forced to build their own facilities.

One of the features we love about this industry is the constant change. In the past, we’ve been able to catch our breath as we adapted to those changes. This time, we might not have that luxury.

I’m proud to say that at Compu Dynamics, we’ve been involved with liquid cooling since 2016. Our team of dedicated industry professionals has met with several of the leading manufacturers of all three forms of liquid cooling so we can provide the best advice and service to our clients. We look forward to partnering with AI and other high-performance computing end users to help them determine which solution best fits their current IT requirements while taking into consideration tomorrow’s ever-evolving demands.

