Can Old Data Centers Learn New Tricks?

As artificial intelligence (AI) has become increasingly important, there is a growing need to dedicate data center space to training and hosting new, highly demanding AI applications. But these applications aren’t like those traditionally run in “production” data centers. They require more power. They take up less floor space. They require higher rack densities. And they need more effective and efficient cooling.

The AI data centers of tomorrow are nothing like today’s “production” data centers. Are data center owners and operators capable of designing these new data centers to power their AI applications? Can their existing data centers be retrofitted to adequately meet these needs? And is help available if they’ve answered “no” to either of these questions?

We recently sat down with Steve Altizer, President of White Space Integration firm Compu Dynamics, to discuss what it takes to make “production” data centers power AI solutions, and why White Space Integration partners become essential as data centers grow more complex.

The Modern Data Center Journal (MDCJ): We’ve previously talked about the difference between an AI data center and a traditional “production” data center. But what are some of the largest differences that our readers need to know about?

Steve Altizer: By most accounts, AI data centers will demand uncommonly dense power delivery systems, radically new liquid cooling systems, and enhanced network infrastructure. They will also demand a fraction of the physical floor space of a traditional data center tenant.

This is the cost of progress. AI will bring enormous value to the world but take its toll by forcing developers, engineers, and contractors to rethink how data centers are deployed.

In the short run, users will attempt to use the capacity that is available. Buildings that are under construction today may end up being redesigned on the fly or modified – when possible – to accommodate AI customers. However, new designs that are created specifically for AI workloads will soon emerge.

One example of this is CyrusOne’s IntelliScale data center. None of these facilities have been built to our knowledge, but it’s clear that CyrusOne is attempting to step out in front of its competitors by catering to extremely dense IT environments.

MDCJ: What about from a construction standpoint? What differences are we seeing when it comes to construction?

Steve Altizer: The White Space Integration – the creation of the data halls that will house the data center equipment – is becoming far more complex than ever before. It’s no longer simply installing busways, or remote power panels (RPPs) and whips. The liquid cooling necessary for modern AI data centers requires that chilled water be piped directly into the data hall.

It is not an option to simply allow the General Contractor’s electrical subcontractor to customize the data hall for an AI end user. The amount of coordination and experience is massively different. The individuals handling the White Space Integration must be familiar with mechanical systems, as well as electrical systems.

This is why we’re seeing a drastic increase in the importance of a White Space Integration partner. These firms understand the requirements. They know which materials and equipment to use. They have the knowledge and experience to handle these more complex projects. That means the data center owner and operator won’t have to worry about leaky pipes in the data hall spraying cool water all over their equipment.

MDCJ: These new AI data centers are clearly much more complex. So, how heavy of a lift would it be to convert from a “production” data center with chilled air to an AI data center that leverages liquid cooling?

Steve Altizer: First, it’s important to note that chilled air cooling has different flavors. And, depending on which flavor of chilled air they’re using, the switch to liquid cooling could be easier or more difficult. The difference lies in the fluid used to chill the air – or remove the heat – from the inside of a building to the outside.

When modern data centers started popping up 10-15 years ago, refrigerant-based Computer Room Air Conditioner (CRAC) systems were the predominant cooling method. As energy performance grew in importance and customers began to consider a facility’s Power Usage Effectiveness (PUE), engineers started incorporating more efficient solutions into their designs, utilizing liquid refrigerant and chilled water.
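
For readers less familiar with the metric, PUE is simply total facility power divided by IT equipment power, so lower is better and 1.0 is the theoretical floor. Below is a minimal sketch of that arithmetic in Python; the load figures are hypothetical, chosen only to contrast a legacy CRAC-cooled site with a newer chilled water design.

```python
# Illustrative PUE (Power Usage Effectiveness) calculation.
# PUE = total facility power / IT equipment power; a value of 1.0
# would mean every watt entering the facility reaches the IT gear.
# The numbers below are hypothetical, chosen only to show the math.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for a given snapshot of facility load."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical legacy CRAC site vs. a newer chilled water design:
print(pue(total_facility_kw=1800, it_equipment_kw=1000))  # 1.8
print(pue(total_facility_kw=1300, it_equipment_kw=1000))  # 1.3
```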

Today, some of the country’s premier data centers are cooled with refrigerant-based systems. However, the majority of newer sites use air-cooled water chillers, and that is good news for retrofits: today’s chilled water designs are far more adaptable when fluid must be delivered out onto the data hall floor to the IT cabinet, and the heat transfer system and exterior equipment can remain largely intact.

So, data centers built with chilled water plants will be easier to retrofit than those built with refrigerant systems alone. Buildings that rely exclusively on refrigerant-based CRAC systems will be far more expensive to retrofit because an entirely new chilled water plant and distribution system must be created, and that may be cost-prohibitive.

If a data center already has the requisite chilled water plant and piping loop, then it will be relatively easy to extend that loop into the data hall, where it can be connected directly to high-density racks and IT gear.

But even that isn’t particularly “easy.” The design of these secondary systems will still be complex and expensive.
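
To give a sense of one piece of that secondary-system design work, the sketch below applies the standard sensible heat relation (Q = ṁ × cp × ΔT) to estimate the chilled water flow a single liquid-cooled rack would need. The 80 kW rack load and 10°C supply/return delta are hypothetical values chosen for illustration, not figures from Compu Dynamics.

```python
# Rough secondary-loop sizing for one liquid-cooled rack, using the
# sensible heat relation Q = m_dot * cp * delta_T (SI units).
# The 80 kW load and 10 C temperature rise below are hypothetical.

CP_WATER_KJ_PER_KG_K = 4.186  # specific heat of water
LITERS_PER_KG_WATER = 1.0     # water is ~1 kg per liter

def required_flow_lps(rack_kw: float, delta_t_c: float) -> float:
    """Chilled water flow (liters/second) needed to absorb rack_kw."""
    mass_flow_kg_per_s = rack_kw / (CP_WATER_KJ_PER_KG_K * delta_t_c)
    return mass_flow_kg_per_s * LITERS_PER_KG_WATER

flow = required_flow_lps(rack_kw=80, delta_t_c=10)
print(f"{flow:.2f} L/s (~{flow * 15.85:.0f} US GPM) per 80 kW rack")
# -> about 1.91 L/s (~30 US GPM); multiply across a hall of racks and
#    the piping, pumps, and distribution capacity add up quickly.
```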

MDCJ: You mentioned cost. Exactly how costly of a process is this? And how long can this process take?

Steve Altizer: Data center owners and operators need to prepare themselves for the financial impact ahead. Retrofitting a data center for liquid cooling can be prohibitively expensive.

I recently had a conversation with a noteworthy cloud company that is investing heavily in AI. That company has resigned itself to the fact that the emergence of this new technology will necessitate the design and construction of entirely new data centers. That opinion was based on the assumption that it would simply be too costly to retrofit their existing data centers for AI – even those that are under construction today.

At those existing data center locations where retrofitting is an option, the time it takes to modify the cooling system will not be a problem. If the data center owner or operator is working with a trusted White Space Integration partner, it should be feasible to implement a chilled water distribution system in a timeframe that is reasonably consistent with the installation of a white space power distribution system.

MDCJ: Why is it cheaper to design and build with liquid cooling in mind than to install it after the building is complete and the General Contractor hands over the keys?

Steve Altizer: Today’s data centers typically split the IT area – or the data hall – from the power and cooling equipment area – or the gallery. Galleries are often built with large unitary cooling products or using fan wall systems. If an end user is planning to absorb more than 50kW per cabinet, it’s very likely that direct-to-chip systems, rear door heat exchangers, or both will be required.
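
As a rough sketch of that decision logic: the 50kW-per-cabinet threshold below comes directly from Altizer’s comment, while the lower-density tiers are our own illustrative assumptions, not his guidance.

```python
# Illustrative mapping from rack density to cooling approach. The
# >50 kW direct-to-chip / rear-door threshold is from the interview;
# the lower tiers are hypothetical assumptions added for contrast.

def cooling_approach(kw_per_cabinet: float) -> str:
    if kw_per_cabinet > 50:
        return "direct-to-chip and/or rear door heat exchangers"
    if kw_per_cabinet > 20:
        return "rear door heat exchangers or aisle containment (assumed tier)"
    return "traditional gallery air cooling, e.g. CRAC/fan wall (assumed tier)"

for load_kw in (10, 35, 80):
    print(f"{load_kw} kW/cabinet -> {cooling_approach(load_kw)}")
```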

In some scenarios, the traditional gallery equipment will be completely unnecessary. In other scenarios, only a small percentage of those systems will ever be needed.

Today, the ideal scenario for a data center owner or operator is to identify the purpose of the data center well ahead of time, and design and build the facility to meet the demands of the IT gear. If the facility will serve AI needs, it may be wise to switch gears early and avoid the cost of installing major cooling systems that will ultimately not be needed.

MDCJ: If the White Space Integration team works hand-in-hand with the General Contractor early in the process, does it deliver benefits when installing liquid cooling systems?

Steve Altizer: We are finding that involving the White Space Integrator early in the process is extremely useful to the data center owners and the General Contractors. I can speak from my own experience as a White Space Integration partner on many data center projects.

For example, yesterday, we worked with a colocation operator to help define a chilled water distribution system for an AI tenant. At this point, we don’t know where this tenant will end up, geographically, but we have already solved a number of headaches for the colocation operator by helping to specify and design an appropriate cooling distribution system.

Next, we will provide pricing input for that project so the colocation provider can work the white space infrastructure expenses into their offer to the tenant. This includes equipment selection and procurement assistance, collaboration with the operator’s architecture and engineering team, and help defining the implementation schedule.

As usual, this colocation operator will be able to rely on our expertise to reduce their risk in delivering a fully commissioned system so the rent payments can begin to flow.

Since we have decades of experience with both mechanical and electrical systems, we will offer to deliver the entire white space, fully integrated and ready for occupancy. We will do this under a contract with the colocation operator, their General Contractor, or occasionally with the end user or tenant.

The data halls of tomorrow will be completely different from the data halls of today. Networks of chilled water piping will be interlaced with busways, conveyance, and other critical building systems. We believe it is smart to retain the services and expertise of a qualified White Space Integrator as soon as the architecture and engineering team is engaged to create the IT environment.

We can provide the maximum value, and ensure that power, cooling, and network systems are optimally installed for their intended use and for ongoing maintainability. We provide fully coordinated, fully compatible, fully integrated AI white space solutions anywhere our clients want to go.

To learn more about the data center requirements of advanced AI solutions, click HERE to download a copy of the eBook, “The Impact of AI: How the Latest Tech Obsession is Changing the Data Center as We Know It.”
