Data center liquid cooling continues to gain momentum thanks to its ability to cool high-density IT racks more efficiently and effectively than air. Yet data center designers and operators have lacked data they could use to project the impact of liquid cooling on data center efficiency and to optimize liquid cooling deployments for energy efficiency.
The full analysis was published by the American Society of Mechanical Engineers (ASME) in the paper, ‘Power usage effectiveness analysis of a high-density air-liquid hybrid cooled data center’. This post summarizes the methodology, results, and key takeaways from that analysis.
Methodology for the data center liquid cooling energy efficiency analysis
For our analysis, we chose a midsize (1-2 megawatt), Tier II data center in Baltimore, Maryland. The facility houses 50 high-density racks arranged in two rows.
The baseline for the analysis was 100 percent air cooling provided by two chilled water, perimeter computer room air handler (CRAH) units with hot-aisle containment. The cooling units are supported by a Vertiv™ Liebert® AFC chiller with free cooling, adiabatic free cooling, hybrid cooling, and adiabatic mechanical cooling capabilities.
Liquid cooling is enabled by direct-to-chip cooling through micro channel cold plates mounted on major heat-generating IT components and supported by two Vertiv™ Liebert® XDU coolant distribution units (CDUs) with liquid-to-liquid heat exchangers.
The analysis employed a “bottom-up” approach, disaggregating the IT load into subsystems so that the impact of a progressive increase in the percentage of the load cooled by liquid could be accurately calculated for each subsystem.
We then ran four studies, increasing the percent of liquid cooling in each study while also implementing optimizations to chilled water temperature, supply air temperature, and secondary inlet temperature enabled by the use of liquid cooling.
- Study one: 100 percent air cooling with a chilled water temperature of 7.2 degrees Celsius (45 F), supply air temperature of 25 C (77 F), and secondary inlet temperature of 32 C (89.6 F).
- Study two: 61.4 percent of the load is cooled by liquid with 38.6 percent cooled by air. Chilled water temperature is raised to 18 C (64.4 F), supply air temperature is maintained at 25 C (77 F), and secondary inlet temperature is maintained at 32 C (89.6 F).
- Study three: 68.6 percent of the load is cooled by liquid with 31.4 percent cooled by air. Chilled water temperature is raised to 25 C (77 F), supply air temperature is raised to 35 C (95 F), and secondary inlet temperature is maintained at 32 C (89.6 F).
- Study four: 74.9 percent of the load is cooled by liquid and 25.1 percent by air. Chilled water temperature is maintained at 25 C (77 F), supply air temperature is maintained at 35 C (95 F), and secondary inlet temperature is raised to 45 C (113 F).
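The study setpoints above can be captured as plain data. The sketch below transcribes the four studies' parameters from the list (percentages of the load cooled by liquid, and temperatures in degrees Celsius) and sanity-checks the quoted Celsius-to-Fahrenheit conversions:

```python
# Setpoints for the four studies, transcribed from the article.
# liquid_pct = percent of IT load cooled by direct-to-chip liquid cooling;
# temperatures are in degrees Celsius.
studies = {
    1: {"liquid_pct": 0.0,  "chilled_water": 7.2, "supply_air": 25, "secondary_inlet": 32},
    2: {"liquid_pct": 61.4, "chilled_water": 18,  "supply_air": 25, "secondary_inlet": 32},
    3: {"liquid_pct": 68.6, "chilled_water": 25,  "supply_air": 35, "secondary_inlet": 32},
    4: {"liquid_pct": 74.9, "chilled_water": 25,  "supply_air": 35, "secondary_inlet": 45},
}

def to_fahrenheit(celsius):
    """Convert a Celsius setpoint to Fahrenheit, rounded to one decimal."""
    return round(celsius * 9 / 5 + 32, 1)

# Check a conversion quoted in the article: 45 C secondary inlet -> 113 F.
print(to_fahrenheit(studies[4]["secondary_inlet"]))  # 113.0
```

Laid out this way, the optimization pattern is easy to see: each study raises one or two water- or air-side setpoints that the added liquid cooling makes feasible.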
Impact of the introduction of liquid cooling on data center energy consumption and PUE
The full implementation of liquid cooling in study four (74.9 percent of the load) produced an 18.1 percent reduction in facility power and a 10.2 percent reduction in total data center power compared to 100 percent air cooling. This not only reduces data center energy costs by roughly 10 percent annually but, for data centers using carbon-based energy sources, reduces Scope 2 emissions by the same amount.
Total data center power was reduced with each increase in the percent of the load cooled by direct-to-chip cooling. From study one to two, power consumption was cut by 6.4 percent; an additional 1.8 percent reduction was achieved between study two and three; and another 2.5 percent improvement was seen between study three and four.
Given those results, the data center PUE calculated for each study may prove surprising. PUE fell only 3.3 percent, from 1.38 in study one to 1.34 in study four, and actually remained flat at 1.35 for studies two and three.
If you’re familiar with how PUE is calculated, you may already have guessed the reason for this discrepancy. PUE is essentially a measure of infrastructure efficiency, calculated by dividing total data center power by IT power. But liquid cooling didn’t just reduce consumption on the facility side; it also reduced IT power consumption by lowering the demand on server fans, which count as IT load in the PUE calculation.
Server fan power consumption decreased by 41 percent between studies one and two, and by 80 percent between studies one and four. This resulted in a 7 percent reduction in IT power between studies one and four.
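To see why PUE barely moves, we can reconstruct normalized power figures from the percentages above. This is a back-of-envelope sketch, not data from the paper: IT power in study one is set to 1.0, and the 0.38 facility share is implied by the reported study-one PUE of 1.38.

```python
# Back-of-envelope reconstruction from the article's percentages
# (normalized so that study-one IT power = 1.0).
it_power = {"study1": 1.00, "study4": 1.00 * (1 - 0.07)}         # IT power fell ~7%
facility_power = {"study1": 0.38, "study4": 0.38 * (1 - 0.181)}  # facility power fell 18.1%

def pue(study):
    """PUE = total data center power / IT power."""
    total = it_power[study] + facility_power[study]
    return total / it_power[study]

for s in ("study1", "study4"):
    print(s, round(pue(s), 2))
# Facility power drops sharply, but IT power (the denominator) drops too,
# so PUE only moves from 1.38 to roughly 1.33-1.34 even though total
# data center power falls by about 10 percent.
```

Running the numbers this way reproduces the article's figures to within rounding, which is the point: a 10 percent drop in total power shows up as only a few hundredths of a point of PUE.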
Unlike air cooling, liquid cooling affects both the numerator (total data center power) and the denominator (IT equipment power) in the PUE calculation, which makes PUE ineffective for comparing the efficiency of liquid- and air-cooling systems. Total Usage Effectiveness (TUE) is a better metric for this purpose.
The TUE for the data center that was the subject of our analysis improved 15.5 percent between studies one and four, which we believe is an accurate measure of the gains in data center efficiency achieved through the optimized liquid cooling deployment.
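The structural difference between the two metrics can be shown with a toy calculation. TUE divides total data center power by the power that actually reaches the compute components, so server fan (and power-conversion) losses sit in the numerator's overhead rather than inflating the denominator. All of the kW figures below are hypothetical, chosen only to illustrate the effect; they are not taken from the paper.

```python
# Toy illustration of why TUE registers liquid-cooling gains that PUE
# misses. All kW figures are hypothetical, not taken from the paper.
def tue(facility_kw, it_kw, fan_and_psu_kw):
    """TUE = total data center power / power reaching compute components."""
    compute_kw = it_kw - fan_and_psu_kw
    return (facility_kw + it_kw) / compute_kw

air_cooled = tue(facility_kw=380, it_kw=1000, fan_and_psu_kw=90)
hybrid     = tue(facility_kw=311, it_kw=930,  fan_and_psu_kw=20)

print(round(air_cooled, 2), round(hybrid, 2))  # 1.52 1.36
# The fan savings shrink the overhead while leaving the compute power
# (the denominator) unchanged, so the gain is visible in TUE; in PUE
# those same savings shrink the denominator and mask the improvement.
```

In this toy case TUE improves by about 10 percent while the compute load is identical in both scenarios, which is exactly the kind of gain PUE understates.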
Key takeaways from the data center liquid cooling energy efficiency analysis
The analysis provided multiple insights into the efficiency of data center liquid cooling and how it can be optimized. I’d encourage data center designers, in particular, to read the full paper, which includes the supporting data that was used to derive the results stated in the previous section. Here are some of the key takeaways that might be of interest to a broader audience.
- In high-density data centers, liquid cooling delivers improvements in the energy efficiency of IT and facility systems compared to air cooling. In our fully optimized study, the introduction of liquid cooling created a 10.2 percent reduction in total data center power and a more than 15 percent improvement in TUE.
- Maximizing the data center liquid cooling implementation ‒ in terms of the percent of the IT load cooled by liquid ‒ delivers the highest efficiency. With direct-to-chip cooling, it isn’t possible to cool the entire load with liquid, but approximately 75 percent of the load can be effectively cooled by direct-to-chip liquid cooling.
- Liquid cooling can enable higher chilled water, supply air, and secondary inlet temperatures that maximize the efficiency of facility infrastructure. Hot water cooling, in particular, should be considered. Secondary inlet temperatures in our final study were raised to 45 C (113 F) and this contributed to the results achieved while also increasing opportunities for waste heat reuse.
- PUE is not a good measure of data center liquid cooling efficiency, and alternate metrics such as TUE will prove more helpful in guiding design decisions related to the introduction of liquid cooling in an air-cooled data center.
Finally, I want to thank my colleagues at Vertiv and NVIDIA for their work on this groundbreaking analysis. The findings not only quantify the energy savings that can be achieved through liquid cooling but provide designers with valuable data that can be used to optimize data center liquid cooling installations.
For more on the trends driving adoption of liquid cooling, see the blog post, Liquid cooling: Data center solutions for high-density compute, which summarizes the insights from a panel of liquid cooling experts at the 2022 OCP Global Summit.