Artificial intelligence (AI) has become an integral part of everyday life. Chatbots such as ChatGPT and Google Gemini are turning computers and smartphones into versatile all-purpose tools and are changing the lives of millions of people worldwide. For companies, the use of AI promises a significant economic advantage because it increases efficiency and productivity.

The McKinsey Global Institute (MGI) sees particularly large potential in generative AI. According to the institute's forecast, such tools could theoretically add the equivalent of 2.6 to 4.4 trillion US dollars to global productivity every year. However, AI places new demands on the data centers in which the individual applications and services run. Developers and operators need to rethink and redesign the infrastructure of these buildings.

Different approaches to cooling

First of all, AI solutions require significantly more computing power. To meet the demand, servers with faster processors are used. These AI servers pack more computing power into the whitespace, the server room, on a smaller footprint. But this compression of computing power generates enormous heat, which is why the cooling of the racks plays a decisive role: at these densities, traditional air cooling can no longer dissipate the heat, because air can only absorb a limited amount of heat and the new loads exceed that capacity.
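
A rough back-of-the-envelope comparison illustrates the gap. The sketch below is a minimal illustration with assumed values (an 80 kW AI rack and a 10 K coolant temperature rise, neither taken from this article); it only compares how much air versus how much water would have to flow to carry that heat away.

```python
# Illustrative comparison: flow needed to remove the heat of one dense AI rack.
# Assumed values (not from the article): 80 kW rack, 10 K temperature rise.

RACK_HEAT_KW = 80.0        # assumed heat load of one AI rack
DELTA_T_K = 10.0           # assumed coolant temperature rise across the rack

# Approximate physical properties near room temperature
AIR_DENSITY = 1.2          # kg/m^3
AIR_CP = 1.005             # kJ/(kg*K)
WATER_DENSITY = 998.0      # kg/m^3
WATER_CP = 4.18            # kJ/(kg*K)

def volume_flow_m3_per_s(heat_kw: float, density: float, cp: float, dt: float) -> float:
    """Volume flow required so that Q = rho * V_dot * cp * dT."""
    return heat_kw / (density * cp * dt)

air_flow = volume_flow_m3_per_s(RACK_HEAT_KW, AIR_DENSITY, AIR_CP, DELTA_T_K)
water_flow = volume_flow_m3_per_s(RACK_HEAT_KW, WATER_DENSITY, WATER_CP, DELTA_T_K)

print(f"Air:   {air_flow:.1f} m^3/s (~{air_flow * 3600:,.0f} m^3/h)")
print(f"Water: {water_flow * 1000:.1f} L/s (~{water_flow * 60000:.0f} L/min)")
print(f"Water carries roughly {air_flow / water_flow:,.0f}x more heat per unit volume")
```

Moving tens of thousands of cubic meters of air per hour past a single rack is not practical, which is why air cooling reaches its limit at these densities.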

Data center operators therefore need a different solution: liquid cooling, in which water or another coolant flows through the racks. Many professionals consider it an enabling technology for AI in data centers.

Once liquid cooling is integrated, more usable space also becomes available in the whitespace. Because the technology needs less room to remove heat than air handling does, operators can install additional racks and computing power.

Liquid cooling is also convincing when it comes to sustainability. The water remains in a closed circuit at all times. With direct liquid cooling, the waste heat also leaves the servers at a higher temperature, which makes it more useful: transferring the heat from the water into a district heating network requires less additional energy than with air cooling. The lower energy demand reduces the CO2 footprint of data centers.
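
A simplified heat-pump estimate shows why the higher waste-heat temperature helps. The temperatures below (35 °C for air-cooled exhaust, 55 °C for direct-liquid-cooling return water, 70 °C for the district heating supply) are illustrative assumptions, and the Carnot coefficient of performance is only a theoretical upper bound, but the comparison makes the trend visible.

```python
# Illustrative estimate: how the waste-heat temperature affects the effort
# needed to feed it into a district heating network via a heat pump.
# All temperatures are assumptions for illustration, not values from the article.

def carnot_cop_heating(source_c: float, supply_c: float) -> float:
    """Ideal (Carnot) heating COP when lifting heat from source_c to supply_c."""
    source_k = source_c + 273.15
    supply_k = supply_c + 273.15
    return supply_k / (supply_k - source_k)

DISTRICT_SUPPLY_C = 70.0   # assumed district heating supply temperature

for label, source_c in [("air-cooled exhaust (~35 C)", 35.0),
                        ("direct liquid cooling return (~55 C)", 55.0)]:
    cop = carnot_cop_heating(source_c, DISTRICT_SUPPLY_C)
    print(f"{label}: ideal COP ~{cop:.1f} "
          f"-> ~{1 / cop:.2f} kWh of electricity per kWh of delivered heat")
```

The warmer the captured waste heat, the less additional electricity is needed to lift it to district heating level, which is where the efficiency advantage over air cooling comes from.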

Tailoring the overall architecture

The trend towards AI is fundamentally changing the design of data centers. Developers and operators must take liquid cooling of the servers into account as early as the building planning stage. As a rule, it requires its own, additional water circuit. In short, the design of the cooling system in the whitespace will have a significant influence on the overall architecture in the future. Unlike before, IT infrastructure, cooling, and supply technology will be closely interlinked in data centers. In addition, all racks must be prepared or converted for liquid cooling.

So far, there is no standard for the construction of AI data centers. The industry still has to do a great deal of testing and reach a consensus. Some companies are currently piloting practical solutions for AI data centers to better understand the specific requirements. Standardization, with its clear specifications, would help accelerate the expansion of the necessary infrastructure.

New construction versus conversion

In the future, it will not only be a matter of building completely new AI data centers; operators can also modernize existing buildings. If a building already has a separate water circuit for cooling, the conversion is easier. Otherwise, it must first be gutted before further structural adjustments can be made.

This approach can nevertheless make perfect sense. Building a new data center usually costs more time and money in comparison. With a conversion, on the other hand, some of the permits have already been obtained, which reduces the approval effort and time. The success of the project depends on whether developers and operators can make the changes easily and quickly. A hall-style building offers plenty of room for free redesign after gutting. Overall, the effort must remain within reasonable limits compared to a new build.

However, a conversion does not automatically lead to more computing power in the data center. The local power supply often limits it and slows down further expansion: operators can only equip the whitespace with as many AI servers, and thus as much computing power, as the available power allows. Obtaining more electricity via an additional grid connection often fails due to insufficient infrastructure. Feasibility therefore depends primarily on the geographical location of the building and on whether it can be connected to the electricity grid in line with demand. If the initial conditions are unsuitable, a conversion is not worthwhile.
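
A simple power budget makes this constraint tangible. The figures below (a 2 MW grid connection, a PUE of 1.3, 10 kW conventional racks and 80 kW AI racks) are illustrative assumptions rather than values from this article; the point is only that the grid connection, not the floor space, caps the number of AI racks a converted site can host.

```python
# Illustrative estimate: how many racks fit under a fixed grid connection.
# All figures are assumptions for illustration, not values from the article.

SITE_POWER_KW = 2000.0     # assumed grid connection of the site
PUE = 1.3                  # assumed power usage effectiveness (cooling, losses, ...)
LEGACY_RACK_KW = 10.0      # assumed draw of a conventional air-cooled rack
AI_RACK_KW = 80.0          # assumed draw of a dense, liquid-cooled AI rack

it_power_kw = SITE_POWER_KW / PUE   # share of site power left for the IT load

legacy_racks = int(it_power_kw // LEGACY_RACK_KW)
ai_racks = int(it_power_kw // AI_RACK_KW)

print(f"IT power budget: {it_power_kw:.0f} kW")
print(f"Conventional racks supported: {legacy_racks}")
print(f"AI racks supported:           {ai_racks}")
```

Even if the whitespace could physically hold far more racks after a conversion, the usable computing power stops at the power budget, which is why the question of a demand-oriented grid connection is often decisive.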

An example of a proven design

Data center operators around the world are currently planning to modernize their infrastructure for AI applications and services. NDC-GARBE has been relying on its patented Green IT Cube concept for ten years. This concept for energy-efficient and environmentally friendly high-performance computing in data centers already provides a separate water circuit for cooling. Heat exchangers on the rear doors of the racks handle the heat removal: the hot air from the servers is routed directly through these heat exchangers and cooled down again. The advantage is that no air distribution in the room is necessary.

In addition to saving energy, this also reduces the required height in the whitespace of the data center. Although the concept was developed at a time when AI was not yet a major topic, it is proving to be future-proof. Provisions for direct liquid cooling are already built into the Green IT Cube, so the existing concept requires only minor adjustments to meet the new requirements.

Embarking on innovation

Exciting times are ahead for the data center industry. The development of AI data centers is still in its infancy, and it remains to be seen exactly what the optimal solution for liquid cooling will look like. In principle, NDC-GARBE focuses on making buildings flexible so that they can accommodate both air and liquid cooling. The adjustments to the infrastructure are challenging, but they also lay the foundation for the AI-supported digital future. In the coming years, the industry will work hard to meet the increasing demand for AI computing power.