Liquid cooling is not a new technology. It has been around for decades and has historically focused on mainframes, high performance computing (HPC), and gaming applications. But today’s demands for IoT, artificial intelligence, machine learning, big data analytics, and edge applications are once again bringing it into the limelight for data center design. We’re hearing more and more in the media about liquid cooling for data centers, primarily because servers are demanding high-power GPUs and CPUs to meet their businesses’ processing needs. These chips are now reaching thermal design powers (TDP) of 400W. When a rack is heavily populated with these kinds of servers, the rack density can exceed levels that are cost effective and practical for air cooling. The Green Grid suggests a range of 15-25kW/rack as the limit for air-cooled racks “without the use of additional cooling equipment such as rear door heat exchangers.”
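To see how quickly 400W chips push a rack past that guideline, here is a back-of-the-envelope sketch. All of the server specs (chips per server, overhead load, servers per rack) are hypothetical, illustrative numbers, not figures from The Green Grid or the white paper:

```python
# Back-of-the-envelope rack density check with hypothetical server specs.
# An accelerated server with eight 400 W TDP chips plus assumed overhead
# for memory, storage, fans, and power conversion.

CHIP_TDP_W = 400          # per-chip thermal design power (from the post)
CHIPS_PER_SERVER = 8      # assumed GPU-dense server
OTHER_LOAD_W = 800        # assumed non-chip server load
SERVERS_PER_RACK = 10     # assumed rack population

server_power_w = CHIP_TDP_W * CHIPS_PER_SERVER + OTHER_LOAD_W
rack_power_kw = server_power_w * SERVERS_PER_RACK / 1000

print(f"Per-server load: {server_power_w} W")
print(f"Rack density:    {rack_power_kw:.1f} kW")
print(f"Exceeds 25 kW air-cooling guideline: {rack_power_kw > 25}")
```

Even this modest configuration lands well above the 15-25kW/rack range cited for practical air cooling.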
Research-backed Analysis of Liquid Cooling for a Data Center Design
No doubt, these rising chip and rack densities are a key driver for liquid cooling. But there are other reasons you may want to consider the technology. My colleagues Paul Lin and Tony Day recently released a white paper that discussed this very topic. In this paper, they discuss five reasons to adopt liquid cooling. We’ve already discussed the first, so here are the remaining four:
- Pressure to reduce energy consumption
- Space constraints
- Water usage restrictions
- Harsh IT environments
Download White Paper 279:
Five Reasons to Adopt Liquid Cooling
Pressure to Reduce Energy Consumption
Energy consumption of data centers represents a growing share of global energy use. This has prompted regulations and corporate initiatives requiring reductions in power usage effectiveness (PUE) and overall energy consumption. Next to the IT systems themselves, the cooling system is the biggest energy consumer in a data center. Liquid cooling has been proven to be a more efficient cooling approach than conventional air cooling, in part because of a significant reduction in IT fan energy, which ranges from 4-15% of server energy. Our preliminary analysis suggests an overall energy reduction of over 10% with immersive liquid cooling, as compared to a conventional packaged chiller-cooled data center. With numbers like these, it’s an architecture that shouldn’t be ignored.
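A simple model shows how fan removal and a lower cooling overhead compound. The PUE values and fan fraction below are assumed, illustrative inputs chosen within the ranges mentioned above, not results from our analysis:

```python
# Illustrative facility-energy comparison (assumed figures, not measured data).
# Air-cooled baseline: server fans assumed to add 5% to IT load (within the
# 4-15% range), with an assumed PUE of 1.2. Immersion-cooled case: fans
# eliminated, with an assumed PUE of 1.1.

IT_LOAD_KW = 1000  # useful compute load, excluding server fans

# Air-cooled: IT power includes fan energy; facility power = IT power * PUE
fan_fraction = 0.05
air_it_kw = IT_LOAD_KW * (1 + fan_fraction)
air_facility_kw = air_it_kw * 1.2

# Immersion-cooled: fan energy eliminated, lower cooling overhead
liquid_facility_kw = IT_LOAD_KW * 1.1

savings = 1 - liquid_facility_kw / air_facility_kw
print(f"Air-cooled facility power:    {air_facility_kw:.0f} kW")
print(f"Liquid-cooled facility power: {liquid_facility_kw:.0f} kW")
print(f"Overall energy reduction:     {savings:.0%}")
```

Under these assumptions the overall reduction comes out a bit above 10%, consistent with the figure cited above.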
Space Constraints
It is important to consider the amount of physical space needed to house not only the IT equipment, but also the cooling infrastructure that supports it. As densities rise, rack count may go down, but the ratio of physical space dedicated to air-cooling equipment increases, diminishing the gains of the higher-density racks. With liquid cooling, you have an opportunity to reduce the overall data center footprint for a given IT load through significant compaction. This can be a significant benefit for large data centers or colocation providers that may have a desire to expand in space-constrained regions like Singapore and Hong Kong.
Water Usage Restrictions
With conventional air cooling, high volumes of water are often used for evaporative cooling to achieve PUEs in the sub-1.2 range. A 20MW data center consumes about as much water as 2,500 people. That’s pretty significant! Not only does water consumption increase operational costs, but many local municipalities are putting pressure on the data center industry in geographies with water resource constraints. Liquid cooling reduces, and often eliminates, water usage from the cooling system. Since most liquid cooling approaches deliver warm water directly to the IT, simple dry coolers can be used in most climates to reject the heat.
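A first-order estimate shows where the 2,500-person comparison comes from. The sketch below assumes all 20MW of heat is rejected by evaporation and uses an assumed per-capita water figure; real cooling towers also consume water through blowdown and drift, so treat this as illustrative only:

```python
# First-order estimate of evaporative cooling water use for a 20 MW
# data center, assuming all heat is rejected by evaporating water.
# (Real towers also lose water to blowdown and drift; figures are
# illustrative, not from the white paper.)

HEAT_REJECTED_W = 20e6            # 20 MW of heat to reject
LATENT_HEAT_J_PER_KG = 2.26e6     # latent heat of vaporization of water
PER_CAPITA_L_PER_DAY = 300        # assumed domestic water use per person

evap_kg_per_s = HEAT_REJECTED_W / LATENT_HEAT_J_PER_KG
liters_per_day = evap_kg_per_s * 86_400   # 1 kg of water is about 1 liter

people_equivalent = liters_per_day / PER_CAPITA_L_PER_DAY
print(f"Evaporation rate: {liters_per_day / 1000:.0f} m^3/day")
print(f"Equivalent to the daily water use of ~{people_equivalent:,.0f} people")
```

Even this simple physics-based estimate lands close to the 2,500-person figure cited above.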
Harsh IT Environments
More and more, we are seeing IT equipment deployed in non-ideal edge environments – IoT in manufacturing facilities, warehouses, distribution centers, and other industrial applications. These environments often present challenges in terms of airborne contaminants, ambient conditions, and power quality. When standard IT is deployed under these conditions, the result can be lower reliability than anticipated. As the IT becomes more integrated with manufacturing and other processes, downtime can have a big impact on the bottom line. Ruggedized enclosure solutions with integrated air cooling exist, but depending on the environment, they can be less efficient and more costly. Liquid cooling represents an alternative that separates the servers from their environment. With certain liquid cooling approaches, fans are removed and airborne contaminants are completely isolated from the IT equipment.
And yet, even more benefits of liquid cooling.
The paper mentions some additional benefits of liquid cooling as well. These may not drive people to switch from air cooling to liquid cooling on their own, but they represent important advantages once you switch.
- Minimal heat added to the space – With immersive liquid cooling, over 95% of the heat is removed, meaning a comfortable working environment in the IT space.
- Fans are eliminated – Not only does this mean less energy, as discussed earlier, but it also eliminates health risks caused by fan noise and reduces the risk of IT failures caused by fan failures.
- Waste heat recovery – The hot water used to remove the heat from the chips provides practical recovery of waste heat which can be used for facility or district heating. This can have a significant impact on the opex and overall carbon footprint of the facility.
- Layout flexibility – With air-cooled IT equipment, hot/cold aisle arrangement with containment is best practice for airflow management. Liquid cooling provides more flexibility to arrange equipment.
- Geography not as important – Since liquid cooling uses warm water, full economization can be achieved in most parts of the world.
Read Paul & Tony’s white paper, Five Reasons to Adopt Liquid Cooling, to get more details on what I’ve highlighted here. I am convinced that liquid cooling will become more mainstream for data centers and edge computing in the future. Are you?
Leave a comment below or check out other blog posts from the Data Center Science Center team.