Engineers are often called upon to think differently and solve problems. One of my favorite examples of this creative process is when the ground crew at NASA designed a CO2 removal system using only parts that were on the potentially doomed Apollo 13 spacecraft. Failure just wasn’t an option.
Under tremendous pressure, the NASA engineers came up with an ingenious strategy to get the craft back to Earth safely, saving the lives of the three astronauts along for the ride.
When it comes to thinking differently, Schneider Electric and Iceotope took on that mission with an integrated liquid cooling solution, leveraging chassis-level immersion. The liquid-cooled rack solution is designed to manage increased chip density; energy and water consumption; physical space and location constraints; and increasingly complex data center environments and climate issues. This technology might not be as famous as cold plate or tub immersion, but it’s more adaptable to today’s needs.
Why Data Center Cooling Needs Innovation
A growing number of applications, including Artificial Intelligence (AI), big data analytics, and data mining, require energy-hungry Graphics Processing Units (GPUs) and Central Processing Units (CPUs). These chips are becoming increasingly power dense, with thermal design power (TDP) ratings reaching 400 watts or more.
AI is predicted to permeate practically all software applications. If that prediction is only half true, there are significant implications for data center operators, who must either host these “hot chips” at scale or halt AI advancements. No data center manager wants to deliver that bad news to customers, and it’s not just data center operators facing this challenge. All types of organizations are looking to collect and process data in real time at the edge, which poses additional challenges because of the environments where the IT needs to be placed.
The Very Real Challenges Liquid Cooling Solves
In the past, liquid cooling was accused of being a solution looking for a problem when it was marketed outside of High Performance Computing (HPC) applications. Hyperscale operators have achieved impressively low power usage effectiveness (PUE) with traditional air-cooled servers on sprawling data center campuses, but big challenges remain, and liquid cooling is the most compelling way to remove heat with greater energy efficiency in a compact footprint.
- Hot chips: their use has contributed to the tipping point around liquid cooling. For example, Intel recommends that its 400W TDP Intel® Xeon® Platinum 9282 processor be liquid cooled; you simply can’t blow enough air across it. This has been the main challenge HPC has faced and why it has relied on liquid cooling for years. The difference is that these chips are no longer just for specialized HPC!
- Water use: Although large data centers achieve great PUE, most of them use a significant amount of water to do it. As the data center industry endeavors to offer sustainable operations, excessive water use must also be tackled. The warm-water loops used for liquid cooling need very little evaporative or adiabatic cooling, so they reduce the amount of water consumed.
- Heat re-use: Despite the industry’s efforts to capture the heat radiating from data centers and use it elsewhere, this is nearly impossible with air cooling because the waste heat is at too low a temperature. Capturing the heat off the server in a liquid raises its temperature enough for residential district heating or for pre-heating municipal district heat. Europe is leading the way in facilitating this use.
- Efficiency that can scale (down): even though I just said liquid cooling isn’t needed for high efficiency at hyperscale, we still need it to scale down for smaller edge deployments. Currently, the most cost-effective way to remove heat from a few servers or racks is with DX air conditioners, but this will prove unsustainable as more IT moves to the edge. If the data center industry does not pay attention to this, regulators will. You can achieve hyperscale-level pPUE at the edge very simply with liquid cooling; a wise man once said to me, “Two idiots with a hose can get a pPUE < 1.1 with liquid cooling.” (See the quick pPUE arithmetic after this list.)
- Putting IT in weird places: I know what you are thinking . . . when it comes to the edge, we’re seeing more traditional IT in non-traditional locations, making operating conditions more
… Dirty: How do you keep IT up and running on factory floors, oil rigs, and distribution centers? Immersive liquid cooling keeps contaminants out by default and greatly enhances reliability.
… Noisy: If you have been in a data center, you know how difficult it is to hear anything or anyone. Immersive cooling removes the fans and lets you actually interact with colleagues.
… Windy: Some sterile environments can’t have air blowing around; think cleanrooms and labs. Immersive liquid cooling solves this with no fans.
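To make the pPUE point above concrete, here is a minimal sketch in Python. Partial PUE for a cooling subsystem is simply (IT power + cooling power) / IT power; the load figures used here (10 kW of IT, 0.8 kW of pumps versus 3 kW of DX cooling) are hypothetical illustrations for a small edge deployment, not Schneider Electric or Iceotope data.

```python
# Minimal sketch: partial PUE (pPUE) for a small edge deployment.
# All load figures below are hypothetical illustrations, not vendor data.

def partial_pue(it_power_kw: float, cooling_power_kw: float) -> float:
    """pPUE for the cooling subsystem: (IT load + cooling load) / IT load."""
    return (it_power_kw + cooling_power_kw) / it_power_kw

# A liquid-cooled chassis rejecting heat to a warm-water loop needs only
# low-power pumps, so cooling overhead is a small fraction of the IT load.
liquid = partial_pue(it_power_kw=10.0, cooling_power_kw=0.8)   # ~1.08

# A DX air conditioner serving the same few servers typically draws far more.
dx_air = partial_pue(it_power_kw=10.0, cooling_power_kw=3.0)   # ~1.30

print(f"Liquid-cooled pPUE: {liquid:.2f}")
print(f"DX air-cooled pPUE: {dx_air:.2f}")
```

Under these assumed loads, the liquid-cooled case lands around the pPUE < 1.1 figure quoted above, while the DX air-cooled case is closer to 1.3.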
Explore More Virtually at Our Booth
Schneider Electric and Iceotope are presenting their technologies together at the virtual OCP Global Summit, May 12–15. Discover the benefits of chassis-level immersion and liquid cooling technology. Click to connect with our experts at the booth – they’ll be available to chat live!