I spent some time last week at the Data Center World conference in Orlando, hanging out in the Schneider Electric booth and attending a few sessions. Along the way I got to chatting with Joe Capes, Schneider Electric’s Business Development Director for Cooling in the Americas, about the sorts of issues he was hearing at the event and in the market in general.
Since we had some very nice video equipment on hand, we decided to record a discussion highlighting his thoughts on some pressing issues. You can view the video in its entirety below, but I’ll provide a quick summary here.
Managing supply and return airflow in a data center
I was a bit surprised to learn that effectively managing supply and return airflow is still an issue in data centers. Containment systems have been around for quite some time now, and have proven effective at keeping hot and cool air streams separate. But apparently not everyone is sold on containment, particularly for highly dynamic data centers such as colocation environments, where operators frequently need to get into the racks to make changes.
Customers with perimeter cooling systems are having problems getting that cool air to where it needs to go. “We have customers who have mentioned that they’re suffering from not being able to get enough static pressure under the raised floor in the center of the data center, because that tends to be the point farthest away from the cooling unit,” Capes says.
The solution is to shorten the distance of both your supply and return air path – think in-row cooling – and to explore whether containment will work to mitigate air mixing. “Surprisingly, it’s one of the more basic components of data center design but probably still one of the most challenging,” Capes says.
Data center temperatures are often cooler than necessary
It’s been a couple of years now since ASHRAE revised upwards its guidance for how warm data centers can be with no ill effects on IT equipment. The message is getting through – sort of.
“The good news is I’ve seen design engineers move to about a 75 degree inlet air temperature. Historically we’ve seen a lot of data centers at 68, 69, 70 degree supply air temp, which really does not afford much opportunity to drive efficiency,” Capes says. “But at 75 on up to 81 degrees, which is the maximum recommended guideline from ASHRAE, you can really start to get some benefit of additional hours for economization.”
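The economization benefit Capes describes can be made concrete with a rough back-of-the-envelope calculation: raising the supply air setpoint raises the number of hours per year when outside air alone is cold enough to do the cooling. The sketch below is illustrative only; the 5°F approach margin and the synthetic temperature profile are my assumptions, not figures from Capes or ASHRAE.

```python
# Hedged sketch: estimating annual economizer ("free cooling") hours as a
# function of supply air setpoint. The approach margin and the synthetic
# ambient temperature model are illustrative assumptions.
import math
import random

APPROACH_F = 5  # assumed heat-exchanger approach: ambient air must be this
                # many degrees F below the supply setpoint to fully economize

def economizer_hours(ambient_temps_f, supply_setpoint_f, approach_f=APPROACH_F):
    """Count hours in which outside air alone can meet the supply setpoint."""
    return sum(1 for t in ambient_temps_f if t <= supply_setpoint_f - approach_f)

# Synthetic year of hourly ambient temperatures for a temperate climate:
# a sinusoidal seasonal swing around 55F plus random noise.
random.seed(0)
ambient = [55 + 25 * math.sin(2 * math.pi * h / 8760) + random.gauss(0, 8)
           for h in range(8760)]

for setpoint in (68, 75, 81):
    hrs = economizer_hours(ambient, setpoint)
    print(f"{setpoint}F supply setpoint: {hrs} free-cooling hours "
          f"({100 * hrs / 8760:.0f}% of the year)")
```

Even with made-up weather data, the direction of the result matches Capes’s point: each step up in setpoint buys additional hours of economization.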
Rethinking how to deal with humidity in data centers
Customers are also starting to reconsider how they deal with a related issue, controlling data center humidity. More customers are now going with a centralized dehumidification/humidification approach.
“There’s an economy of scale you can achieve by going to a centralized humidification system,” Capes says. “You do lose some of the redundancy vs. having humidifiers in each individual cooling unit. It’s a balance between whether you’re trying to optimize your cost or get maximum redundancy.”
The centralized approach can help customers who rely on economizers to more effectively deal with periods of high humidity. Capes talks of customers having to bring in temporary dehumidifiers on some days; a centralized approach would help in such instances, he says.
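The decision of when outside air is too humid to bring in comes down to dew point. As a rough illustration, the sketch below uses the Magnus approximation to flag hours when outside air would need dehumidification first; the 15°C (~59°F) dew-point ceiling is my assumption for illustration, not a figure cited by Capes or taken from the article.

```python
# Hedged sketch: flagging when outside air is too humid for economization,
# which is when temporary (or centralized) dehumidification earns its keep.
# The dew-point ceiling below is an illustrative assumption.
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

DEW_POINT_CEILING_C = 15.0  # ~59F, assumed upper dew-point limit for the hall

def outside_air_ok(temp_c, rel_humidity_pct, ceiling_c=DEW_POINT_CEILING_C):
    """True if outside air can be used without dehumidifying it first."""
    return dew_point_c(temp_c, rel_humidity_pct) <= ceiling_c

# A humid Orlando afternoon vs. a dry morning elsewhere:
print(outside_air_ok(30, 75))  # 30C at 75% RH -> dew point ~25C: too humid
print(outside_air_ok(18, 50))  # 18C at 50% RH -> dew point ~7.4C: fine
```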
Economizers in Abu Dhabi
Speaking of economizers, earlier that morning I had heard an attendee ask whether it’s feasible to use economizers in places like the Middle East. So I asked Capes for any rules of thumb on where economizers can be used effectively.
“You can use an economizer anywhere in the world, really,” he says. “If you look at EcoBreeze as an example, we designed that product around the worst possible conditions globally, which happens to be Abu Dhabi. And even in Abu Dhabi we can do 60% partial free cooling by using indirect evaporative.”
In North America, Capes says economizers are viable in 38 to 40 states, pretty much anywhere except the Southeast, where conditions are often too warm and humid – as was certainly the case in Orlando, even in October.
“The question comes down to cost vs. benefit,” Capes says.