Cloud computing and the Internet of Things (IoT) are both taking off, putting pressure on cloud-based and other centralized data centers to keep up with demand in terms of traffic and processing horsepower. That’s giving rise to the concept of edge data centers, as we’ve mentioned in some previous posts, and with it the need for innovative cooling solutions.
These edge data centers can take many forms depending on their exact role and where they’re located. They could be in remote locations, such as an edge data center supporting a series of cell towers or on an oil rig. They could be in a smart building or a hospital, occupying maybe one room and supporting a number of floors. In general, they are fairly small, usually between 1 and 10 IT racks and drawing less than 100 kW of power.
Because of their small size and diverse form factors, we’re starting to see a number of requirements emerge in terms of the cooling systems that support edge data centers.
Security is always a big issue in any data center, but since the purpose of an edge data center is to bring computing power closer to end users, it creates additional risk. Rather than living behind secured doors as in a traditional data center, IT assets may now sit exposed in the corner of an office building, a hospital room, or a busy retail environment. Each setting presents concerns ranging from purposeful tampering to someone inadvertently bumping a server. For the cooling systems that support these edge data centers, that means requirements such as being able to easily service the units without accidentally harming any IT gear.
Other issues include:
- The amount of floor space a cooling system takes up, given that edge environments tend to be very small spaces.
- Lead time to install the cooling solution; given their small size, the expectation is that edge data centers should be deployed rapidly.
- Redundancy: it’s become clear from our customers that, given the role edge data centers play in reducing latency, aggregating data, and improving overall performance, companies can’t afford to have them go down.
- Room for growth: While today we aren’t seeing particularly high rack densities in edge data centers, it stands to reason that requirements will only grow as demand increases. As a result, cooling systems will have to be able to modulate from low to high density.
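To make the capacity, redundancy, and growth considerations above concrete, here is a minimal back-of-the-envelope sketch of how you might size cooling units for a small edge room. The function name, the 20 kW per-unit capacity, and the example figures are illustrative assumptions, not Schneider Electric specifications; real sizing depends on the specific products and site conditions.

```python
import math

def cooling_units_needed(racks, kw_per_rack, unit_capacity_kw=20.0, redundant_units=1):
    """Estimate cooling units for an edge room with N+1 style redundancy.

    Assumes the IT electrical load converts almost entirely to heat,
    so required cooling capacity roughly equals IT power draw.
    All figures here are hypothetical for illustration.
    """
    heat_load_kw = racks * kw_per_rack           # heat to remove ≈ IT power
    base_units = math.ceil(heat_load_kw / unit_capacity_kw)
    return base_units + redundant_units          # spare unit(s) for redundancy

# A 6-rack edge room at 8 kW per rack is a 48 kW load:
# 3 base units of 20 kW each, plus 1 redundant unit.
print(cooling_units_needed(6, 8.0))  # → 4
```

Rerunning the same calculation at a projected future density (say, 15 kW per rack) is a quick way to check whether a cooling solution leaves room for growth before you commit to it.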
Among the cooling solutions that can help address these requirements are overhead or wall-mount cooling, where the cooling system either hangs from the ceiling, sits on top of the IT racks, or is mounted on an interior wall. The idea is that cooling doesn’t require any additional floor space.
Total liquid cooling (TLC) is another option. With this approach (discussed in some detail in this blog post), the IT gear is immersed in a dielectric fluid or mineral oil solution that absorbs heat. This method often requires no large mechanical compressors or cooling infrastructure; instead, it can rely mainly on “free” outdoor air to cool the liquid and small pumps to circulate it.
Among the Schneider Electric cooling solutions suitable for edge data centers are our SmartBunker units: thermally insulated enclosures that offer protection against fire, flood, humidity, vandalism, and EMF effects.
Prefabricated modular data centers are another solution, and can include any combination of IT racks, power and cooling that a customer needs. And traditional cooling systems that Schneider Electric has long offered, such as InRow and Uniflair, can also often be retrofitted to be suitable for an edge environment. The SmartBunker FX comes with a preinstalled InRow cooling system, for example.
As you build out your edge computing strategy, consider what your cooling requirements are today and how they will likely change in the future. That should help drive your decision on which cooling solution makes the most sense for you. To learn even more about edge computing, take a look at this free white paper, “The Drivers and Benefits of Edge Computing.”