Edge computing continues to grow at a rapid pace, driven in industrial environments largely by Industry 4.0 efforts. But when critical IT equipment is placed at the edge of the network in harsh environments like warehouses, industrial plants, or even outdoors, deploying that equipment so that the systems are safe, secure, and reliable, and reach their maximum life expectancy, requires considerations beyond those of a more traditional IT space. In our recent White Paper 278, Three Types of Edge Computing Environments and their Impact on Physical Infrastructure Selection, the industrial/harsh environment is presented as one of the three types. In the paper, we discuss the challenges IT equipment faces in these harsh environments, along with best practices and options for deploying the physical infrastructure to support them. Here’s the short version…
Industrial and harsh environments
These harsh environments exist at the edge for every industry. Just a short list of examples includes an automotive manufacturing plant, a retail warehouse or distribution center, an oil and gas refinery, a military battlefield, a telecommunications base station, or a train signaling control system. As you can imagine, not all harsh environments are the same; there’s a spectrum of “harsh”, depending on the specific environment you are deploying the edge computing systems in. The potential risk factors include:
- Temperature / humidity highs, lows, and fluctuations
- Water / leaks
- Particles / dust
- Vibration from industrial machinery
- Collision from heavy mobile machinery
- Nuisance events like animals chewing on electrical lines
In the paper, we discuss each of these risks in detail, but here, I’ll just summarize a few: (1) temperature, (2) water risks, and (3) particles / dust.
Temperature / humidity risks
Temperature and humidity extremes and rapid changes can impact the reliability and life of the IT equipment. If the IT equipment is needed in a location with this exposure, a dedicated air conditioner may be necessary to keep the equipment regulated. This can be done inside of a micro data center – basically, a self-contained, secure computing environment at the local edge, typically housed in a single rack.
Water exposure risks
When determining the best approach for protecting IT equipment, it’s important to consider the likelihood and severity of water damaging the loads. If the loads are in a warehouse or a factory near or under a water main line, you may want to add some sort of umbrella or “hood” to protect the micro data center. If your equipment will be outdoors, more robust enclosures should be considered. IEC standard 60529, also called the Ingress Protection code or IP code, classifies and rates the level of protection against water (and dust, which I’ll cover next). The figure below shows the ratings. Ratings range from unprotected up to completely submerged, with a wide range in between. As a general rule, cost increases as you go up in ratings, so it’s important to assess the risks and specify accordingly.
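As a quick illustration of how the two digits of an IP code read, here is a small Python sketch that decodes a rating like IP65 into its solids and liquids protection levels. The digit descriptions follow IEC 60529; the function name and structure are just illustrative, not part of the standard:

```python
# Illustrative decoder for IEC 60529 IP codes (e.g. "IP65").
# First digit: protection against solids/dust; second digit: water.

SOLIDS = {
    "0": "no protection",
    "1": "protected against solid objects > 50 mm",
    "2": "protected against solid objects > 12.5 mm",
    "3": "protected against solid objects > 2.5 mm",
    "4": "protected against solid objects > 1 mm",
    "5": "dust protected (limited ingress, no harmful deposit)",
    "6": "dust tight (no ingress)",
    "X": "not rated for solids",
}

LIQUIDS = {
    "0": "no protection",
    "1": "protected against vertically dripping water",
    "2": "protected against dripping water when tilted up to 15 degrees",
    "3": "protected against spraying water",
    "4": "protected against splashing water",
    "5": "protected against water jets",
    "6": "protected against powerful water jets",
    "7": "protected against temporary immersion (up to 1 m)",
    "8": "protected against continuous immersion (beyond 1 m)",
    "X": "not rated for liquids",
}

def decode_ip(code: str) -> tuple:
    """Return (solids protection, liquids protection) for an IP code."""
    code = code.upper().strip()
    if not code.startswith("IP") or len(code) != 4:
        raise ValueError(f"expected a code like 'IP65', got {code!r}")
    return SOLIDS[code[2]], LIQUIDS[code[3]]

solids, liquids = decode_ip("IP65")
print(solids)   # → dust tight (no ingress)
print(liquids)  # → protected against water jets
```

So an IP65 enclosure is dust tight and shrugs off water jets, but is not rated for immersion; that distinction is exactly the kind of specification decision the risk assessment should drive.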
Dust and particles
Airborne contaminants are common in harsh environments, and they can impact the reliability and life expectancy of the IT equipment. Just like water, the level of protection needed depends on the likelihood and severity of the risk. As the figure shows, there are varying degrees of protection from particulates of different sizes. An IP solids rating of 5 is often requested in harsh industrial settings, and a rating of 6 is often requested for outdoor protection.
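The rule of thumb above can be captured as a tiny (entirely non-standard) lookup. The environment categories and the “office” baseline are my own assumptions for illustration; only the industrial and outdoor values come from the guidance in the text:

```python
# Illustrative (non-standard) heuristic mapping an environment type to a
# commonly requested minimum IP solids digit, per the guidance above:
# 5 (dust protected) for harsh indoor industrial settings,
# 6 (dust tight) for outdoor protection.

def suggested_solids_digit(environment: str) -> int:
    mapping = {
        "office": 2,        # ordinary indoor IT space (my assumption)
        "industrial": 5,    # harsh indoor settings, per the text
        "outdoor": 6,       # outdoor protection, per the text
    }
    try:
        return mapping[environment]
    except KeyError:
        raise ValueError(f"unknown environment: {environment!r}")

print(suggested_solids_digit("industrial"))  # → 5
print(suggested_solids_digit("outdoor"))     # → 6
```

A real specification would of course weigh the actual likelihood and severity of contamination, not just a category label.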
In North America, enclosures are classified with ratings by NEMA, which are similar to the IEC ratings. A NEMA 4 rating is water-tight and dust-tight, for indoor or outdoor use. Schneider Electric offers a micro data center designed in a NEMA 4-rated enclosure, specifically for harsh edge environments. Some 3-phase UPSs are also designed to protect against dust. For instance, the Easy UPS 3S has replaceable dust filters to protect the UPS in harsh environments.
Everything I’ve mentioned until now assumes standard IT equipment is being used in a harsh environment, and presents ways of reducing risks by hardening or fortifying the environment surrounding it. There is an alternative: ruggedized IT equipment. The IT equipment itself can be designed to be temperature/humidity tolerant, waterproof, dust-proof, and so on… although this comes with tradeoffs.
What about addressing the risks with ruggedized IT instead?
- Price premium – Ruggedized IT costs more than standard IT. The size of the premium largely depends on the specs the equipment is built to: the higher the ratings and tolerances, the higher the cost.
- Limited performance – Ruggedized IT equipment is often completely sealed and fan-less, which means its components rely on convection and radiation for cooling. This constrains how much compute power you can get. If multiple ruggedized devices become necessary to match one standard IT device, the economics will likely favor the single standard device in a ruggedized enclosure.
- Supplier base – Ruggedized IT is not offered by everyone; it comes from boutique specialist server manufacturers that build equipment for the military, or from divisions of larger vendors. Low-end server vendors are generally not at play here, so the solutions are likely to be of higher quality.
- Service life and TCO – If your IT is refreshed every 3-4 years, which it often is, ruggedizing the IT instead of the enclosure means you’re paying the premium every 3-4 years. You’ll also need ruggedized support infrastructure like UPSs and rack PDUs that can handle the same conditions as the IT. With a ruggedized enclosure, depending on the level of protection it provides, the IT systems inside can be standard.
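To make the refresh-cycle math in the TCO point concrete, here is a minimal sketch over a 12-year horizon. Every price and multiplier in it is a hypothetical assumption chosen for illustration, not vendor pricing; the point is the structure of the comparison, not the numbers:

```python
# Hypothetical TCO comparison: ruggedized IT refreshed every cycle
# vs. standard IT inside a one-time ruggedized enclosure purchase.
# All figures below are illustrative assumptions, not real prices.

HORIZON_YEARS = 12
REFRESH_YEARS = 4                              # typical 3-4 year IT refresh
refreshes = HORIZON_YEARS // REFRESH_YEARS     # 3 refresh cycles

standard_server = 10_000    # hypothetical standard server cost
rugged_premium = 1.6        # hypothetical ruggedization multiplier
rugged_support = 8_000      # hypothetical ruggedized UPS/PDU per cycle
standard_support = 4_000    # hypothetical standard UPS/PDU per cycle
enclosure_once = 15_000     # hypothetical enclosure, lasts the full horizon

# Option A: ruggedized IT and support gear, premium paid at every refresh.
tco_rugged_it = refreshes * (standard_server * rugged_premium + rugged_support)

# Option B: standard IT and support gear inside a ruggedized enclosure
# bought once up front.
tco_rugged_enclosure = enclosure_once + refreshes * (standard_server + standard_support)

print(f"Ruggedized IT:        ${tco_rugged_it:,.0f}")
print(f"Ruggedized enclosure: ${tco_rugged_enclosure:,.0f}")
```

Under these assumed numbers the enclosure approach comes out ahead because the one-time enclosure cost is amortized across every refresh, while the ruggedization premium recurs with each one; with different assumptions the gap narrows or widens, which is why the risk and refresh assessment matters.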
In the debate of ruggedized IT vs. ruggedized enclosures/micro data centers, the latter is generally more cost-effective. Check out the white paper, Three Types of Edge Computing Environments and their Impact on Physical Infrastructure Selection, for more details about the risks and the considerations in designing a micro data center suitable for these harsh environments.
To continue the technical discussion, make sure to visit the Industrial Edge Computing Forum.
Leave a comment below or check out other blog posts from the Data Center Science Center team.