Three key enablers of IoT and edge computing resilience


More data was generated in 2017 alone than over the previous 5,000 years of human history combined. Much of this explosion is driven by the Internet of Things (IoT), and it is forcing enterprises to sharpen their ability to extract and analyze business-relevant information from this vast pool of data to drive growth. As more and more devices are connected, both data centers and edge nodes (an edge node can be a single server, gateway, or other device with computational capability that gathers and processes local data and is connected to a larger network) will need to be resilient and provide the fundamental physical infrastructure that enables real-time, analytics-driven decision support.

How did we evolve to this point? Over the last several years, a wave of technology trends changed both data center design and operating practices and allowed a new arena of data processing to emerge: the edge computing environment. Since new technology builds on the shoulders of existing technology, an analysis of how we got here provides insight into where we are heading. Here’s a quick look at three key enablers.

1. Modularity and scalability 

Modularity offers customers more flexibility and a less expensive alternative to blanket redundancy. If power, cooling, and compute “chunks” can be thought of as building blocks, then you have a near-infinite number of possibilities. The size of a block can be changed, or blocks can be configured as N+1, 2N, or N depending on needs and budget. If a data center has two power feeds, for instance, one can be made redundant while the other is left non-redundant. Resilience has moved to the software layer, and modularity provides the basis for rapid adjustment and cost savings. Modularity’s close cousin, scalability, acts as a hedge against uncertainty: sustaining rapid growth requires scalable infrastructure that enables high deployment speeds.
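
As a rough illustration of how block-based capacity planning works, the Python sketch below computes how many equal-sized power or cooling modules a given load would need under N, N+1, and 2N configurations. The module size, load figures, and function name are hypothetical and not tied to any particular product.

import math

def modules_required(load_kw, module_kw, scheme="N"):
    """Number of equal-sized modules needed to carry load_kw under a redundancy scheme."""
    base = math.ceil(load_kw / module_kw)  # N: just enough capacity for the load
    if scheme == "N":
        return base
    if scheme == "N+1":
        return base + 1                    # one spare module
    if scheme == "2N":
        return base * 2                    # a fully mirrored second set
    raise ValueError(f"unknown scheme: {scheme}")

# Example: a 180 kW load served by 50 kW building blocks
for scheme in ("N", "N+1", "2N"):
    print(scheme, modules_required(180, 50, scheme))  # 4, 5, and 8 modules respectively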

2. High availability 

The “always on” mindset of the data center will now migrate to edge environments. Predictive maintenance, analytics, and new levels of insight allow data center and edge stakeholders to accept leaner designs without taking on more risk, minimizing the physical infrastructure required to support applications. If a battery or component within an Uninterruptible Power Supply (UPS) gets replaced before it fails, for example, multiple levels of redundancy don’t necessarily need to be built in. Stakeholders can therefore realize CAPEX efficiency gains on the front end, and a lean design also lowers OPEX as the asset matures. The big change that is coming, however, is at the edge. Traditional edge environments are generally characterized by low availability. That’s about to change: more affordable, software-driven resiliency provides a solid backbone for higher availability at the edge.
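
To make the predictive-maintenance idea concrete, here is a minimal Python sketch of a proactive UPS battery check. The telemetry fields, thresholds, and class name are assumptions for illustration; real monitoring systems expose richer, vendor-specific data.

from dataclasses import dataclass

@dataclass
class BatteryTelemetry:
    # Hypothetical UPS battery readings, for illustration only
    internal_resistance_pct_of_baseline: float  # rises as the battery ages
    temperature_c: float
    cycles_used: int
    rated_cycles: int

def needs_proactive_replacement(t: BatteryTelemetry) -> bool:
    """Flag a battery for replacement before it fails, so extra redundancy
    does not have to absorb an unplanned outage."""
    return (
        t.internal_resistance_pct_of_baseline > 130   # resistance drift beyond ~30%
        or t.temperature_c > 40                       # sustained high temperature
        or t.cycles_used > 0.8 * t.rated_cycles       # nearing rated cycle life
    )

print(needs_proactive_replacement(BatteryTelemetry(135, 28, 250, 500)))
# True: the resistance drift alone is enough to schedule a swap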

3. Management simplification through digitalization

Data center infrastructure management (DCIM) is migrating from on-premises software to the cloud. Simpler, more powerful interfaces now allow connections to multiple clouds, enriching the data pool for better management of data processing operations. These benefits are especially relevant to the new distributed assets at the edge, where human intervention is limited. Upgrading the edge with new technologies is like a developing nation leapfrogging to a new technology without having experienced the previous stage (a country that skips building a landline infrastructure and goes directly to mobile phones, for example). In this case, the edge will leverage digitalization technologies to enable a “lights out” remote support approach that does not require expert staff on site.
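
The sketch below illustrates, under stated assumptions, what a “lights out” remote check might look like: a cloud-side script polls telemetry endpoints at unattended edge sites and raises alerts instead of dispatching on-site staff. The site names, URLs, and JSON fields are hypothetical; a commercial DCIM platform would aggregate these feeds for you.

import json
import urllib.request

# Hypothetical telemetry endpoints exposed by unattended edge sites
EDGE_SITES = {
    "branch-01": "https://edge-01.example.com/telemetry",
    "branch-02": "https://edge-02.example.com/telemetry",
}

def poll_site(name, url):
    """Pull one site's telemetry and alert remotely rather than sending staff."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            data = json.load(resp)
    except OSError as exc:
        print(f"ALERT {name}: unreachable ({exc}); start remote diagnostics")
        return
    if data.get("ups_on_battery"):
        print(f"ALERT {name}: running on UPS battery, check the utility feed")
    if data.get("inlet_temp_c", 0) > 35:
        print(f"ALERT {name}: inlet temperature {data['inlet_temp_c']} °C")

for site, endpoint in EDGE_SITES.items():
    poll_site(site, endpoint)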

Other key technology trends

Other trends, such as energy storage (where lithium-ion batteries are replacing lead acid in many UPSs), liquid cooling, more open technology architectures (like EcoStruxure IT and the Open Compute Project), and deeper technology partnerships, open doors to more flexibility and lower costs for organizations deploying data center and edge computing environments.

It’s an exciting time of change for both data center and edge environments. We’ve moved from corporate data centers to specialized companies that build hyperscale and colocation data centers. Now a second great pillar, the edge computing environment, is emerging. Both play a critical role in supporting the high speed of global business.

Understand how edge is supporting the next evolution today 

Learn more about how edge computing can address critical availability requirements by downloading the Schneider Electric white paper “Why Cloud Computing is Requiring us to Rethink Resiliency at the Edge” or view this short video on data center resiliency.
