As companies of all stripes flock to cloud computing, they are learning that the strategy is not without its downsides. In many cases, cloud computing relies on a centralized architecture, which means companies need one or more reliable connections to the data center where the cloud infrastructure lives.
For some companies, that kind of centralized single point of failure is unacceptable from a risk perspective. To get around it, one solution is to use edge computing to augment that central cloud platform.
Edge computing involves placing computing power, control, storage and applications closer to the end users who rely on them. It can be implemented in a private network or as part of a larger Internet-based or cloud computing architecture.
Strategic use of edge computing can transform a cloud computing implementation from a wholly centralized architecture into a more distributed one. That means any disruption would be limited to the point in the network where it occurred, rather than taking down the entire cloud implementation.
With a traditional centralized cloud architecture, a distributed denial-of-service (DDoS) attack or a power outage at the cloud data center could render applications unavailable for all users. With an edge architecture, applications replicated at the edge can keep serving local users during a central outage, and a failure at an edge site affects only that device and the local applications running on it.
Companies have a few options for employing edge computing to bring more resiliency to a cloud implementation.
One is something of a hybrid strategy that uses simple devices to support specific, well-defined applications. An example is a cloud storage gateway: a local network appliance or server that translates cloud storage application programming interfaces (APIs), such as SOAP or REST, into the file or block protocols local applications already use. Such a gateway lets users integrate cloud storage into their applications without actually moving those applications into the cloud.
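To make the translation idea concrete, here is a minimal sketch of what such a gateway does, written in Python. It is illustrative only: the class name, endpoint URL and bucket path are assumptions, and a real gateway adds authentication, caching and the block- or file-protocol front end that local applications actually speak.

```python
import requests


class CloudStorageGateway:
    """Illustrative sketch: presents a simple save/load interface to local
    applications and translates each call into REST requests against an
    S3-compatible object store. Authentication, caching and retries are
    omitted for brevity."""

    def __init__(self, endpoint: str, bucket: str):
        # Hypothetical endpoint and bucket names; replace with real values.
        self.base_url = f"{endpoint}/{bucket}"

    def save(self, key: str, data: bytes) -> None:
        # The local application "writes a file"; the gateway issues an
        # HTTP PUT to the remote object store instead.
        resp = requests.put(f"{self.base_url}/{key}", data=data)
        resp.raise_for_status()

    def load(self, key: str) -> bytes:
        # A read becomes an HTTP GET against the same object.
        resp = requests.get(f"{self.base_url}/{key}")
        resp.raise_for_status()
        return resp.content


# Example usage (assumes a reachable, unauthenticated S3-compatible endpoint):
# gw = CloudStorageGateway("https://objects.example.com", "backups")
# gw.save("reports/q1.csv", b"region,revenue\nnorth,100\n")
# print(gw.load("reports/q1.csv"))
```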
A step up from there is to employ micro data centers of 10 racks or fewer. These are often available as pre-engineered, configure-to-order systems that are assembled on site, or as fully prefabricated micro data centers built in a factory within a single enclosure and simply dropped on site. Such single-enclosure systems also come in ruggedized versions that can withstand harsh environmental conditions. Either way, the idea is to replicate some applications and storage from the centralized cloud platform for added resilience and faster response times for local users.
A third option is to employ a series of regional data centers. These are larger facilities, with more than 10 racks, that offer more processing power and storage capacity than a micro data center. But the idea is the same: to bring cloud applications and storage closer to the users who employ them, improving response times while adding another layer of reliability and resiliency.
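One way to picture the resiliency benefit, whether the replica sits in an on-site micro data center or a regional facility, is a client that prefers the nearby edge copy and falls back to the central cloud only when the edge copy is unreachable. The sketch below assumes two hypothetical HTTP endpoints; the URLs and timeout values are illustrative, not part of any particular product.

```python
import requests

# Hypothetical endpoints: a replica in the local edge site and the
# authoritative copy on the central cloud platform.
EDGE_URL = "https://edge.local.example.com/api/orders"
CLOUD_URL = "https://cloud.example.com/api/orders"


def fetch_orders() -> list:
    """Try the nearby edge replica first for low latency; if the edge site
    is down or slow, fall back to the central cloud so the application
    keeps working during a local outage."""
    for url, timeout in ((EDGE_URL, 0.5), (CLOUD_URL, 5.0)):
        try:
            resp = requests.get(url, timeout=timeout)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            continue  # this copy is unavailable; try the next one
    raise RuntimeError("Neither the edge replica nor the central cloud responded")
```

The same pattern works in reverse: if the central cloud data center is unreachable, users keep working against the local replica, which is the distributed-resiliency benefit described above.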
Cloud computing brings some significant advantages to companies of all sizes, including increased agility and, often, lower cost. Combining cloud with an edge computing strategy can bring further benefits, including improved reliability and faster response times.
To learn more, download the free APC by Schneider Electric white paper number 226, “The Drivers and Benefits of Edge Computing.”