The Fog: Direct VS Indirect Edge Computing

“Fog computing” has received an increased amount of attention in the IoT and IIoT communities. I briefly reviewed the Fog in “IoT & the Pervasive Nature of Fast Data & Apache Spark”, where I identified Fog as part of the evolution of low-latency analytical solutions. However, Fog is a rather nebulous term, especially when used interchangeably with the term “Edge computing”. This has led to many discussions about renaming or clarifying the terminology.

I recently had one such discussion with Dez Blanchfield (@Dez_Blanchfield) via Twitter on this very topic. Dez proposed, to the Twitterverse, renaming Fog to something more suitable. The result was my suggestion of “Indirect Edge”. Why Indirect Edge? I did not want to replace “Fog” with a nicer sounding albeit equally vague term that obfuscates purpose and meaning. I instead sought to denote and distinguish the core concepts of the underlying architectures by asking:

  1. Where does the Edge end and the Fog begin?
  2. What purpose does each serve?
  3. What are the differences?
  4. Why are they so similar?

Let’s first address the origin of the term “Fog”. Fog is generally considered to be a bad marketing term originally coined by Cisco, understood to have been developed to drive sales by making “…its internet routers into edge computing devices”. The introduction of intelligent Cloud integration and compute scenarios at the Edge led to Fog’s inception. Thus, despite its original sales slant, Fog provides value by denoting some architectural variations of Edge when addressing Cloud computing. Moreover, the IoT community is not beholden to the term “Fog” nor to any particular vendor’s rendition of it. In fact, Fog shares many similarities with Edge, and it is fair to view it as an architecture pattern of Edge computing; hence the proposal to use the term “Indirect Edge” vs. “Direct Edge”.

A key tenet of each architecture is low-latency data processing. Both achieve this by not requiring that data be sent to the Cloud or a remote data center for analytics. Each takes place to the left of the Cloud integration layer, at the devices’ network level, in Figure 1. Edge computing stores and processes data on the devices themselves or on a device gateway. Not surprisingly, Edge computing is generally less robust than a Cloud solution due to the limited compute power of devices. I refer to this as the “Direct Edge”, or simply the Edge. Fog computing, however, takes place on a node near the Edge but not on the devices themselves, i.e., indirectly at the Edge. The Fog introduces the practice of selectively sending data to the Cloud as needed.
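To make the distinction concrete, here is a minimal Python sketch of Direct Edge processing, assuming a hypothetical sensor read and alert threshold. The analytics run entirely on the device (or its gateway), so no reading has to leave the local network; this is a sketch of the idea, not a reference implementation.

```python
# Minimal "Direct Edge" sketch: analytics run on the device or its gateway.
# read_sensor() and ALERT_THRESHOLD are hypothetical placeholders.
from collections import deque
from statistics import mean
import random
import time

WINDOW_SIZE = 10          # readings kept in device memory
ALERT_THRESHOLD = 75.0    # hypothetical alert level (e.g., degrees C)

def read_sensor() -> float:
    """Stand-in for a real on-device sensor read."""
    return random.uniform(60.0, 90.0)

def run_direct_edge(cycles: int = 20) -> None:
    window = deque(maxlen=WINDOW_SIZE)
    for _ in range(cycles):
        window.append(read_sensor())
        # The analytics happen right here on the device: low latency,
        # no round trip to a data center or the Cloud.
        if mean(window) > ALERT_THRESHOLD:
            print("local alert: rolling average above threshold")
        time.sleep(0.1)

if __name__ == "__main__":
    run_direct_edge()
```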

Figure 1: Fog Computing High-Level Architecture

We now understand that these patterns are very similar and utilize the Edge devices, directly and indirectly, as a key component of the architecture. These similarities are one of the reasons for much of the confusion surrounding Fog. One key deviation of Fog is that it leverages the concept of a Cloudlet. A Cloudlet is similar to a device gateway insofar as it resides near the Edge devices. In an Edge topology, a gateway typically serves as an aggregation point; in an Indirect Edge scenario, the gateway would be a more powerful server, VM, or Cloudlet. Cloudlets are more powerful, store more data, and perform more robust analytics than Edge devices or gateways. A subset of an IoT Platform’s services could be deployed to the Cloudlet, for example, and data can then be selected for intelligent distribution to an IoT Platform in the Cloud.
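As an illustration only, the following sketch imagines a Cloudlet aggregating readings from a few nearby devices, running heavier analytics than a single device could, and forwarding only a selected subset upstream. The device names, the variance rule, and send_to_cloud_platform() are placeholders, not any vendor’s API.

```python
# Hypothetical Indirect Edge (Fog) Cloudlet: aggregate nearby devices,
# analyze locally, forward only selected summaries to the Cloud platform.
from statistics import mean, pstdev

def send_to_cloud_platform(record: dict) -> None:
    """Placeholder for a call into a Cloud-based IoT platform / Data Lake."""
    print(f"forwarding to cloud: {record}")

def cloudlet_process(batches: dict) -> None:
    for device_id, readings in batches.items():
        # Robust analytics stay on the Cloudlet, near the Edge ...
        summary = {
            "device": device_id,
            "avg": mean(readings),
            "stddev": pstdev(readings),
        }
        # ... and only statistically unusual batches travel to the Cloud.
        if summary["stddev"] > 5.0:
            send_to_cloud_platform(summary)

if __name__ == "__main__":
    cloudlet_process({
        "sensor-a": [70.1, 70.4, 69.9, 70.2],
        "sensor-b": [65.0, 72.3, 80.9, 58.4],   # noisy device -> forwarded
    })
```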

Fog’s ability to selectively send data to the Cloud allows us to implement hybrid Edge and Cloud scenarios. For example, some algorithms may require only a few data points. There is no reason to send all of the data to the Cloud to perform these analytics; doing so is costly and increases the latency of results. The algorithms may instead process the data and deliver real-time results at the Edge. If more robust storage or deeper analytical processing is needed, we can selectively route data to the Cloud. Cloud-based IoT platforms backed by Data Lakes are a likely destination for this data. In this hybrid scenario, we achieve real-time analytics at the Edge while Enterprise platforms capture the data for deeper interactive analytics.
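The hybrid flow might look something like the sketch below. needs_deeper_analysis() and route_to_data_lake() are hypothetical names standing in for whatever lightweight Edge check and Cloud upload path a real solution would use.

```python
# Hybrid sketch: answer in real time at the Edge from a few data points,
# and route the full batch to the Cloud only when deeper analysis is needed.

def needs_deeper_analysis(readings: list) -> bool:
    """Cheap check using just the last few points; runs at the Edge."""
    recent = readings[-3:]
    return max(recent) - min(recent) > 10.0   # hypothetical instability rule

def route_to_data_lake(readings: list) -> None:
    """Placeholder for selective upload to a Cloud IoT platform / Data Lake."""
    print(f"uploading {len(readings)} readings for interactive analytics")

def handle_batch(readings: list) -> None:
    # Real-time result is produced locally from a few points ...
    print("edge result:", sum(readings[-3:]) / 3)
    # ... and the full batch goes to the Cloud only when warranted.
    if needs_deeper_analysis(readings):
        route_to_data_lake(readings)

if __name__ == "__main__":
    handle_batch([70.0, 70.5, 71.0, 70.8, 70.9])   # stays at the Edge
    handle_batch([70.0, 70.5, 71.0, 84.2, 66.1])   # routed to the Cloud
```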

As we have seen, the Indirect Edge, a.k.a. Fog, architecture pattern is closely related to Edge, with a few differences. There are distinct architectural considerations when addressing Indirect Edge versus Edge scenarios, and a term such as Fog is too vague and requires too much explanation when discussing solutions. Whether we choose the terminology proposed herein or something better, we should promote discussion and strive for clarity.

 
