Edge And Fog Computing: Their Practical Uses

This article is based on a talk by Chetan Kumar S. at India Electronics Week 2019, organised by the EFY Group, in which he explained how Fog and Edge computing can be used to solve the problem of latency.

One popular application of Edge computing is city surveillance. The number of surveillance cameras is growing rapidly. Mumbai, for instance, already has thousands of CCTV cameras and plans to install many more. All these cameras capture traffic footage, which helps in identifying mobility problems and criminal activities. But it is practically impossible for security personnel to monitor them all, so one option is to use computer vision and machine learning.

Machine learning starts with data. For camera surveillance, data from the cameras needs to be collected and processed to perform various operations. In terms of bandwidth, a single camera needs a speed of about 3Mbps for streaming an SD video and 5Mbps for an HD video. At SD quality, one camera generates roughly 32GB of data in a day, so ten cameras generate close to 10TB a month. With a further increase in the number of cameras, data consumption grows accordingly.
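As a back-of-the-envelope check, the short Python sketch below reproduces these figures from the quoted bitrates; it assumes decimal units (1GB = 1000MB) and a thirty-day month.

```python
# Rough estimate of surveillance-camera data volumes from stream bitrate.
# The 3 Mbps (SD) and 5 Mbps (HD) figures are the ones quoted in the talk.

def daily_volume_gb(bitrate_mbps: float) -> float:
    """Data generated by one camera in a day, in gigabytes."""
    seconds_per_day = 24 * 60 * 60
    megabits = bitrate_mbps * seconds_per_day
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

def monthly_volume_tb(bitrate_mbps: float, cameras: int, days: int = 30) -> float:
    """Data generated by a fleet of cameras in a month, in terabytes."""
    return daily_volume_gb(bitrate_mbps) * cameras * days / 1000

if __name__ == "__main__":
    print(f"One SD camera per day  : {daily_volume_gb(3):.0f} GB")      # ~32 GB
    print(f"Ten SD cameras per month: {monthly_volume_tb(3, 10):.1f} TB")  # ~9.7 TB
```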

Why we need Edge computing and where it can be used

While the cost of CPU and storage is one part of cloud computing, bandwidth is another significant part. The overall cost is not just that of pushing data to the cloud; it also includes the cost of the servers and the storage space.

(Credit: www.thinkebiz.net)

If you want to reduce cost by avoiding storing a large amount of data in the cloud, the best way is to process it near the edge of the network, where the data is being generated. Hence the term Edge computing.

To automate this with computer vision and machine learning algorithms, a system rather than a human being is needed to process the data. This way, for example, data from the cameras at a traffic signal can be processed then and there.
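A minimal sketch of this idea is shown below. It assumes a camera reachable by OpenCV, uses simple background subtraction as a stand-in for a proper detection model, and the camera source, endpoint URL and motion threshold are purely illustrative.

```python
# Minimal sketch of edge-side processing: analyse frames locally and push
# only small event records upstream, instead of streaming raw video.
# The camera source, endpoint and motion threshold below are illustrative only.
import time
import cv2        # pip install opencv-python
import requests   # pip install requests

CAMERA_URL = 0                                   # or an RTSP URL of a junction camera
EVENT_ENDPOINT = "https://example.com/events"    # placeholder cloud endpoint
MOTION_THRESHOLD = 5000                          # changed pixels that count as "activity"

def main():
    cap = cv2.VideoCapture(CAMERA_URL)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)           # foreground/motion mask
        activity = cv2.countNonZero(mask)
        if activity > MOTION_THRESHOLD:
            # Only a few bytes of metadata leave the edge, not the video itself.
            event = {"ts": time.time(), "activity": int(activity)}
            requests.post(EVENT_ENDPOINT, json=event, timeout=2)

if __name__ == "__main__":
    main()
```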

Edge computing can be used in applications that require autonomy (that is, completing tasks with little or no human interaction) such as self-driving cars and Industry 4.0.

Another area where Edge computing finds use is applications that cannot tolerate latency, for example healthcare and financial transactions, where latency can cause system failure.

Latency in the cloud server

For factory automation, which is mostly motion control, an end-to-end latency of one millisecond is needed; whereas, for process automation monitoring, end-to-end latency of fifty milliseconds is required.

In the case of a cloud server, end-to-end latency is around sixty milliseconds, considering approximately 29 milliseconds of one-way latency. So even if the cloud server is located in Mumbai and the factory you want to automate is in, say, Bengaluru, the round trip will still take around sixty milliseconds, far more than the one millisecond needed for motion control.

Latency also increases with the large number of devices connected to the cloud, and it has a physical lower bound set by the length of the fibre link: it cannot go below a certain threshold no matter how good the infrastructure is. This means that, for factory automation, cloud infrastructure may not work at all.
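The physical limitation can be made concrete with a rough calculation; the fibre speed of about 200,000km/s and the distances used below are approximations.

```python
# Physical lower bound on latency: light in optical fibre travels at roughly
# 200,000 km/s, so distance alone sets a floor that no cloud optimisation
# can remove. Distances below are approximate.

SPEED_IN_FIBRE_KM_PER_MS = 200.0   # ~2/3 of the speed of light in vacuum

def min_round_trip_ms(fibre_km: float) -> float:
    """Best-case round-trip time over a fibre path, ignoring routing and queuing."""
    return 2 * fibre_km / SPEED_IN_FIBRE_KM_PER_MS

print(f"Bengaluru-Mumbai (~1000 km): {min_round_trip_ms(1000):.1f} ms round trip")
print(f"Factory to on-premise edge (~1 km): {min_round_trip_ms(1):.3f} ms round trip")
```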

To solve this problem of latency, Edge and Fog computing can be introduced in the Internet of Things (IoT). There are three laws that explain why we need Edge computing, as explained below.

  • Law of physics, which says act locally
  • Law of economics, which says pre-processing reduces cost
  • Law of land, which says data should be stored locally

As per research by Gartner, fifty per cent of enterprise data will be produced and processed outside traditional data centres and clouds by 2022, up from about ten per cent until recently. Also, eighty per cent of enterprises will shut down their traditional data centres by 2025, versus ten per cent in 2018. This indicates that computation is moving towards the Edge and distributed systems.

What is Edge computing

Edge computing is an optimisation of the cloud that moves compute close to the data source. It is not something new that came up in the last two years; it started thirty or so years ago, when people gradually moved from mainframes to microcomputers (PCs) and then to cloud computing, and computing became progressively cheaper. People then began to realise that work done in the cloud can also be done at the Edge, where bandwidth is comparatively cheaper. And as mission-critical applications started coming in, latency became a problem.

While Edge computing refers to delivering computing capabilities at the edge of a network to improve the performance, operating cost and reliability of applications and services, Fog computing is a distributed computing concept in which compute and data storage resources, as well as applications and their data, are placed in an optimal location between the user and the cloud to improve performance and redundancy.

What both of these commonly used terms share is the idea of a compute node where data processing takes place. A single such node close to the data source can be called an Edge node, while a set of multiple such nodes forms the Fog nodes. In Edge computing terms, Fog computing is also known as the Cloud Edge.
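As a toy illustration of this terminology, the sketch below models Edge nodes that reduce raw readings locally and a Fog layer that aggregates their summaries before anything reaches the cloud; the class names and the summarisation logic are illustrative, not part of any real framework.

```python
# Toy model: Edge nodes sit next to data sources and do the first round of
# processing; a group of such nodes forms the Fog layer, which aggregates
# results so that only compact summaries travel towards the cloud.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EdgeNode:
    name: str
    readings: list = field(default_factory=list)

    def process_locally(self) -> float:
        """Reduce raw readings to one local summary value."""
        return mean(self.readings) if self.readings else 0.0

@dataclass
class FogLayer:
    nodes: list

    def aggregate(self) -> dict:
        """Combine per-node summaries; only this dict would go to the cloud."""
        return {node.name: node.process_locally() for node in self.nodes}

cam1 = EdgeNode("junction-1", readings=[42, 57, 61])
cam2 = EdgeNode("junction-2", readings=[12, 18, 9])
fog = FogLayer(nodes=[cam1, cam2])
print(fog.aggregate())   # e.g. {'junction-1': 53.3..., 'junction-2': 13.0}
```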

A use case of smart surveillance

A smart surveillance system was deployed for mobility management and citizen safety in an industrial township spread over around 3.2 square kilometres. Initially, there was a network of 200-plus IP cameras connected over a high-speed optical link, and all the data went to a centralised location where it was processed.

Data processing was done manually before AiKaan Labs deployed a smart solution. What was needed was a Fog infrastructure that could support multiple video analytics applications, to take care of mobility (traffic) management and the safety of people in the township. This is where Fog computing played a huge role: it supported the video analytics software at the central station and made it easier for the vendor to run the various applications.

In another use case, a steam boiler vendor wanted to optimise operations and cut costs by up to fifteen per cent. To achieve this, data was collected and processed locally using a PLC.
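A rough sketch of such local processing is shown below; read_boiler_sample() is a hypothetical stand-in for the actual PLC or fieldbus interface, and the values are simulated rather than taken from the real deployment.

```python
# Sketch of the boiler use case: read process values locally (on the PLC or a
# gateway next to it), compute an efficiency figure on the spot, and forward
# only small summaries instead of every raw sample.
import random
import time

def read_boiler_sample() -> dict:
    """Hypothetical stand-in for a PLC/fieldbus read of one sample."""
    return {
        "steam_output_kg_h": random.uniform(900, 1100),
        "fuel_input_kg_h": random.uniform(70, 90),
    }

def run_local_loop(samples_per_batch: int = 10) -> dict:
    """Collect a batch of samples and reduce it to one summary record."""
    ratios = []
    for _ in range(samples_per_batch):
        s = read_boiler_sample()
        ratios.append(s["steam_output_kg_h"] / s["fuel_input_kg_h"])
        time.sleep(0.1)   # sampling interval, shortened for the example
    # Only this small record, not the raw samples, needs to leave the site.
    return {"avg_steam_per_kg_fuel": sum(ratios) / len(ratios)}

print(run_local_loop())
```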

Challenges in deploying Edge and Fog computing

Some major challenges people face while applying Edge and Fog computing include:

  • Remote connectivity and debugging, where multiple devices need to be identified and connected.
  • Model, firmware and data upgrades, since video analytics requires machine learning model updates and some gateways need firmware upgrades (see the sketch after this list).
  • Lack of trained personnel to manage the devices (both Edge and Fog), owing to the complexity of the systems and technical barriers.
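As an illustration of the model-update challenge, here is a minimal sketch of an edge gateway checking a management server for a newer model version; the manifest URL and file layout are placeholders, not AiKaan's actual tooling.

```python
# Minimal sketch of model updates on an edge gateway: compare the locally
# deployed model version against what a management server advertises and
# download the new file only when they differ.
import json
import pathlib
import requests   # pip install requests

MANIFEST_URL = "https://example.com/models/manifest.json"   # placeholder server
MODEL_DIR = pathlib.Path("/opt/edge/models")                # placeholder layout

def current_version() -> str:
    meta = MODEL_DIR / "current.json"
    return json.loads(meta.read_text())["version"] if meta.exists() else "none"

def update_if_needed() -> None:
    manifest = requests.get(MANIFEST_URL, timeout=5).json()
    if manifest["version"] == current_version():
        return                                   # already up to date
    model_bytes = requests.get(manifest["url"], timeout=30).content
    MODEL_DIR.mkdir(parents=True, exist_ok=True)
    (MODEL_DIR / manifest["filename"]).write_bytes(model_bytes)
    (MODEL_DIR / "current.json").write_text(json.dumps({"version": manifest["version"]}))

if __name__ == "__main__":
    update_if_needed()
```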

Chetan Kumar S. is chief executive officer and co-founder, AiKaan Labs

This article was first published online on 24 May 2019.