Figures:

• Network topology showing edge nodes and devices. Source: Alibaba Cloud 2020.
• Illustration of a CDN with distributed edge servers. Source: Fraser 2019.
• Cloudlet use cases, plus comparing cloudlet with cloud. Source: Satyanarayanan et al. 2009, fig. 4.
• Comparing performance on edge device, edge server and AWS datacentres. Source: Ha et al. 2013, figs. 9-10.
• Predicted value of edge computing by 2025. Source: Chabas et al. 2018, fig. 1.
• Benefits of edge computing. Source: Mhetre 2018.
• Edge, fog and cloud computing. Source: Sunkara 2019.
• Physical comparison of three development kits for edge computing. Source: Yau 2019.

# Edge Computing

Authors: arvindpdmn, Pravardhan
Created by Pravardhan on 2020-01-05. Last updated by arvindpdmn on 2020-07-03.

## Summary

With the growth of the Internet of Things (IoT), billions of devices are generating huge amounts of data. Storing or analyzing all that data in the cloud in real time is practically impossible. This is where edge computing becomes relevant.

With edge computing, we process data closer to its source: an IoT device, an IoT gateway, an edge server, a smartphone, or a user's computer. The idea is to move intelligence to the edge and let edge nodes and devices do real-time analytics. This reduces application latency and saves network bandwidth. The architecture is distributed, as opposed to centralizing all processing in the cloud, and it minimizes long-distance client-server communication.

Gartner defines edge computing as

"part of a distributed computing topology where information processing is located close to the edge, where things and people produce or consume that information."

## Milestones

1988

Mark Weiser at Xerox PARC coins the term Ubiquitous Computing. Unlike desktop computing, the term implies that computing can happen on any device, in any location: laptops, mobile phones, sensor devices, and everyday objects such as refrigerators, umbrellas or alarm clocks. In the late 1990s, a similar term, Pervasive Computing, is coined.

Aug
1998

Akamai is incorporated "to intelligently route and replicate content over a large network of distributed servers." This is what we call a Content Delivery Network (CDN). Akamai delivers first live traffic in February 1999 and launches commercial service in April. Yahoo! becomes one of their customers. By 2019, Akamai is said to have 240,000 edge servers, many of which are located in ISPs or mobile data towers. A user request is served by the closest available edge server that also caches content.

1999

Napster is launched as file-sharing software to download and distribute music. The common way to download on the internet is to connect to a server and download files to the client machine. Napster is instead a peer-to-peer (P2P) network in which machines act as both clients and servers. One machine can download parts of a file from another nearby machine. This distributed architecture makes P2P networks scalable, fast and resilient to failures. The idea of cooperative file sharing can be traced back to USENET (1979).

2006

Amazon launches Amazon Web Services (AWS). Its Elastic Compute Cloud service enables users to use datacentre infrastructure to run programs. The same year, Google Docs is launched: spreadsheets and documents can be edited and saved online. This is the beginning of cloud computing, although Salesforce pioneered this model back in 1999.

2009

Satyanarayanan et al. coin the term cloudlet. They recognize that network latency hurts user experience for highly interactive applications. A response time of more than a second becomes annoying. Mobile devices are also resource constrained and relying on the cloud is too slow. Cloudlets are a solution. They provide the compute power for nearby edge devices. They connect to edge devices via low-latency high-bandwidth one-hop wireless links.

2011

Myoonet pioneers the concept of modular micro datacentres. A year later AOL follows this trend with indoor micro datacentres for enterprises.

2012

With IoT in mind, and to handle real-time low-latency applications, Cisco engineers coin the term fog computing to describe a distributed cloud infrastructure. Their paper is titled Fog Computing and Its Role in the Internet of Things. Fog computing encompasses edge processing plus the network connections that bring data from the edge to the cloud. In 2015, the OpenFog Consortium is founded. In 2019, this merges with the Industrial Internet Consortium.

2013

Ha et al. note the emergence of new resource-intensive, interaction-intensive applications: face recognition, speech recognition, object and pose identification, mobile augmented reality, and physical simulation and rendering. In their experiments, they find that face recognition can be done on an average server one hop away from the edge device. But speech recognition is more demanding and would need cloud processing unless the edge server is upgraded.

Oct
2016

The First IEEE/ACM Symposium on Edge Computing is organized in Washington DC. In May 2017, there's IEEE International Conference on Fog and Edge Computing in Madrid, Spain. In June 2017, there's IEEE International Conference on Edge Computing in Hawaii. These events highlight the growing interest in edge computing.

2025

In 2019, Gartner research shows that only 10% of enterprise data is created and processed outside cloud infrastructure. Gartner predicts that by 2025 this will increase to 75%. This underscores the importance of edge computing. In separate research, McKinsey estimates the value of edge computing at $175-215 billion by 2025.

## Discussion

• Could you explain edge computing with an example?

Consider a building security application that has many networked cameras. Cameras capture high-definition video streams. Assume that the cameras are 'dumb'. They simply capture and transmit raw video to a cloud server. The cloud server analyzes the video streams for motion detection. We note two problems in this scenario: strain on network bandwidth and strain on the cloud server that has to process videos from multiple cameras.

With edge computing, we enable each camera to run motion detection locally. Cameras are equipped with some storage and sufficient compute power to do this. Cameras have now become 'smart'. Only important video segments are sent to the cloud for storage or deeper analysis. Both network bandwidth and cloud storage/processing are saved. In turn, the cloud server can now support many more cameras.
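The smart-camera scenario above can be sketched in a few lines of Python. This is a minimal illustration, not a real video pipeline: frames are plain lists of pixel intensities, and `MOTION_THRESHOLD` is an invented tuning parameter. The point is that filtering happens on the camera, so only interesting frames travel over the network.

```python
# Minimal sketch of edge-side motion detection by frame differencing.
# Frames are plain lists of pixel intensities; a real camera would use
# OpenCV or similar. MOTION_THRESHOLD is a made-up tuning parameter.

MOTION_THRESHOLD = 10.0  # mean absolute pixel change that counts as motion

def mean_abs_diff(prev_frame, curr_frame):
    """Average absolute per-pixel difference between two frames."""
    return sum(abs(a - b) for a, b in zip(prev_frame, curr_frame)) / len(curr_frame)

def frames_to_upload(frames):
    """Return only the frames whose change versus the previous frame
    exceeds the threshold; everything else stays on the camera."""
    uploads = []
    for prev, curr in zip(frames, frames[1:]):
        if mean_abs_diff(prev, curr) > MOTION_THRESHOLD:
            uploads.append(curr)
    return uploads

static = [[0] * 4, [1] * 4, [0] * 4]   # near-identical frames: no motion
motion = [[0] * 4, [100] * 4]          # large change: motion detected
print(len(frames_to_upload(static)))   # 0 frames sent to the cloud
print(len(frames_to_upload(motion)))   # 1 frame sent to the cloud
```

In this toy run, the static stream uploads nothing while the stream with a large change uploads one frame, which is exactly the bandwidth saving described above.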

• What are some use cases of edge computing?

In autonomous driving, if a pedestrian crosses the road, the vehicle may have to brake immediately. Waiting for the cloud to make this decision may prove fatal. Vehicles can also communicate directly with one another.

In healthcare, there are glucose monitors, fitness trackers, and other health-monitoring wearables. Some monitoring devices locally analyze pulse data or sleep patterns without involving the cloud. Edge computing enables timely care for remote patient monitoring, in-patient care and healthcare management in hospitals.

In smart factories, the lower latency due to edge computing enables more timely actions to control manufacturing workflows. If analytics determines that a machine is about to fail, immediate action can be triggered to stop the machine. Robots that process their own data will be more self-sufficient and reactive. Edge computing enhances safety and efficiency on the factory floor.
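The machine-stop decision above can be sketched as a simple edge rule. The `should_stop` function, the rolling window size and the vibration limit are all hypothetical, chosen only to illustrate a low-latency local decision that doesn't wait for the cloud.

```python
from collections import deque

# Hypothetical edge rule for a factory machine: stop it when the rolling
# average of recent vibration readings crosses a safety limit. The window
# size and limit are illustrative, not from any real controller.

VIBRATION_LIMIT = 5.0
WINDOW = 3

def should_stop(readings, window=WINDOW, limit=VIBRATION_LIMIT):
    """True if the average of the last `window` readings exceeds `limit`."""
    recent = deque(readings, maxlen=window)
    return len(recent) == window and sum(recent) / window > limit

print(should_stop([1.0, 2.0, 1.5]))   # False: machine healthy
print(should_stop([4.0, 6.0, 7.0]))   # True: trigger an immediate stop
```

Because the rule runs on the edge node next to the machine, the stop signal can fire within milliseconds of the readings arriving.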

In agriculture, connectivity is an issue at remote locations. This leads to high investment towards fibre, microwave or satellite connections. Edge computing presents a more cost-effective alternative. Yield can be improved and food wastage reduced.

Many more use cases exist in other sectors such as consumer electronics, defence, telecom, oil & gas, energy, retail and finance.

• Which are the main benefits of edge computing?

These are some benefits of edge computing:

• Reduced Latency: For delay-critical applications, the longer it takes to process data, the less relevant it becomes. Edge computing avoids roundtrip delay.
• Better Security: Centralized cloud systems are vulnerable to DDoS attacks. Edge computing allows us to filter sensitive information locally.
• Cost Savings: By retaining and processing data closer to source, edge computing saves on connectivity costs. Data can be categorized and handled suitably: stored locally, stored in cloud, or processed and discarded. Redundant storage is avoided. Data management solutions also cost less.
• Greater Reliability: Connectivity to the cloud is never perfect. Edge devices/nodes can deal with temporary outages by storing or processing data locally.
• Scalability: While cloud infrastructure is built for scalability, data still needs to be sent to datacentres and stored centrally. Edge computing complements this by scaling in a distributed manner.
• Interoperability: Edge nodes can act as intermediaries to interface legacy and modern machines.
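The data categorization mentioned under Cost Savings might look like this in code. The `triage` function and its rules are invented for illustration; a real deployment would apply application-specific policies.

```python
# Sketch of edge-side data triage: each sensor reading is either
# discarded after local processing, kept locally, or forwarded to the
# cloud. The category rules here are invented for illustration.

def triage(reading):
    """Decide where a sensor reading goes: 'discard', 'local', or 'cloud'."""
    if reading["value"] < reading["noise_floor"]:
        return "discard"   # nothing interesting; process and drop
    if not reading["anomalous"]:
        return "local"     # routine data stays on the edge node
    return "cloud"         # anomalies go upstream for deeper analysis

readings = [
    {"value": 0.1, "noise_floor": 0.5, "anomalous": False},
    {"value": 2.0, "noise_floor": 0.5, "anomalous": False},
    {"value": 9.0, "noise_floor": 0.5, "anomalous": True},
]
print([triage(r) for r in readings])   # ['discard', 'local', 'cloud']
```

Only the last reading incurs connectivity and cloud-storage cost; the rest never leave the edge.
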

• Which are the technologies that enable computing at the edge of networks?

Cloud computing itself is an enabler for edge computing. Edge computing doesn't replace cloud computing. They complement each other. Based on application requirements, engineers must decide what data is best processed at the edge and what should go into the cloud.

Fog computing extends the cloud while also being closer to edge devices. With fog computing, we place compute and storage in the most logical and efficient location between the cloud and the origin of data. As part of the fog computing infrastructure, there are cloudlets and micro datacentres, which are simply edge servers clustered together to serve local storage or compute requirements. They are suited for resource intensive or interactive applications.

Multi-Access Edge Computing (MEC) is another enabler. In cellular networks such as 4G/5G, Radio Access Network (RAN) refers to the part that handles radio or wireless resources and communication. MEC places compute and storage resources within the RAN to improve network efficiency and content delivery.

More generally, the dropping cost of electronics has brought more compute power to smaller devices. Tools for real-time analytics, particularly on embedded devices, have also made edge computing possible.

• How does edge computing differ from fog computing?

Sensors, IoT devices and even IoT gateways are edge devices. On the other hand, edge nodes belong to the fog network that interfaces edge devices to the cloud. Therefore, fog computing depends on edge computing but the reverse is not true.

For example, consider a train fitted with sensors. A sensor giving engine status is processed locally on the train. This is edge computing. Suppose data from sensors attached to wheels and brakes need to be aggregated over time and processed. These can be sent to an edge node (fog computing) without saving that data in a centralized cloud.
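The train example can be sketched as follows. The `process_on_train` function and `FogNode` class are hypothetical names: the edge decision is immediate and local, while the fog node aggregates readings over time and would forward only the summary, not the raw stream, to the cloud.

```python
# Sketch of the train example: engine status is handled on the train
# (edge), while wheel and brake readings are aggregated at a nearby
# node (fog) before anything reaches the cloud. All names are invented.

def process_on_train(engine_temp, limit=90):
    """Edge computing: an immediate local decision, no network involved."""
    return "alert" if engine_temp > limit else "ok"

class FogNode:
    """Fog computing: aggregates many edge readings over time."""
    def __init__(self):
        self.readings = []

    def ingest(self, value):
        self.readings.append(value)

    def summary(self):
        # Only this aggregate, not the raw stream, would go to the cloud.
        return {"count": len(self.readings),
                "mean": sum(self.readings) / len(self.readings)}

print(process_on_train(95))            # alert, decided locally on the train
node = FogNode()
for wear in [0.2, 0.4, 0.6]:           # wheel-wear readings from sensors
    node.ingest(wear)
print(node.summary()["count"])         # 3 readings aggregated before upload
```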

There's no universal definition of fog computing. Some regard edge devices, edge nodes and even localized datacentres as belonging to the edge network, preferring not to use the terms fog computing or fog network.

Others regard edge computing as any processing that happens close to the origin of data, while any processing that happens on devices connected to the LAN, or on LAN hardware itself, is seen as fog computing. Investing in fog computing makes sense if data has to be aggregated from many edge devices.

• What hardware is available to implement edge computing?

Important considerations for edge computing hardware include processing power, power source, memory, wireless connectivity, variety of ports/interfaces, reliability and ruggedness.

For the edge, RISC processors such as ARM, ARC, Tensilica, and MIPS are preferred over CISC. While ARM Cortex is suitable, ARM also offers Neoverse specifically for edge use cases. ARM's Cortex-M55 processor and Ethos-U55 NPU target AI at the edge.

NVIDIA's Jetson modules are designed for the edge. For example, Jetson Nano includes a 128-core GPU. Jetson TX2 and Jetson Xavier target industrial and robotic use cases. There's also the NVIDIA EGX Platform that offers GPU edge servers.

Intel has the Movidius Myriad 2 VPU, which is also part of Intel's Neural Compute Stick (NCS) that draws power from the host device via USB.

Mainflux Labs offers MFX-1 IoT Edge Gateway. Huawei's Atlas AI Computing Platform of AI accelerators, edge stations and servers is based on its Ascend AI processors. Scale Computing offers HC3 Edge and HE500 systems. APC's EcoStruxure Micro Data Centre promises physical security, standardized deployment, and remote cloud-based monitoring. There are many more focusing on micro datacentres.

• How are cloud providers enabling edge computing?

Google uses TPUs in its datacentres. In 2018, it started offering Edge TPU, an ASIC for AI inferencing at the edge. TensorFlow Lite models can run on an Edge TPU.

Microsoft's Azure Stack Edge is a cloud-managed edge appliance with compute, storage and intelligence. Processors are Intel Xeon. For machine learning workloads, Intel Arria10 FPGA is the hardware accelerator.

AWS has a number of services for the edge. AWS Outposts extends AWS infrastructure and services to any other datacentre or on-premise facility. AWS Snow Family offers small portable rugged edge devices to be deployed as close as possible to sensors collecting data. AWS Wavelength brings single-digit millisecond latencies to mobile devices and end users. It does this by bringing AWS compute and storage to telecom datacentres at the edge of 5G networks.

## Tags

• Multi-Access Edge Computing
• Computational Storage
• Cloudlet
• Streaming Analytics
• Distributed Computing
• Content Delivery Network


## Cite As

Devopedia. 2020. "Edge Computing." Version 6, July 3. Accessed 2020-08-12. https://devopedia.org/edge-computing