Blogs

Edge Computing, Simplified!

Rajesh Dangi /September, 2021

If you are not on the Edge, you are wasting space!

A little twist on the original quote by the famous Everest climber Jim Whittaker, and rather apt when it comes to distributed computing reaching the edge.

Today’s cloud computing strategy spans beyond hybrid clouds and is even moving past hypervisors, preferring bare-metal deployments with reusable resources delivered via serverless or container-led microservices running on the edge cloud. The monolithic trend is breaking up at a faster pace, augmenting near-shore points of presence through edge enablement and shifting static workloads toward real-time edge processing workloads.

If we look at content contribution and consumption over the internet by mobile users, video streams, wearables, and even smart devices, most of it sits at the perimeter. These applications produce, process, and access time-sensitive, real-time data based on events and user interactions, pointing to the need to reinvent deployment architectures in a much more distributed fashion. The expectation, therefore, is to span edge, core, and cloud seamlessly and bring ‘things’ into proximity with ‘beings’, so to say. The Forrester Analytics Global Business Technographics Mobility Survey in 2019 highlighted this, stating that 57 percent of mobility decision-makers had edge computing on their roadmap for 2020 onwards.

Gartner predicts that by 2025, three-quarters of enterprise-generated data will be created and processed at the edge – outside a traditional centralized data center (read, the core) or the cloud – saying that the edge completes the cloud. Edge computing and the cloud have a symbiotic relationship: they address different problem sets and workload characteristics, yet they complement each other.

The Evolution of Edge Computing

The basic concept of edge computing can be traced back to the 90s, when Akamai launched its content delivery network (CDN). The idea back then was to introduce nodes at locations geographically closer to the end-user for the delivery of cached content such as images and videos. The edge computing concept is thus not new; it has just become better aligned to the principles of decentralized computing, with much more focus on workloads, data flows between components, and manageability from a single pane of glass for effective control.
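To make that CDN-style pattern concrete, here is a minimal, hypothetical Python sketch: an edge node serves content from a local cache and reaches back to a distant origin only on a miss. The origin callback, the TTL, and the content names are illustrative assumptions, not the design of any particular CDN.

```python
import time

class EdgeCacheNode:
    """Toy edge node that serves cached content and falls back to a distant origin.

    The origin fetch and TTL are illustrative assumptions; real CDNs add
    invalidation, tiered caches, and request routing across many nodes.
    """

    def __init__(self, fetch_from_origin, ttl_seconds=300):
        self._fetch_from_origin = fetch_from_origin  # slow, far-away call
        self._ttl = ttl_seconds
        self._cache = {}  # key -> (expires_at, content)

    def get(self, key):
        entry = self._cache.get(key)
        if entry and entry[0] > time.time():
            return entry[1]                      # served locally, low latency
        content = self._fetch_from_origin(key)   # cache miss: go back to the core
        self._cache[key] = (time.time() + self._ttl, content)
        return content


# Usage: the first request pays the origin round-trip, later ones are served at the edge.
node = EdgeCacheNode(fetch_from_origin=lambda k: f"<video bytes for {k}>")
node.get("promo.mp4")  # miss -> origin
node.get("promo.mp4")  # hit  -> edge cache
```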

Although use cases are evolving with technological advancements, we often hear edge computing associated with IoT, AI/ML, data ingestion, and the like, and with paradigms such as sensors, machine-to-machine communication, and distributed application deployments like containers, serverless functions, and network function virtualization, to name a few. The field is so vast and widespread that multiple standards validate it, depending on the approaches one takes to design, deploy, and consume edge computing.

With the early emergence of the decentralized computing paradigm, our understanding of edge computing has been evolving through various standards and consortiums, some of which are listed below.

  • IEC 61131: Standard for programmable controllers
  • IEC 61499: Standard for distributed programmable controllers
  • IEC 62541: OPC Unified Architecture (OPC UA) as a communication middleware for industrial automation
  • IEEE 1934: Standard for Adoption of OpenFog Reference Architecture for Fog Computing
  • IEEE 802.1: Time-Sensitive Networking (TSN) Task Group related standards

The changes we observe are two sides of the same coin: one represents technological advancement, the other business and workflow transformation. On the technology side we see the emergence of fully digitized manufacturing environments, a steep increase in actuator and sensor data and connected technical components, and the miniaturization of factories driven by growing computing power in edge devices. On the business side, edge computing fosters accelerated innovation cycles, increased decentralization, and the flattening of process hierarchies in support of the longer-term goals of simplifying and sustaining complex business processes.

To cite an example from the edge ecosystem, consortiums like the EECC are making headway on defining a Reference Architecture Model for Edge Computing (EECC RAMEC) for the emerging edge computing market in smart manufacturing and other Industrial IoT domains. The goal is to define and develop the specification of a reference architecture model and technology stacks (EECC Edge Nodes), and further to identify gaps and recommend best practices by evaluating approaches, aligning with related initiatives and standardization organizations, and collaborating with partners engaged in the defined digitalization verticals of the EECC RAMEC cube.

What gives an edge to Edge Computing?

The tenets behind edge computing’s advantages are:

  • Faster Response Times – The closer the workload is to the consumer, the better the experience and response time. Edge locations near internet exchanges provide significant advantages; this is the principle CDN providers follow when setting up their edge PoPs, i.e. points of presence.
  • Improved Availability – The edge ecosystem is resilient by design: it auto-heals failures by respawning microservice-based workloads, which reduces complexity in operations and manageability; once set up, edge workloads keep working. An impact to one edge location can also be offset by automated redirection to another edge site running nearby (see the sketch after this list).
  • Interoperability & Scalability – Edge workloads are designed and developed on distributed computing principles, and each application module can scale and run independently across multiple locations, so scalability and interoperability between edge and core are well taken care of. The portability of application containers is orchestrated with ease within and beyond the edge location.
  • Security – Significant risk reduction comes from the ability to patch vulnerabilities across all edge locations, with constant iteration beyond the initial setup: automated rollout of updated container images from a central repository, whitelisting of network access via secured VPN tunnels, and encryption of data with public and private keys are some of the measures that make edge computing secure to operate effectively. Another significant factor is the runtime model of the microservices, which ensures that containers are up and running on demand and remain offline otherwise.
  • Lower Cost of Ownership – The entire edge stack is about reusing resources and optimizing the underlying infrastructure and services. Compute, memory, storage, and network resources are all provisioned on the fly and metered only upon utilization, so the pool is universal yet available on demand to all subscribers hosted on the edge. This provides ample motivation for edge adoption.
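The availability point above can be illustrated with a small, hypothetical Python sketch of automated redirection: requests are tried against edge sites in order of proximity, and a failed site is simply skipped. The site names and the ConnectionError-based health signal are assumptions made for illustration only.

```python
class EdgeSite:
    """Hypothetical edge location that may be temporarily unreachable."""

    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def handle(self, request):
        if not self.healthy:
            raise ConnectionError(f"{self.name} unreachable")
        return f"{request} served by {self.name}"


def route(request, sites):
    """Try edge sites in order of proximity; fail over to the next one."""
    for site in sites:
        try:
            return site.handle(request)
        except ConnectionError:
            continue  # automated redirection to the next nearby edge site
    raise RuntimeError("no edge site available")


# Usage: the nearest site is down, so the request lands on the next one.
sites = [EdgeSite("edge-mumbai", healthy=False), EdgeSite("edge-pune")]
print(route("GET /stream/42", sites))  # -> "GET /stream/42 served by edge-pune"
```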

Edge Computing Value Addition

Edge computing is evolving and making breakthroughs by banking on emerging technologies, and the surge in data volume from the massive number of devices enabled by radio and wireless networks has made it more important than ever before. Besides its ability to reduce network latency and improve real-time user experience, edge computing now plays a critical role in enabling use cases for ultra-reliable low-latency communication in industrial manufacturing, telecom, and a variety of other sectors.

The key benefit of edge computing, like the cloud, remains the ability to move workloads off devices into infrastructure where resources are less expensive and it is easier to process and store large volumes of data. At the same time, it optimizes latency and reliability between applications and users, helping achieve significant savings in network resources by relocating certain application components to the edge, close to the user devices. To efficiently meet application and service needs for low latency, reliability, and isolation, edge clouds are typically located at the boundary of access networks, or on-premises for local deployments. Edge computing in telecom, often referred to as Mobile Edge Computing (MEC) or Multi-Access Edge Computing, provides execution resources (compute and storage) for applications close to the end-users, typically within or at the boundary of operator networks, expanding the perimeter of applications that previously resided at the core.
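One way to picture the placement decision described above is a small Python sketch that probes candidate execution sites and keeps a component at the closest site that still meets its latency budget. The site names, the baseline round-trip times, and the probe itself are made-up assumptions standing in for real measurements, not any MEC API.

```python
import random

def measured_rtt_ms(site):
    """Stand-in for a real probe (e.g. a ping or HTTP health check); values are made up."""
    baseline = {"on-prem-edge": 3, "metro-edge": 12, "regional-cloud": 45}
    return baseline[site] + random.uniform(0, 2)

def place_component(latency_budget_ms, candidates):
    """Pick the closest execution site that still meets the latency budget."""
    probes = sorted((measured_rtt_ms(s), s) for s in candidates)
    for rtt, site in probes:
        if rtt <= latency_budget_ms:
            return site, rtt
    return None, None  # nothing meets the budget; keep the component on-device

# Usage: a 20 ms budget is satisfied by the on-premises edge, not the regional cloud.
site, rtt = place_component(20, ["on-prem-edge", "metro-edge", "regional-cloud"])
print(site, round(rtt, 1))  # e.g. on-prem-edge 3.4
```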

The key drivers that help us visualize the edge computing ecosystem are application design trends, life-cycle management, and platform capabilities, with further expectations on their management and orchestration. Kubernetes has become the platform of choice for container-based, cloud-native applications that can easily be stretched to the edge without any additional redevelopment or redeployment effort, and it is now the first choice in the telecom industry as well as for general-purpose edge services. Most telecom and non-telecom edge workloads delegate some life-cycle management functionality to Kubernetes, reducing complexity in their management systems and improving the success criteria for edge deployments. The expectation is that any edge computing solution must be able to manage many distributed edge sites, each with its own needs based on local usage patterns and desired user experience.
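As a minimal sketch of what delegating life-cycle management to Kubernetes looks like, the Python snippet below builds a standard Deployment manifest: the replica count and liveness probe are what let Kubernetes respawn failed containers on its own, and the same manifest can be applied to every edge cluster. The workload name, image, registry, and port are purely illustrative assumptions.

```python
import json

def edge_workload_manifest(name, image, replicas=2):
    """Sketch of a Kubernetes Deployment for an edge site.

    'replicas' plus the liveness probe is what lets Kubernetes restart failed
    containers by itself; the image, port, and paths are illustrative only.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"tier": "edge"}},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 8080}],
                        # restart the container if it stops answering health checks
                        "livenessProbe": {
                            "httpGet": {"path": "/healthz", "port": 8080},
                            "periodSeconds": 10,
                        },
                    }]
                },
            },
        },
    }

# The same manifest can be rolled out unchanged to each edge cluster,
# e.g. with `kubectl apply -f -` or the official Kubernetes Python client.
print(json.dumps(edge_workload_manifest("video-analytics",
                                        "registry.example.com/va:1.4"), indent=2))
```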

On the industrial side of the edge, the first impression that comes to mind is self-driving or connected cars; the ability to put compute capability on a moving object such as a vehicle is a testimonial to the industrial edge computing strategy. Teaming up with AI/machine learning and IoT technologies, the edge is redefining how we consume it and stretching its objectives. There are umpteen use cases that can be deployed on the edge: smart factories, industrial automation, robotics, smart buildings, wearable devices, application delivery networks, and even web services for static content.

Looking forward to the Smart Edge

In summary, edge computing is becoming an ever more powerful and promising opportunity to turn AI data into real-time value across almost every industry. The intelligent edge is the next stage in the evolution and success of AI technology; the excitement comes from an expanding edge that is aware of and reflects user intent while remaining oblivious to the underlying hardware and software ecosystems needed to make it work. Adoption is easier for newer entrants in the application space, but it requires rework to re-engineer legacy application code in line with distributed computing principles and the related technology stack upgrades. The responses are encouraging, and many are testing the waters of this paradigm shift.

The advancement of cognitive AI and machine learning is providing numerous opportunities to create smart devices that are contextually aware of their environment and respond to triggers associated with real-time events rather than to manual intervention. The aspirations placed on these smart things will let us benefit from the growth in multi-sensory data, with computation driven by cognition at greater precision and performance. Edge computing will be the cutting edge for the future, isn’t it?

September 2021. Compilation from various publicly available internet sources, author’s views are personal.





***