Edge Computing: What Is It?

Introduction

Edge computing is one of the core components of the Industrial Internet of Things (IIoT).

Large enterprises in manufacturing, transportation, oil and gas, and smart buildings are relying on edge computing platforms to implement the next generation of automation.

What Is the Edge?

Edge computing in the context of IIoT brings compute closer to the origin of data.

Without the edge computing layer, the data acquired from the assets and sensors connected to machines and devices would be sent straight to a remote data center or the public/private cloud. This may result in poor performance and higher costs due to the following factors:

  • Increased latency
  • Poor data locality
  • Increased bandwidth usage
  • Increased data consumption costs
  • High operational costs to store data with low value. Source data from the systems is essential for predictive maintenance, diagnostics, and monitoring, but it needs to be relevant. For instance, there is little value in sending a sensor’s temperature reading to the cloud every second.

Edge computing is an intermediary between the devices and the cloud or data center. It applies business logic to the data ingested from devices while providing analytics in real time. It acts as a conduit between the origin of the data and the cloud, which dramatically reduces the latency incurred by the round trip to the cloud.

Since the edge can process and filter the data that needs to be sent to the cloud, it also reduces the bandwidth cost. Finally, edge computing will help organizations with data locality and sovereignty through local processing and storage.
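To make this filtering role concrete, here is a minimal Python sketch of edge-side downsampling: raw readings are sampled every second, but only a compact per-minute summary (or an immediate alert) is forwarded to the cloud. The window size, threshold, and the read_sensor/send_to_cloud callbacks are illustrative assumptions, not part of any specific platform.

```python
import time
from statistics import mean

# Hypothetical edge-side filter: collect raw readings locally and only
# forward a compact per-minute summary (or an immediate alert) to the cloud.
WINDOW_SECONDS = 60        # aggregation window (assumed value)
ALERT_THRESHOLD_C = 85.0   # forward immediately above this temperature (assumed value)

def summarize(readings):
    """Reduce a window of raw readings to a small summary payload."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": round(mean(readings), 2),
    }

def run_edge_filter(read_sensor, send_to_cloud):
    """read_sensor() returns a float; send_to_cloud(dict) uplinks a payload."""
    window, window_start = [], time.time()
    while True:
        value = read_sensor()
        window.append(value)
        if value >= ALERT_THRESHOLD_C:
            # Anomalies are worth the bandwidth; forward them right away.
            send_to_cloud({"type": "alert", "value": value})
        if time.time() - window_start >= WINDOW_SECONDS:
            send_to_cloud({"type": "summary", **summarize(window)})
            window, window_start = [], time.time()
        time.sleep(1)  # sample the sensor every second, but only locally
```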

Building Blocks of the Edge

Companies considering deploying an edge computing platform must evaluate if the platform supports the essential capabilities to deliver the benefits promised by the edge.

Processor and OS

Edge computing typically runs on resource-constrained devices.

Compared to desktops and servers, edge devices usually have less computing power. Their smaller form factor makes it possible to deploy them in space-constrained environments at low cost. Since most System on Chip (SoC) devices run on ARM processors, an edge computing platform should support both x86 and ARM architectures.

Edge computing software should be available for Microsoft Windows, Linux, and Android, making it easy for customers to target existing environments running a mainstream OS. In general, a platform-independent programming language backed by a multi-platform framework should be chosen for this purpose. Python, Qt, Java, .NET, and Go are possible options.
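As a small illustration of this portability concern, the sketch below uses only Python’s standard library to detect the host OS and CPU architecture so that a hypothetical deployment script could choose the matching build of an edge agent. The artifact names are made up for the example.

```python
import platform

# Map the reported machine architecture to the build artifact we would ship.
# The artifact names here are purely illustrative.
ARCH_BUILDS = {
    "x86_64": "edge-agent-amd64",
    "AMD64": "edge-agent-amd64",     # Windows reports AMD64
    "aarch64": "edge-agent-arm64",
    "armv7l": "edge-agent-armv7",
}

def select_build():
    machine = platform.machine()
    system = platform.system()       # 'Linux', 'Windows', 'Darwin', ...
    build = ARCH_BUILDS.get(machine)
    if build is None:
        raise RuntimeError(f"Unsupported architecture: {machine}")
    return f"{build}-{system.lower()}"

if __name__ == "__main__":
    # e.g. 'edge-agent-arm64-linux' on a 64-bit ARM single-board computer
    print(select_build())
```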

AI Accelerator & Inferencing Engine

The edge is becoming the destination for running machine learning and deep learning in an offline mode.

When compared to an enterprise data center or the public cloud infrastructure, edge computing has limited resources and computing power. When deep learning models are deployed at the edge, they don’t get the same horsepower as the public cloud, which may slow down inferencing, a process where trained models are used for classification and prediction.

To bridge the gap between the data center and the edge, chip manufacturers are building niche, purpose-built accelerators that significantly speed up model inferencing. These modern processors assist the CPU of the edge device by taking over the complex mathematical calculations needed to run deep learning models.

While these chips are not comparable to the GPUs running in the cloud, they do accelerate the inferencing process. This results in faster prediction, detection, and classification of the data ingested into the edge layer.
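As a sketch of what inferencing at the edge can look like, the example below uses ONNX Runtime, one possible inferencing engine, to score a window of sensor data on the CPU, with the option of swapping in an accelerator-specific execution provider where the hardware offers one. The model path, input shape, and model purpose are placeholders.

```python
import numpy as np
import onnxruntime as ort  # one possible edge inferencing engine

# Load a pre-trained model exported to ONNX. The path is a placeholder.
session = ort.InferenceSession(
    "models/anomaly_detector.onnx",
    providers=["CPUExecutionProvider"],  # swap in an accelerator provider if available
)

input_name = session.get_inputs()[0].name

def classify(window: np.ndarray) -> np.ndarray:
    """Run inference on a batch of sensor readings (shape is model-specific)."""
    outputs = session.run(None, {input_name: window.astype(np.float32)})
    return outputs[0]

# Example: score a single window of 64 readings (placeholder shape).
scores = classify(np.random.rand(1, 64))
print(scores)
```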

Message Bus

A message bus acts as the conduit to access the data points and events ingested into the system. The message bus integrates with messaging protocols and brokers such as MQTT, OPC-UA, message-oriented middleware, and cloud-based messaging services.

The message bus acts as a single source of truth for all the components of the edge computing platform. Telemetry data enters the message bus through the message broker and is then processed by the complex event processing (CEP) engine.

The output from the CEP engine, machine learning inferencing, and other data processing elements is sent back to the same message bus.

The message bus acts as a pipeline with multiple stages where each stage is responsible for reading, processing, and updating the data. Each component is typically associated with a topic to isolate and partition the data.
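The sketch below illustrates one such pipeline stage with the paho-mqtt client (version 2.x): it subscribes to an assumed raw-telemetry topic, enriches each message, and republishes the result to a downstream topic on the same bus. The broker address and topic names are assumptions for the example.

```python
import json
import paho.mqtt.client as mqtt

RAW_TOPIC = "edge/telemetry/raw"              # assumed topic for ingested sensor data
PROCESSED_TOPIC = "edge/telemetry/processed"  # assumed topic for this stage's output

def on_connect(client, userdata, flags, reason_code, properties):
    client.subscribe(RAW_TOPIC)

def on_message(client, userdata, msg):
    # One pipeline stage: read raw telemetry, enrich it, publish downstream.
    reading = json.loads(msg.payload)
    reading["fahrenheit"] = reading["celsius"] * 9 / 5 + 32
    client.publish(PROCESSED_TOPIC, json.dumps(reading))

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)  # local broker on the edge device (assumption)
client.loop_forever()
```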

Device Management

Since the edge acts as the gateway to multiple devices, it has to track, monitor, and manage connected device fleets. Through the edge, we need to ensure that the IoT devices work properly and securely after they have been configured and deployed. The edge has to provide secure access to the connected devices, monitor health, detect and remotely troubleshoot problems, and even manage software and firmware updates.

The edge maintains a registry of devices to manage their lifecycle. From onboarding to decommissioning, each device is managed through the edge platform’s device management service.

If the customer has existing cloud-based IoT deployments, the device management component of the edge has to integrate with it seamlessly.
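A device registry can be as simple as the in-memory sketch below, which tracks a minimal set of lifecycle states from onboarding to decommissioning. The states and fields are illustrative; a real platform would persist the registry and synchronize it with any cloud-based IoT service.

```python
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

# Lifecycle states for a managed device (an illustrative, simplified set).
class DeviceState(Enum):
    PROVISIONED = "provisioned"
    ACTIVE = "active"
    MAINTENANCE = "maintenance"
    DECOMMISSIONED = "decommissioned"

@dataclass
class Device:
    device_id: str
    firmware_version: str
    state: DeviceState = DeviceState.PROVISIONED
    last_seen: datetime | None = None

class DeviceRegistry:
    """Minimal in-memory registry; a real platform would persist this."""

    def __init__(self):
        self._devices: dict[str, Device] = {}

    def onboard(self, device_id: str, firmware_version: str) -> Device:
        device = Device(device_id, firmware_version)
        self._devices[device_id] = device
        return device

    def heartbeat(self, device_id: str) -> None:
        self._devices[device_id].last_seen = datetime.now(timezone.utc)
        self._devices[device_id].state = DeviceState.ACTIVE

    def decommission(self, device_id: str) -> None:
        self._devices[device_id].state = DeviceState.DECOMMISSIONED

# Usage
registry = DeviceRegistry()
registry.onboard("press-line-01", firmware_version="1.4.2")
registry.heartbeat("press-line-01")
```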

Complex Event Processing (CEP) Engine

A Complex Event Processing (CEP) engine is responsible for aggregating, processing, and transforming streams of events arriving via the message bus in real time. In the context of IIoT and the edge, the CEP engine aggregates telemetry data ingested by the variety of sensors connected to the devices. It then normalizes and transforms the streaming data for advanced queries and analytics.

The CEP engine has visibility into every data point ingested into the edge computing layer. Apart from aggregation and processing, it can even invoke custom functions or perform inferencing based on the registered machine learning models.

Complex event processing is mainly used to address low-latency requirements: the engine is usually expected to respond within a few milliseconds of an event’s arrival.
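The toy rule below captures the spirit of a CEP query: it keeps a sliding window over a telemetry stream and emits a derived event when the rolling average crosses a threshold. The window size and threshold are illustrative values, not platform defaults.

```python
from collections import deque
from statistics import mean

class SlidingWindowRule:
    """Toy CEP-style rule: raise a derived event when the rolling average
    of a telemetry stream crosses a threshold."""

    def __init__(self, window_size=30, threshold=75.0):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold

    def on_event(self, value: float):
        """Feed one event; return a derived 'complex' event or None."""
        self.window.append(value)
        if len(self.window) == self.window.maxlen and mean(self.window) > self.threshold:
            return {"event": "sustained_high_temperature",
                    "avg": round(mean(self.window), 2)}
        return None

# Usage: feed events as they arrive on the message bus.
rule = SlidingWindowRule(window_size=3, threshold=75.0)
for reading in [70, 74, 78, 80, 82]:
    derived = rule.on_event(reading)
    if derived:
        print(derived)
```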

Secured Access

An edge computing layer should integrate with existing LDAP and IAM systems to provide role-based access control (RBAC). Each IT/OT persona should be associated with a well-defined role designated to perform a specific set of actions. For example, an application developer role shouldn’t have permission to perform firmware upgrades.
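A minimal sketch of such a check is shown below; the roles and permissions are illustrative, and in practice they would come from the organization’s LDAP/IAM system rather than being hard-coded.

```python
from enum import Enum, auto

class Permission(Enum):
    VIEW_TELEMETRY = auto()
    DEPLOY_APPLICATION = auto()
    UPGRADE_FIRMWARE = auto()

# Illustrative role definitions; a real deployment would resolve these
# against the organization's LDAP/IAM system instead of hard-coding them.
ROLE_PERMISSIONS = {
    "application_developer": {Permission.VIEW_TELEMETRY, Permission.DEPLOY_APPLICATION},
    "ot_administrator": {Permission.VIEW_TELEMETRY, Permission.UPGRADE_FIRMWARE},
}

def is_allowed(role: str, permission: Permission) -> bool:
    return permission in ROLE_PERMISSIONS.get(role, set())

# An application developer cannot trigger firmware upgrades.
assert not is_allowed("application_developer", Permission.UPGRADE_FIRMWARE)
assert is_allowed("ot_administrator", Permission.UPGRADE_FIRMWARE)
```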

Extensibility

For the devices deployed in an IIoT environment, an edge computing platform is a gateway and hub that needs to be integrated with different resources from the data center and public cloud. From material resource planning software to the corporate directory services to a message broker to the data lake, an edge computing platform has to integrate with a diverse set of services and applications.

Interoperability and extensibility become the key for enterprises to realize the return on investment from edge computing. An edge computing platform should be programmable, allowing developers to build highly customized solutions.
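One common way to make a platform programmable is a small connector interface that custom integrations implement, as in the hypothetical sketch below; the Connector interface and DataLakeConnector class are assumptions for illustration, not part of any specific product.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Illustrative extension point: any integration (data lake, ERP system,
    message broker, ...) implements the same small interface."""

    @abstractmethod
    def publish(self, payload: dict) -> None: ...

class DataLakeConnector(Connector):
    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # placeholder endpoint

    def publish(self, payload: dict) -> None:
        # A real connector would batch and upload to object storage.
        print(f"uploading {payload} to {self.endpoint}")

# The platform keeps a registry of connectors and fans out processed events.
connectors = [DataLakeConnector("s3://plant-data-lake/raw")]

def forward(event: dict) -> None:
    for connector in connectors:
        connector.publish(event)

forward({"device": "press-line-01", "avg_temp": 78.4})
```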

Management

A typical edge computing environment may have millions of sensors and actuators connected to tens of thousands of edge devices. Managing the configuration, security, applications, machine learning models, and CEP queries can become extremely challenging.

A scalable edge computing environment demands a robust, reliable management platform that acts as a single pane of glass to control the edge devices. It becomes the unified control plane to onboard users, register edge devices, and integrate cloud services and existing message brokers. It also provides a friendly user interface for the workflows and tasks involved in operating the environment.
