Executive Viewpoint: Migrating to the Intelligent Edge – Embedded Computing Design

Posted: January 25, 2020 at 2:21 pm

The Intelligent Edge is gaining momentum in the embedded development community and beyond. To get a better understanding of what that means exactly, Embedded Computing Design spent some time with Wind River's Gareth Noyes, the company's Chief Strategy Officer.

Noyes: The simple answer is that the Intelligent Edge is a place that spans from the device edge to the infrastructure edge where data is both collected and analyzed. This means that data is processed, analyzed, and acted on before it is sent to the cloud. The result is that costs are lower, network impact and latencies are minimized, and security risks are reduced. Ultimately, business is conducted in a more efficient manner.

Noyes: Companies that rely on highly centralized cloud and hybrid-cloud infrastructures realize the importance of the edge and the value it offers. Growing quantities of raw data now originate from IoT devices and sensors, and new classes of devices, from autonomous vehicles to industrial robots, require real-time access to operational data. In other words, data is being both generated and consumed at the edge, far from the centralized computing power of cloud-scale infrastructure. For all that data to make round trips to the cloud would devour too much bandwidth and take too much time for the operational needs of the edge devices. That's why edge computing is becoming so important.

Noyes: The border between things (the edge) and the cloud is not easy to cross. Software development in the two worlds requires different skillsets, and largely uses different tools, languages, and methods. Expectations diverge about how long a development project will take and what its lifecycle should be.

And cloud solutions don't transport easily to the edge. Companies run into unfamiliar and incompatible environments when they try to move processing closer to the data at the edge. The reasons for this are that edge computing resources are more limited, physical access and security present new challenges, and virtualization is not the norm. Resiliency, quality of service, and high availability are cornerstone data center requirements, but they can be far more challenging at the edge. Moreover, devices at the edge today are not standardized and interchangeable like servers in a data center. All this can make the edge a hostile environment for the cloud-native way of doing things.

Noyes: Edge computing devices that interact with real-time embedded systems can have specific requirements that are unfamiliar to cloud computing. Determinism is a good example. The deterministic model at the root of embedded systems is the expectation that a device will always complete the same task in the same way and in the same amount of time. Anything less than 100% determinism can result in catastrophic failure.

This is foreign territory to the data center, which is all about parallel processes that typically complete their tasks within the target timeframe. In the data center, you expect and accept a certain amount of jitter, a long tail of latency with outliers that miss the target. Deterministic embedded devices at the edge don't have this tolerance.
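The jitter Noyes describes is easy to observe on any general-purpose operating system. The sketch below (a hypothetical illustration, not Wind River code) runs a periodic task and records how far each wakeup deviates from its target period; on a non-real-time OS the worst cases land far above the mean, which is exactly the long tail a hard real-time system cannot tolerate.

```python
import time
import statistics

def measure_jitter(period_s=0.001, iterations=500):
    """Sleep for a fixed period repeatedly and record how far each
    wakeup deviates from the target period (the 'jitter')."""
    deviations = []
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(period_s)
        elapsed = time.perf_counter() - start
        deviations.append(abs(elapsed - period_s))
    deviations.sort()
    return {
        "mean": statistics.mean(deviations),
        "p99": deviations[int(0.99 * len(deviations))],  # 99th percentile
        "max": deviations[-1],
    }

stats = measure_jitter()
# On a general-purpose OS, "max" is typically far above "mean": the long
# tail of latency. A deterministic RTOS must bound that worst case.
print(stats)
```

A data center tunes for the mean and the 99th percentile; a deterministic embedded system is judged solely on the worst case.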

Noyes: Embedded systems have historically been about fixed-function, monolithic, cost-sensitive, compute-limited physical devices with long development cycles and long lifecycles. Embedded device builders struggle with how to accelerate development, scale production, and reimagine edge devices more broadly to be relevant and valuable to the world of cloud-scale infrastructure.

The pressure for device original equipment manufacturers (OEMs) to change comes both from above and below. Below are the hardware platforms that devices are built on. The availability and affordability of powerful new processing and storage options challenge device makers to take advantage of these capabilities. Today, it doesn't make sense to spend tremendous effort optimizing a custom, single-purpose piece of software to run on a specific processor. And with the advent of today's fast, multi-core, power-efficient CPUs, that's not necessary.

The pressure from above comes from customers who want flexible, multipurpose, interchangeable edge devices that are compatible with their cloud-like infrastructures. Cloud computing is expanding from the data center out to the edge, and it needs a place to land there. Pressure from above is also coming from the cloud developers who want to run their applications on edge devices. They don't want to have to learn new development languages or worry about the constraints of the system.

Noyes: Sure. We look at the landing zone concept as enabling the development of applications that can be deployed anywhere in the intelligent edge, irrespective of whether it's a physical (embedded) edge device or part of the virtualized cloud-scale infrastructure.

Wind River has deep and broad experience delivering solutions for the intelligent edge. We understand embedded systems that make up the intelligent edge, we have experience deploying robust cloud-scale edge infrastructure, and we know how they can become more compatible with cloud-native applications. To create a landing zone architecture, OEMs will need to build new kinds of edge devices, consisting of a few critical, yet standard, building blocks enabling them to leverage the same modern application development processes to deploy new services at the edge.

For example, new devices and systems must decouple monolithic embedded systems with layers of abstraction, and there must be containers for traditional real-time operating-system (RTOS) applications and for a new class of cloud-native edge applications. It's also a good idea to utilize Agile development practices. As embedded development moves to a foundation of DevSecOps and continuous integration/continuous delivery (CI/CD), the skills gap, and the cultural gap, will narrow between OEMs and the customers they serve.

Noyes: Wind River is the only company with a robust and comprehensive embedded software portfolio to deliver this vision. We can help bring embedded OEMs and cloud-native industries together to architect a new intelligent edge infrastructure. This would be accomplished through a variety of software and tools already in the Wind River portfolio, including the VxWorks RTOS, Wind River Linux, the Helix Virtualization Platform that allows VxWorks and Linux to run concurrently with or without containers, Wind River Cloud Platform, Wind River's family of products for cloud-scale infrastructure, and Simics for system simulation.
