Edge computing architecture has become central to a variety of computing tasks. Edge computing provides lower latency and enables data-rich experiences, but interoperability and standardization remain key issues. To date, devices and apps at the edge rely on a variety of protocols and interfaces, which has perpetuated a Wild West at the edge.
Edge architecture, which brings compute resources closer to the data and devices that need them, has been touted as a key paradigm beyond cloud computing. Some digital experiences require the lower latency that edge architecture can provide. At the same time, edge architecture lacks standards and common interfaces, which creates problems for devices and applications that need to interact with one another.
Efforts are under way to develop standards, interfaces, code and components that can realize the potential of edge computing. Unfortunately, for those looking for ready-made ways to integrate Internet of Things (IoT) devices and edge applications, it’s not clear what will work together. As a result, edge computing frameworks are plentiful—but none is yet the market leader.
Many vendors are eager to write the rules of the road for edge, which is touted as the next big phase of computing. In just a few short years, the edge market has grown to be worth billions of dollars (estimates for 2024 range from $9 billion to more than $28 billion).
Further, McKinsey & Co. has identified 107 “concrete use cases” to validate its estimate that edge hardware value could reach $215 billion by 2025. Many of the anticipated applications can’t rely on the compute resources in the cloud or corporate data centers due to latency issues.
Various groups have sponsored open edge projects to standardize key aspects of the technology and provide frameworks that will simplify edge computing integration efforts. Some even offer ready-to-use software that vendors can incorporate into their products and services. Many of these projects reflect years of development work, but to corporate planners they can look like a confusing and perhaps conflicting set of frameworks.
“These efforts have some overlap, but [there are] also areas where they are complementary,” said Jim Davis, founder and principal analyst with research and advisory firm Edge Research Group. “Enterprises still either need to bring these components together or rely on vendors and integrators to help them.”
Edge open source projects embrace multiple types of implementations. These frameworks include (1) the fog computing concept of bringing cloud resources to the edge across a decentralized computing infrastructure of heterogeneous nodes; (2) multi-access edge computing (MEC), formerly mobile edge computing, which incorporates wireless LTE and next-generation radio access networks into edge architectures; and (3) cloudlet computing, which enables a “data center in a box” to provide resource-intensive computing capabilities at the edge, including at mobile telecom central offices.
“Our goal was to unify open source edge frameworks across these markets—IoT, telco, cloud and enterprise edge—both at the infrastructure and at the application level,” said Arpit Joshipura, general manager of LF Edge, the Linux Foundation’s umbrella organization for open source edge projects. “You don’t want fragmentation in a market this large,” he added. Three of the key LF Edge projects are the following:
- EdgeX Foundry. Seeded initially with source code from Dell, this project provides a collection of loosely coupled microservices that communicate through a common application programming interface (API), allowing any of them to be augmented or replaced by custom implementations. Following three earlier releases in the project’s nearly three-year history, the “commercial-ready” EdgeX V1.0 (Edinburgh) software was made available in mid-2019 as a stable API baseline for developing edge applications. LF Edge says several vendors have already shipped solutions based on or designed around EdgeX code.
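To make the microservice model above concrete, consider how a device service might hand a sensor reading to EdgeX’s core-data service over its REST API. The sketch below builds such an event payload in Python; the device name and readings are invented for illustration, and the port and endpoint path follow EdgeX v1 defaults as commonly documented—check them against the EdgeX API reference for your release before relying on them.

```python
import json

# Hypothetical event from a temperature/humidity sensor, shaped like
# an EdgeX v1 core-data event: a registered device name plus readings.
event = {
    "device": "room-thermostat-01",   # example device name, not a real registration
    "readings": [
        {"name": "temperature", "value": "21.5"},
        {"name": "humidity", "value": "40"},
    ],
}

payload = json.dumps(event)

# EdgeX v1 core-data conventionally listens on port 48080; a device
# service would POST the event here (no request is sent in this sketch):
url = "http://localhost:48080/api/v1/event"

print(url)
print(payload)
```

Because every service speaks the same API, a custom device service producing this payload could replace the stock one without disturbing the rest of the stack—which is the point of EdgeX’s loose coupling.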
- Project EVE. The open source Edge Virtualization Engine provides a cloud-native edge computing platform. Announced in early 2019 with seed code contributed by ZEDEDA, it leverages a hypervisor for deployment on bare-metal servers and provides system and orchestration services and a container runtime. It’s designed to host any operating system that is deployable in a virtual machine; host apps in virtual machines and containers; provide scalable centralized management of many devices over large distances; and provide built-in mesh networking capabilities and built-in cloud networking using standard VPN technologies available in public clouds.
- Akraino Edge Stack. Launched in February 2018 with initial code contributed by AT&T, this cloud software stack is optimized for edge computing systems and applications, with support for application and network provisioning and orchestration. Release 1 arrived in mid-2019 and includes 10 “ready and proven blueprints,” among them Radio Edge Cloud, multiple network and telco cloud models, and a Kubernetes-native infrastructure.
One of the Akraino blueprints represents a move to converge LF Edge efforts with the OpenStack Foundation Edge Computing Group’s StarlingX, an edge computing and IoT distributed cloud platform optimized for low-latency, high-performance applications. StarlingX began as a proprietary Wind River product, with code contributed by Intel; it was brought under the auspices of the OpenStack Foundation in early 2018 and is now available in a 3.0 release.
StarlingX provides an OpenStack base layer with compute, storage and networking capabilities, along with configuration and other management functions. “StarlingX started out by taking the components of OpenStack and scaling them down instead of scaling them up,” said Jonathan Bryce, executive director of the OpenStack Foundation. Another edge-related project supported by OpenStack is Airship, which was initiated by AT&T and Dell to integrate OpenStack with the Kubernetes container orchestrator for edge data centers and telco edge central offices.
The Eclipse Foundation is another open edge organization; it recently formed the Edge Native Working Group to focus on the near-term creation of an end-to-end software stack. The group encompasses two existing projects with production-quality code available: Eclipse ioFog, based on contributions from Edgeworx, for running microservices across a distributed edge network; and Eclipse fog05, based on code developed by ADLINK, which provides a decentralized infrastructure for fog and MEC implementations.
Another, longer-running Eclipse project is Kura, now available in a 4.1.0 version. It is an extensible open source IoT framework that provides API access to the hardware interfaces of IoT gateways.
There’s plenty of overlap among all these efforts. Kilton Hopkins, Edgeworx CEO and Eclipse ioFog project lead, said the group plans to roll out reference architectures and working examples that will illustrate how implementers can fit various open edge components together into a full, ready-to-use solution. “That might include ioFog, EdgeX and maybe some of the things that are in the Akraino stack,” Hopkins explained. “The edge industry is now mature enough that we’re able to do that.”
Or, perhaps some of these efforts will fall by the wayside or be subsumed by other efforts. Meanwhile, there are numerous similar or overlapping projects such as CORD (Central Office re-architected as a Datacenter) and Virtual Central Office (vCO), and KubeEdge for extending containerized application orchestration to hosts at the edge, among others.
“With edge, you’re going to need to have different approaches, but right now there’s just too many,” said Monica Paolini, founder and president of analyst and consulting firm Senza Fili.
Then there are the two elephants in the room: Amazon Web Services (AWS) and Microsoft Azure. AWS IoT Greengrass extends AWS to edge devices, and Microsoft is pushing Azure IoT Edge as a managed service for deploying cloud workloads on edge devices. Each has a big stake in ensuring that edge environments don’t siphon off too much of the compute and storage business that drives their cloud revenue streams.
While edge frameworks remain a fragmented field, history suggests that the market will coalesce around a few. It may take some time for standardization to happen, but this foundation will enable edge architecture to do what it’s there to do: make compute experiences faster, more integrated and more data-intensive no matter the location.