How Kubernetes Could Underpin Edge Computing Platforms

The container orchestration engine has potential as an edge solution, but several challenges must be solved before it can become one.

By now, you’ve likely heard all about how Kubernetes can simplify the deployment of applications at scale in traditional on-prem and cloud environments. But what about edge architectures that combine central data centers with workloads hosted at edge locations that are closer to end-users? Can Kubernetes underpin an edge computing platform, too?

The answer is maybe. Although Kubernetes has potential as an edge computing solution, Kubernetes developers and users will need to overcome some hurdles before K8s is really ready for the edge.

How Kubernetes Can Help Edge Computing Platforms

Some observers believe Kubernetes is already ready for primetime as a way to host applications in edge environments. Ammar Naqvi of Canonical, for example, writes that Kubernetes is a “key ingredient in edge computing.” The Cloud Native Computing Foundation, too, promotes Kubernetes as an edge solution. It also sponsors KubeEdge, an entire open source platform devoted to Kubernetes-based edge deployments that became a CNCF incubating project last November.

These organizations are a bit biased, of course. Canonical and the CNCF are deeply invested in Kubernetes and stand to gain if Kubernetes becomes popular as a solution for building edge computing platforms.

Nonetheless, they make some good points about why Kubernetes may be a natural fit for the edge. As they write, Kubernetes provides a universal control plane that can work with any type of underlying edge infrastructure, which means it would simplify the deployment and management of workloads across diverse edge environments. Kubernetes is also great at balancing traffic and minimizing latency, which are priorities for edge workloads. And, by serving as the deployment environment for DevOps CI/CD pipelines, Kubernetes would make it easy for developers to roll out continuous updates to edge applications.
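
For example, a CI/CD pipeline can roll out a new version of an edge application simply by updating the image tag in a Deployment manifest and letting Kubernetes perform a rolling update. The sketch below is illustrative only; the application name, image and registry are hypothetical:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-api                  # hypothetical edge application
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1           # keep most replicas serving during an update
      maxSurge: 1
  selector:
    matchLabels:
      app: edge-api
  template:
    metadata:
      labels:
        app: edge-api
    spec:
      containers:
        - name: edge-api
          image: registry.example.com/edge-api:1.2.0   # the pipeline bumps this tag on each release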

Beyond vendors and developers with an explicit interest in promoting Kubernetes, there is evidence that more neutral groups also see Kubernetes as an important part of future edge computing platforms. In a report based on interviews with more than a dozen thought leaders and vendors in the cloud computing and data center industries, Omdia found that more than half already see Kubernetes being deployed for edge workloads.

“Kubernetes is set to be the technology that enables management of workloads from Cloud to the edge providing a common application management platform,” according to Omdia’s report.

In short, there’s decent reason to believe that Kubernetes will become an increasingly important part of edge computing platforms over the next several years.

Kubernetes Edge Challenges

For Kubernetes to become a truly seamless solution for managing applications deployed at the edge, however, developers need to address several challenges.

Probably the biggest is ensuring low-latency data transmission between central data centers and edge locations. In many ways, moving data quickly is the greatest challenge in edge computing; application orchestration is a secondary issue. And Kubernetes itself does not optimize data transfer. Data fabrics do, but Kubernetes offers no particular integration with them.

To ensure low-latency data movement within Kubernetes-based edge environments, then, developers building edge computing platforms will need to make it easier to deploy Kubernetes in conjunction with data fabric solutions. They may also need to improve the way that Kubernetes handles internal data movement by, for example, making it easier to tell Kubernetes which internal traffic to prioritize. Currently, Kubernetes is great at figuring out how to balance incoming traffic from external endpoints, but not necessarily so great at managing internal traffic flow when every millisecond counts.
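
The closest thing stock Kubernetes offers today is topology-aware routing, which tries to keep Service traffic within the caller's zone. The sketch below shows the relevant annotation on a hypothetical internal Service; note that this is a best-effort hint rather than real traffic prioritization, and the exact annotation name (service.kubernetes.io/topology-aware-hints in older releases, service.kubernetes.io/topology-mode in newer ones) depends on your Kubernetes version:

apiVersion: v1
kind: Service
metadata:
  name: telemetry-ingest                              # hypothetical internal service
  annotations:
    service.kubernetes.io/topology-aware-hints: auto  # prefer endpoints in the same zone when capacity allows
spec:
  selector:
    app: telemetry-ingest
  ports:
    - port: 8080
      targetPort: 8080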

Along similar lines, Kubernetes would benefit from stronger controls over workload placement. Kubernetes gives admins the ability to assign applications to individual nodes, which works well when all the nodes are running within a single data center.

But what if you have nodes spread across multiple edge locations, and likely in a central data center as well? In that context, you’d want the ability to manage which individual edge locations host which applications, and how traffic is balanced across them. That would be complicated to do in Kubernetes today unless you have just one node at each edge location, which is unlikely. What Kubernetes needs, then, is a feature that lets admins define how workloads should be placed and prioritized based on pools of nodes in different geographic locations, rather than just individual nodes that are all running in the same physical place.
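
One way to approximate this today is to label nodes by location and steer pods with node affinity. The sketch below assumes nodes have already been labeled with a topology.kubernetes.io/region value of edge-west (a hypothetical label value); it pins a workload to that pool of nodes, but it does nothing to prioritize workloads or balance traffic across pools:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: cache-frontend                  # hypothetical edge workload
spec:
  replicas: 2
  selector:
    matchLabels:
      app: cache-frontend
  template:
    metadata:
      labels:
        app: cache-frontend
    spec:
      affinity:
        nodeAffinity:
          requiredDuringSchedulingIgnoredDuringExecution:
            nodeSelectorTerms:
              - matchExpressions:
                  - key: topology.kubernetes.io/region
                    operator: In
                    values:
                      - edge-west       # only schedule onto nodes in this edge location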

Multi-cluster management remains a challenge, too. Initially, Kubernetes was more or less designed with the assumption that each organization would run one cluster, possibly split into multiple namespaces. That has changed over the years, and most Kubernetes vendors now support multi-cluster management. But they don’t do it equally well, and managing workloads across multiple clusters arguably remains a secondary consideration for Kubernetes developers.

That’s a problem for edge computing platforms, where organizations might choose to run a separate cluster in each edge location in order to isolate workloads and simplify the management of truly large-scale environments.
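
In practice, managing several clusters still largely means managing several kubeconfig contexts, one per cluster, and naming the right one on every operation. A trimmed, hypothetical kubeconfig for two edge clusters might look like the sketch below:

apiVersion: v1
kind: Config
clusters:
  - name: edge-east
    cluster:
      server: https://edge-east.example.com:6443
  - name: edge-west
    cluster:
      server: https://edge-west.example.com:6443
users:
  - name: edge-admin
    user: {}                       # credentials omitted in this sketch
contexts:
  - name: edge-east
    context:
      cluster: edge-east
      user: edge-admin
  - name: edge-west
    context:
      cluster: edge-west
      user: edge-admin
current-context: edge-east         # kubectl talks to one cluster at a time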

Conclusion

Kubernetes has a lot to offer as the foundation for edge computing platforms, but it also has some significant shortcomings in this respect. It’s no surprise that some architects see Kubernetes as a poor fit for some types of workloads that are likely to be deployed at the edge, like containerized telco applications.

On the other hand, Kubernetes’s challenges as an underpinning of an edge computing platform are eminently solvable, given enough development effort. The real question we should be asking, perhaps, is not whether Kubernetes is capable of working at the edge (for now, the answer is that it only sort of works as an edge solution), but whether developers are willing to invest the effort required to make it a full-fledged edge computing platform.
