Trending Tech to Watch: Knative Kubernetes-Based Platform

There are a number of reasons to turn to Knative to manage serverless workloads on Kubernetes.

You know Kubernetes can orchestrate containerized applications. But did you know it can also manage serverless applications via Knative?

Well, it can. And there are a number of reasons – including a simpler deployment and development experience – why you may want to use Knative or a similar framework to deploy your applications as serverless workloads on Kubernetes.

What Is the Knative Kubernetes-Based Platform?

Knative is an open-source platform that provides support for running serverless applications on Kubernetes. In addition, Knative includes tooling that helps automate the creation and deployment of serverless applications into Kubernetes.

Knative has been in development for about three years. But it recently reached version 1.0 status, meaning its developers consider it ready for prime time.

Why Is Knative Important?

To understand why running serverless applications on Kubernetes is a big deal, you have to understand what serverless computing means and how it has traditionally been implemented.

A serverless application is one that developers or IT engineers can simply deploy into a preconfigured host environment, then let the environment determine how and when to run the application. In this way, serverless computing minimizes the effort required on the part of engineers to run an application.

Serverless computing is not new, and serverless platforms are not in short supply. Services like AWS Lambda and Google Cloud Functions have been in widespread use for the better part of a decade.

However, these and most other serverless computing platforms are tied to proprietary ecosystems. In most cases, they are part of public cloud platforms, and each cloud’s serverless solution works in a unique way. You can’t easily take serverless applications designed for AWS Lambda and run them on Google Cloud Functions, for example.

Against this backdrop, the Knative Kubernetes-based platform stands out because it offers several advantages over most other major serverless computing solutions:

  • Infrastructure agnosticism: Knative can run anywhere Kubernetes can run – which is basically anywhere, because Kubernetes supports on-premises infrastructure, hybrid clouds, private clouds and public clouds.
  • No licensing costs: Knative is open source, and there is no fee to use it. Your only costs are whatever you pay to run the underlying Kubernetes clusters.
  • Native Kubernetes integration: Knative is an extension of Kubernetes’ native tooling (hence the name Knative), rather than a stand-alone platform. That means developers don’t need to learn an entirely new set of tools and paradigms to use Knative. If they know Kubernetes, they can get started with Knative pretty easily, as the sketch below illustrates.
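
To make that last point concrete, here is a minimal sketch of a Knative Service manifest. The service name "hello" and the sample container image are illustrative placeholders, but the structure follows Knative’s serving.knative.dev/v1 API:

    # hello.yaml -- a minimal sketch of a Knative Service.
    # The name and container image are illustrative placeholders.
    apiVersion: serving.knative.dev/v1
    kind: Service
    metadata:
      name: hello
    spec:
      template:
        spec:
          containers:
            - image: gcr.io/knative-samples/helloworld-go
              env:
                - name: TARGET
                  value: "Knative"

You deploy it the same way you deploy any other Kubernetes manifest – for example, with kubectl apply -f hello.yaml – and Knative takes care of creating the underlying Deployment, Revision and routing objects.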

Beyond these advantages over other serverless computing solutions, Knative is also important because it simplifies the overall Kubernetes experience. By automating many of the application deployment, resource provisioning and infrastructure management tasks that Kubernetes admins would otherwise perform manually, Knative makes running applications on Kubernetes easier overall than it would be with conventional container deployments.

How Does Knative Work?

When you use the Knative Kubernetes-based platform, your applications are still packaged as containers, just as they would be if you deployed them to Kubernetes without Knative.

However, Knative automates several key aspects of application deployment and Kubernetes management, including:

  • Builds: build tooling that originated in Knative (and is now developed separately as the Tekton project) automatically pulls source code from repositories and builds it into container images.
  • Deployment: Knative automatically deploys your container images into Kubernetes.
  • Event-based operation: Knative automatically starts applications in response to triggers that you configure in advance (sketched below). It also handles routing and instance scaling to provide a smooth experience, even if application demand fluctuates wildly.
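
As a rough sketch of the event-based side, the manifest below defines a Knative Eventing Trigger. It assumes a broker named "default" already exists, and it routes events of one made-up CloudEvents type to the illustrative "hello" Service from the earlier example:

    # trigger.yaml -- a sketch of event-driven routing with Knative Eventing.
    # The broker name, event type and subscriber are illustrative placeholders.
    apiVersion: eventing.knative.dev/v1
    kind: Trigger
    metadata:
      name: order-created
    spec:
      broker: default
      filter:
        attributes:
          type: com.example.order.created   # deliver only events of this type
      subscriber:
        ref:
          apiVersion: serving.knative.dev/v1
          kind: Service
          name: hello                       # the Knative Service that handles the events

Scaling is configured declaratively as well: annotations such as autoscaling.knative.dev/minScale and autoscaling.knative.dev/maxScale on a Service’s revision template set lower and upper bounds on instance counts, and a minScale of "0" lets idle services scale down to zero.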

You could argue that these features bring Kubernetes closer to being a true platform as a service (PaaS). With standard Kubernetes, engineers build and provision applications manually, or at least with tooling that is external to Kubernetes; with Knative, you can create a fully automated pipeline for deploying and running applications.

And, most impressive of all, Knative does all this using the native Kubernetes API. That means you don’t have to worry about managing one set of tools for building and deploying applications and another for orchestrating them. Everything happens in Kubernetes.
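
For example, once Knative Serving is installed, its resources answer to the same kubectl verbs as built-in Kubernetes objects (the "hello" name below refers to the illustrative Service sketched earlier):

    # Knative objects live in the Kubernetes API, so ordinary kubectl commands work on them.
    kubectl get ksvc                # list Knative Services, including their URLs and readiness
    kubectl get revisions           # list the immutable revisions Knative has created
    kubectl describe ksvc hello     # inspect the example Service from the earlier sketch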

How to Get Started with Knative

To use Knative, you need a Kubernetes environment in which Knative is installed.

Some Kubernetes distributions and services offer built-in Knative integrations. Check out the Knative documentation for a complete list. If you use one of these services, you don’t have to set up Knative yourself.

Alternatively, if you use a Kubernetes environment that doesn’t feature out-of-the-box Knative support, you can install Knative yourself by using an operator or by running a few kubectl commands.
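
As a rough sketch of the manual route, installing Knative Serving comes down to applying a couple of release manifests. The version in the URLs below is illustrative – check the Knative installation docs for the current release – and a networking layer still has to be added afterward:

    # Install the Knative Serving CRDs and core components (version shown is illustrative).
    kubectl apply -f https://github.com/knative/serving/releases/download/knative-v1.0.0/serving-crds.yaml
    kubectl apply -f https://github.com/knative/serving/releases/download/knative-v1.0.0/serving-core.yaml
    # A networking layer (such as Kourier, Istio or Contour) must also be installed
    # before Knative can route traffic to your services.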
