The What, Why and How of On-Premises Serverless Computing

Although most serverless solutions run in the cloud, it is possible--and, at times, preferable--to run serverless computing on-premises for application deployment.

ITPro Today

January 1, 2019


Since the introduction of AWS Lambda in 2014, serverless computing has become the next big thing in the world of cloud-based application deployment--due especially to the cost savings that serverless can provide. Yet, here’s a fact that can be easy to forget amid all the excitement about serverless platforms: Although most serverless solutions available today run in the cloud, it’s possible to do serverless on-premises, too.

Let’s take a look at why you’d want to run serverless functions on-premises and how you can do it.

What Is Serverless?

In a nutshell, serverless computing is a type of application deployment architecture that allows developers to write code and execute it on-demand. From the perspective of developers, serverless makes it possible to deploy application code without having to set up and run a server to host the code. That saves time.
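To make that concrete, here is a minimal sketch, in Python, of what a serverless function typically looks like. The handler name and the shape of the event are illustrative only; every platform defines its own conventions.

    # A platform-agnostic sketch of a serverless function. The handler
    # name and the shape of the "event" dictionary are illustrative;
    # each serverless platform defines its own conventions.
    def handle(event):
        # The platform calls this function on demand, passing details of
        # whatever triggered it (an HTTP request, a queue message, etc.).
        name = event.get("name", "world")
        return {"status": 200, "body": f"Hello, {name}!"}

    if __name__ == "__main__":
        print(handle({"name": "ITPro"}))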

In the cloud, serverless can also save money because you typically pay only for the time during which your functions are actually running--as opposed to having to pay constantly to keep a cloud-based virtual server running, even if the server is idle.

What Does Serverless Have to Do with the Cloud?

In most of the popular serverless frameworks that have emerged to date--such as AWS Lambda, Azure Functions and Google Cloud Functions--serverless application code is uploaded to a cloud-based serverless environment and run inside that environment whenever it is triggered. That’s why you sometimes hear serverless described as functions-as-a-service, or FaaS.
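As a rough illustration, an AWS Lambda function written in Python is little more than a handler that the service invokes whenever a configured trigger fires; the exact contents of the event depend on the trigger (an API Gateway request, an S3 upload, a queue message and so on).

    import json

    # AWS Lambda calls this handler each time a configured trigger fires.
    # The "event" dictionary describes the trigger; "context" carries
    # runtime metadata supplied by the service.
    def lambda_handler(event, context):
        return {
            "statusCode": 200,
            "body": json.dumps({"received": event}),
        }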

What Is On-Premises Serverless?

Because cloud-based FaaS environments have dominated the serverless market (and conversation), many IT pros tend to see the cloud as a key ingredient for building serverless architectures. However, it's possible to run serverless functions on-demand using local, on-premises infrastructure, rather than a cloud-based service.

Doing so would not count as FaaS, per se, because you wouldn’t be running serverless functions as a hosted service in the cloud. But you don’t need your functions to be hosted on someone else’s infrastructure to do serverless. You could instead deploy application code on a local server running in your own data center, then execute that code on-demand in response to external events or triggers. This is precisely what on-premises serverless entails.
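The following Python sketch illustrates the idea (and only the idea): a long-running process on your own server that maps incoming events, here plain HTTP requests, to short-lived functions. The function names and routes are hypothetical, and real on-premises platforms add packaging, isolation and scaling on top of this basic pattern.

    # A deliberately simplified illustration of on-premises serverless:
    # a process on your own server that dispatches incoming events
    # (here, HTTP requests) to functions on demand.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def resize_image(payload):
        # Hypothetical function; stands in for real application code.
        return f"resized {payload}"

    def send_report(payload):
        # Another hypothetical function sharing the same host.
        return f"report sent for {payload}"

    FUNCTIONS = {"/resize": resize_image, "/report": send_report}

    class TriggerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            func = FUNCTIONS.get(self.path)
            body = func("sample-data") if func else "no such function"
            self.send_response(200 if func else 404)
            self.end_headers()
            self.wfile.write(body.encode())

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), TriggerHandler).serve_forever()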

The Benefits of On-Premises Serverless

At this point, you may be thinking: “If I have to host serverless functions on servers that I maintain myself, what’s the point of on-premises serverless? How does it help me if I still have to set up servers to host my serverless functions?”

While it’s true that being able to execute application code on demand without having to manage a server is one benefit of cloud-based serverless computing, that is not the only advantage. There are other benefits of serverless that you can reap even in an on-premises serverless environment, including:

  • Improved infrastructure efficiency: If you have many functions to run, but don’t run all of them at the same time, an on-premises serverless environment can help you get more value out of your infrastructure by hosting many functions on a single server. In other words, you can fit more applications onto fewer host servers and avoid under-utilizing your infrastructure. That is more efficient than dedicating an entire physical or virtual server to each application and keeping that server running constantly even when the application it hosts is only active some of the time.

  • Simplicity gains: On-premises serverless frameworks can help to simplify what would otherwise be complex infrastructure and software stacks. That’s because, once you set up a serverless environment and deploy functions into it, you can trigger any of those functions easily in response to external events using the generic serverless interface. You don’t have to worry about implementing independent triggers for each type of function, or creating a custom interface between the function you want to execute and the event that should trigger it. Your serverless framework provides this interface for you, thereby abstracting away much of the complexity separating your functions from the events that trigger them.

  • Special hardware features: A less common use case for on-premises serverless, but one that could prove particularly beneficial in certain circumstances, is to take advantage of specific hardware features--such as offloading computation to GPUs. This type of operation would be impossible to perform in most cloud-based serverless environments because functions in those environments lack direct access to physical hardware. But when you run serverless functions on-premises, you have full control over which hardware features your serverless code is allowed to access.
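To make the hardware point concrete, here is a small Python sketch of a function that offloads a computation to a local GPU. It assumes the optional CuPy library and a CUDA-capable GPU are available on the host, which is exactly the kind of assumption you can make on your own hardware but generally cannot make in a cloud FaaS environment.

    # Sketch of a function that offloads work to a local GPU. Assumes the
    # optional CuPy library and a CUDA-capable GPU are present on the host.
    import cupy as cp

    def gpu_sum_of_squares(values):
        data = cp.asarray(values)      # copy the input to GPU memory
        result = cp.sum(data * data)   # the computation runs on the GPU
        return float(result)           # bring the scalar back to the CPU

    if __name__ == "__main__":
        print(gpu_sum_of_squares([1.0, 2.0, 3.0]))  # prints 14.0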

On-Premises Serverless Platforms

Two categories of production-ready platforms for on-premises serverless computing have emerged to date. The first consists of platforms that use Kubernetes as a basis for deploying serverless functions in an on-premises environment. (In many cases, these platforms work in the cloud, too.) Examples include OpenFaaS and Kubeless.
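With OpenFaaS, for example, a function is usually just a short handler that the platform packages into a container image and runs on Kubernetes whenever it is invoked through the gateway. The handler below follows the shape of OpenFaaS’s classic Python template (template details can vary between releases and languages).

    # Shape of a handler scaffolded by OpenFaaS's classic Python template.
    # OpenFaaS builds this into a container image and runs it on Kubernetes
    # each time the function is invoked through the platform's gateway.
    def handle(req):
        """Handle a request to the function.

        Args:
            req (str): the request body
        """
        return f"Hello from on-premises OpenFaaS! You sent: {req}"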

The second category is stand-alone serverless platforms that don’t depend on another framework. Apache OpenWhisk is the best-known tool in this category. Introduced by IBM, OpenWhisk is the basis for IBM’s cloud-based FaaS offering, but OpenWhisk is an open source tool that can be deployed using on-premises infrastructure, as well.
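An OpenWhisk action in Python follows a similarly small contract: the platform calls main() with a dictionary of parameters and expects a dictionary back, regardless of whether OpenWhisk is running in IBM’s cloud or on your own servers.

    # A minimal Apache OpenWhisk action. OpenWhisk invokes main() with a
    # dictionary of parameters and expects a dictionary in return.
    def main(args):
        name = args.get("name", "world")
        return {"greeting": f"Hello, {name}!"}

On a local deployment, an action like this is typically registered and run through the wsk command-line tool, although the exact workflow depends on how OpenWhisk is installed.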

Serverless is a promising architecture for saving time and money when deploying applications, but companies shouldn't limit themselves to thinking about serverless just in the cloud. On-premises platforms have a great deal of potential, depending on the company, industry and workload, among other considerations.
