Fastly Expands Cloud Serverless Edge Efforts

Fastly is now previewing its Compute@Edge service, powered by the company's Lucet WebAssembly technology, enabling developers to use multiple languages for cloud serverless functions.

Sean Michael Kerner, Contributor

November 19, 2019

Fastly has been building out its content delivery network cloud platform since 2011. Over that time, there have been many rounds of innovation for both Fastly and the broader cloud market.

Two of the hottest trends in cloud computing today are edge computing and serverless, which Fastly is now combining with its Compute@Edge service. With Compute@Edge, Fastly is also promising a language-agnostic compute environment, where developers can choose to deploy cloud serverless functions written in nearly any programming language with the use of WebAssembly (Wasm).

Serverless at the Edge

Serverless is an approach in which functions run as a service, rather than in a long-running container or virtual machine. The promise of serverless is more efficient use of resources and lower costs for users. Amazon Web Services (AWS) helped pioneer the cloud serverless space with its Lambda service, which runs in its public cloud. Fastly's Compute@Edge is different in that it runs functions at the edge.
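
To make the model concrete, the following is a minimal sketch of what a function-as-a-service handler can look like, assuming hypothetical Request and Response types rather than any particular platform's API: the platform invokes a single handler per request, and there is no long-running server process in the developer's code.

```rust
// Hypothetical sketch of the function-as-a-service model: the platform owns
// the server loop and invokes a single handler per request. Request and
// Response are placeholder types, not any vendor's actual API.

struct Request {
    path: String,
}

struct Response {
    status: u16,
    body: String,
}

// The entire "application" is one stateless function.
fn handle(req: Request) -> Response {
    Response {
        status: 200,
        body: format!("Hello from the edge: {}", req.path),
    }
}

fn main() {
    // In a real serverless platform the runtime makes this call, not the
    // developer; main() exists here only so the sketch runs on its own.
    let resp = handle(Request { path: "/".to_string() });
    println!("{} {}", resp.status, resp.body);
}
```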

"We run our functions closest to the end user, which is how we define the edge," Tyler McMullen, CTO of Fastly, told ITPro Today.

Compute@Edge isn't Fastly's first foray into the serverless world. However, its previous serverless offerings focused on specific use cases and relied on Fastly's programmable edge compute technology powered by VCL (Varnish Configuration Language), according to McMullen. Varnish is an open-source technology for web caching and acceleration.

With the introduction of Compute@Edge, Fastly will begin offering more general computing capabilities for the serverless computing market, McMullen said.

"We see this as both a more powerful programming and runtime environment for our existing customers, who are doing amazing use cases at the edge, … [as well as] serverless developers that are focused more on functions as a service [FaaS], who are looking for something with far greater performance," he said.

Fastly claims its new service can achieve a serverless startup time of just 35.4 microseconds. McMullen said the Fastly development team spent a tremendous amount of time building a model that is both fast and safe.

"We built a new compiler and runtime that moves much of the work that older solutions do at runtime into the compiler," he said. "The tight integration between the compiler and the runtime means that starting a sandbox is nearly as simple as pointing at a particular memory slot and jumping into the code."

WebAssembly

Fastly built the new service in a way that enables developers to use the language of their choice, McMullen said. A core part of that capability comes from Lucet, Fastly's WebAssembly compiler. With WebAssembly, code written in many different programming languages can be compiled into bytecode that runs on top of a runtime engine.

"Lucet is the engine on which Compute@Edge runs," he said. "Because it’s our open-sourced WebAssembly runtime and compiler, it also has the ability to support multiple languages."

Fastly is a big supporter of WebAssembly and is one of the founding members of the Bytecode Alliance, whose primary goal is to advance the state of WebAssembly technology and usage. Other founding members of the alliance, which was announced on Nov. 12, include Mozilla, Intel and Red Hat.

Compute@Edge is currently available as a private beta preview, with plans for a public preview to follow in the coming months.

About the Author

Sean Michael Kerner

Contributor

Sean Michael Kerner is an IT consultant, technology enthusiast and tinkerer. He consults to industry and media organizations on technology issues.

https://www.linkedin.com/in/seanmkerner/
