Once upon a time, companies were largely on their own when it came to figuring out the hardware and software logistics of getting an edge deployment up and running. At a press-only trends panel on edge computing at the recent Red Hat Summit, the panelists made it clear that those days are rapidly passing: provisioning edge deployments is now easy.
"Until very recently, we've seen early adopters in edge computing building their own bespoke solution," said Frank Zdarsky, a senior principal software engineer with Red Hat. "As part of this, they had to figure out some of the really fundamental problems of not only how to deploy compute and storage at the edge, but also how do you secure it? How do you do software updates? How do you manage at scale? These are all hard problems that were binding significant resources for them."
Those problems have been largely addressed with edge solutions and services. Edge deployments have become commonplace enough that practically anything a company needs to accomplish has already been done by others, with best practices and software already developed. Even better, there is now an ecosystem of infrastructure companies that can offer expertise and guidance in all stages of staking out an edge location.
"Today, we're at a place where whatever your edge computing use case and your environmental constraints, there exists an off-the-shelf solution from companies like Red Hat that are open source and that are exactly right for your specific use case," said Zdarsky. "That elevates the discussion, because it means that it enables customers to actually focus on their business problems and on their business needs rather than potential constraints of a technology."
This takes the conversation with customers from "how can I do it" to how they can build on the edge infrastructure they've just deployed to improve customer experience through things like harvesting more data and acting on it locally, he said.
Nick Barcet, Red Hat's senior director of technology strategy, made a similar observation about edge infrastructures during the panel discussion at Red Hat Summit:
"What's really interesting is that sometimes when we deploy an infrastructure to serve an edge purpose, suddenly a secondary topic comes into play and we are solving two problems with one edge infrastructure," he said. "This is something that is key, because if you can combine these infrastructures for multiple environments, you're saving a lot on cost, and this is where the open hybrid cloud becomes absolutely essential."
Red Hat at the Edge
Red Hat sees edge computing as a natural extension of the open hybrid cloud approach it has advocated since the cloud first began rising to prominence, in part because the move to edge deployments is just another step for the technologies and practices it has already developed to extend the on-premises data center to the cloud.
While this takes Red Hat engineers out of the familiar and predictable environment of data centers, it has also opened new horizons, as the edge is an environment that produces never-before-seen use cases.
"The driving force for edge computing right now is about how people are going to modernize their environment," Barcet said. "It can be in manufacturing, with people wanting to produce better, higher quality product with less downtime, or, in transport, it might be providing new services or better tracking or better understanding or less fuel consumption. There is not one reason to implement edge; there is a multitude of reasons."
The edge has also changed the economics of enterprise computing, for example by forcing companies such as Red Hat to rethink their pricing for edge deployments.
"We view edge as a part of an open hybrid cloud deployment so [pricing] is not separate, it's kind of one deployment within a hybrid cloud," said Stefanie Chiras, senior VP and general manager of Red Hat Enterprise Linux. "That being said, the model is different with respect to the scale to which you deploy, the number that you deploy, etc., so we have been looking at different pricing models.
"Once you get to device edge, we have alternative pricing models to try and match the business model that customers are looking for, that is different from what they deploy in their data center," she added. "It's kind of specific to the use case way out at the device edge. If folks are looking to deploy at scale with those broad kind of use cases out there at device edge, definitely speak to your Red Hat sales rep."
The Rapid Rise of the Edge
While edge computing began gaining enterprise traction with the advent of cloud and cloud native technologies, the growth of edge deployments during the pandemic has been exceptional.
Changes brought about by the pandemic definitely played a role in that growth, Chiras said, but she added that the arrival of COVID-19 didn't so much change Red Hat's customers' strategies as accelerate them.
"If they were on a strategy to adopt more public cloud usage, it accelerated that strategy," she said. "If they were on a strategy to make sure that they increased resiliency and security within their deployment, they focused on that. I would say it wasn't a time of changing strategy, it was definitely a time of accelerating execution of strategy."
Chiras indicated that other factors were converging as well, and that the edge would have seen rapid growth over the last year with or without the pandemic.
"When I look at it, it comes to a few things," she said, and rattled off a list of recent technology advances: improvements in server technology (including increased performance from power-sipping CPU architectures like Arm), the ability to do advanced data management securely, containers for rapid deployment of applications, and advances in networking.
Barcet agreed, and added remote working as another factor accelerating edge deployments and growth.
"We all learned how to work remotely," he said. "I mean, at Red Hat we didn't learn anything because already 80% of our workforce was working remotely, but all of our customers who were used to going to the office every day are now used to doing conferencing. By learning to do so, we have displaced the requirements for computing. Computing can now either be in the cloud, or in the data center, or in the home of our engineers – and that changes everything."