Nearly a decade ago, the advent of cloud-native computing technologies transformed the way organizations built and deployed software. By enabling more agile and scalable applications, cloud-native technologies like Docker and Kubernetes helped businesses move faster and take innovation to a new level.
Today, we're witnessing the emergence of a similar revolution in IT technology. I like to call it event-native computing. Like cloud-native, event-native represents a radically transformative paradigm that promises to disrupt how businesses of all types leverage technology.
To make the case, let me explain what event-native computing means and why adopting an event-native approach to software and data management is poised to be so critical for businesses that want to keep innovating at a rapid pace.
What Is Event-Native Computing?
Event-native computing is the use of event-driven data streams as the foundation of IT operations. With event-native computing, organizations can make decisions in real time based on continuously updated and streamed data. They can detect problems with software applications instantaneously using monitoring tools that ingest streaming metrics, for example, or detect fraudulent sales transactions based on real-time analysis of streaming sales data.
Event-native computing is the opposite of conventional approaches in which businesses analyze data in batches. Instead of monitoring your software applications in real time, for example, a pre-event-native approach might involve polling the applications once every 5 or 10 minutes to check their status. Or it could entail detecting fraud by analyzing batches of sales transactions after the transactions are complete, rather than performing the analysis in real time.
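The contrast between the two approaches can be sketched in a few lines of Python. This is an illustrative simulation only: the transaction shape and fraud threshold are hypothetical, and the simple list standing in for the stream would, in a real deployment, be a consumer loop on an event broker such as Kafka.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    tx_id: str
    amount: float

# Hypothetical threshold, chosen purely for illustration.
FRAUD_THRESHOLD = 10_000.0

def process_event(tx: Transaction, alerts: list[str]) -> None:
    """Event-native style: react to each transaction the moment it arrives."""
    if tx.amount > FRAUD_THRESHOLD:
        alerts.append(tx.tx_id)

def process_batch(batch: list[Transaction]) -> list[str]:
    """Batch style: the same check, but it can only run after the whole
    batch has been collected, so detection lags behind the event."""
    return [tx.tx_id for tx in batch if tx.amount > FRAUD_THRESHOLD]

alerts: list[str] = []
stream = [
    Transaction("t1", 250.0),
    Transaction("t2", 12_500.0),
    Transaction("t3", 80.0),
]
for tx in stream:  # stands in for a real-time consumer loop
    process_event(tx, alerts)

print(alerts)                  # the suspicious transaction is flagged on arrival
print(process_batch(stream))   # same answer, but only once the batch closes
```

Both paths flag the same transaction; the difference is entirely about *when* the answer is available, which is the point of going event-native.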
As I implied above, there's an analogy between event-native computing and cloud-native computing. Cloud-native refers to the adoption of technologies that enable the deployment of highly scalable applications in distributed, cloud-based environments. It's the opposite of relying on tightly coupled, monolithic application architectures and deployment technologies that are difficult to scale or update.
In a similar fashion, event-native computing replaces synchronous, inflexible decision-making with a highly agile approach to data analysis and response. With event-native computing, organizations can detect and react to important events instantaneously, since they are collecting and analyzing real-time data streams.
Streaming Data as a Key to Business Success
To be sure, many of the technologies that enable event-native computing are not totally new. Event streaming and messaging platforms like Apache Kafka and RabbitMQ have been around for years, and it's not exactly uncommon for organizations to analyze at least some of their data in real time in order to make event-driven decisions.
Going forward, however, leveraging data streams to adopt an event-native approach is poised to become the de facto approach to decision-making, rather than the exception, for successful businesses. A recent Confluent survey found that data streaming increases ROI by a factor of between 2 and 5 for a majority of businesses. And as Forbes points out, "modern business runs on data streaming."
This means that if event streams aren't already at the center of your organization's workflows, now is the time to get them there. The ability to react to events in real time will likely become a key differentiator for high-performing businesses, just as the adoption of cloud-native technology helped organizations operate more efficiently and with greater agility.
How do you actually become event-native? The answer starts with adopting technologies capable of exposing data as streams of events. Again, those technologies have existed for a while, and there are a variety of both open source and commercial solutions to choose from.
But simply exposing data as event streams is only the first step. You also need a way for applications to consume those event streams securely and in real time, and that's where matters can get tricky. Most event-streaming platforms speak their own wire protocols: Kafka uses its own binary protocol, for example, while RabbitMQ speaks AMQP. This means that generic applications designed to communicate over HTTP using REST APIs, or that leverage more common asynchronous client-side technologies such as WebSockets, can't interface directly with many event brokers.
One way to solve this challenge is to build custom logic and data pipelines for translating between protocols directly into your applications. But that would be a tremendous amount of effort. And you'd have to repeat the process for every application that you want to connect to an event broker, as well as whenever anything changes on either the client or event broker sides.
A better, more efficient approach is to deploy event-native API management software as flexible middleware that can act as a proxy layer between event streams and applications that need to consume and analyze that data. With an API gateway that can expose backend data over HTTP and other popular client-side protocols and API types, you can connect any application to any event stream in a simple and secure way.
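To make the proxy idea concrete, here is the kind of framing such a gateway performs when it exposes a broker topic to ordinary web clients over plain HTTP: wrapping each event in the standard Server-Sent Events (text/event-stream) wire format. The function and field values are illustrative, not taken from any specific gateway product.

```python
from typing import Optional

def to_sse(event_type: str, data: str, event_id: Optional[str] = None) -> str:
    """Frame one event in Server-Sent Events format, the translation an
    event-native gateway applies between a broker topic and an HTTP client."""
    lines = []
    if event_id is not None:
        lines.append(f"id: {event_id}")
    lines.append(f"event: {event_type}")
    # Each line of the payload gets its own "data:" field, per the SSE spec.
    lines.extend(f"data: {chunk}" for chunk in data.splitlines() or [""])
    return "\n".join(lines) + "\n\n"  # a blank line terminates the event

frame = to_sse("order-update", '{"status": "shipped"}', event_id="42")
print(frame)
```

Any browser's built-in EventSource API can consume frames like this, which is exactly the payoff: the client needs no broker-specific library at all.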
Conclusion: Embracing the Data Streaming Future
We have the technology to enable event-native approaches to computing and business. For the typical organization today, moving to an event-native strategy is simply a matter of putting the right tools into place and learning how to leverage them to maximum effect. Just as organizations seeking to maximize efficiency and agility adopted cloud-native technologies starting about 10 years ago, organizations that want to make decisions and process information in real time must now shift to an event-native technology stack.
About the author: Rory Blundell is the CEO of Gravitee.