What's the next big innovation in IT? For decades, there were clear answers to that question. There was always a revolutionary technology — like the internet, virtualization, and cloud computing — in the process of maturing.
But today, I'm not sure that there is another big IT innovation around the corner. The technologies that are heralded as game-changers tend to be variations on established themes rather than true innovations. Or, they turn out to be duds that never come close to achieving their promised goals.
That raises the question: Is the pace of IT innovation slowing? I think so. Here's why.
A Brief History of Innovation in IT
If you look back over the past several decades, it's easy to see a steady stream of major innovations that reshaped the IT ecosystem.
For example, the early 1990s were marked by widespread adoption of the internet. Although the internet itself is considerably older, connecting most of the world's computers turned out to be a watershed moment for the IT industry.
A bit later, around the turn of the millennium, virtualization became the next game-changer in IT, allowing physical infrastructure to be sliced and diced in all sorts of innovative ways.
Then, starting in the mid-2000s, came the advent of public cloud computing, with profound ramifications for the way most workloads were deployed. And about a decade after that, the cloud-native revolution arrived, ushering in yet another wave of IT innovation.
Each of these innovations markedly transformed the way IT operations teams worked, the types of tools they used, and the types of challenges they could solve.
The Dwindling of IT Innovation
The more I look around the IT industry today, the harder I find it to see emerging innovations that are as promising as those of decades past.
Sure, there are plenty of interesting ideas that are often hailed as the next big thing in IT — such as artificial intelligence (AI), blockchain, and the metaverse, to name a few. But most of these trends are probably not going to end up being truly revolutionary, certainly not for IT practitioners.
AI — which isn't exactly new, even though folks sometimes talk as if AI is something that only emerged in the 2010s — can surely help automate certain workflows and assist engineers in sifting through complex data sets. But I highly doubt we'll ever reach a "NoOps" world, in which IT operations are fully automated via AI tools.
As for blockchain, people who still believe in its revolutionary promise are increasingly a marginalized bunch. Blockchain is a cool idea, but if it were going to change the way IT (or anything else) works, it would have done so by this point.
The same will end up being true, I think, of the metaverse, which is perhaps the latest, greatest subject of the IT innovation conversation. I'm not saying the metaverse isn't real, or that it doesn't matter. But I am saying that it's almost certainly not going to transform the way IT works in the way that technologies such as cloud computing or the internet did.
Has IT Fully Matured?
Admittedly, industry-changing technologies aren't often obvious until they have already effected major change. It's certainly possible that the next huge source of IT innovation is right out there under our noses, and we just don't see it yet.
But I doubt it. Instead, I'm inclined to believe that the pace of innovation in IT is indeed slowing down, and the reason is that the IT industry has finally matured.
If you look at the history of other industries, you'll see a similar trend. Arguably, there has been no revolutionary innovation in, say, the automotive industry in many decades. There have certainly been small innovations, like the digitization of many automotive components. But the way cars are manufactured and maintained today is not all that different from the 1960s — or the 1930s, for that matter. A four-cylinder engine is still a four-cylinder engine. A tire is still a tire. An oil change is still an oil change.
For a long time, you couldn't draw similar conclusions about IT. For instance, a server today (which is probably a VM running on a cloud IaaS service) looks vastly different — and requires a vastly different provisioning and management process — from a server in the 1990s, let alone in the 1970s, when most servers (to the extent that they existed at all) weren't even connected to the then-nascent internet.
But I think the servers of three or four decades from now will look a lot like the servers of today. So will IT workflows such as application deployment, observability, infrastructure provisioning, and so on. There is just not as much opportunity for new innovation in IT because the solutions we have today are already really good.
In that sense, the slowing pace of innovation in IT is not at all a bad thing. It's a sign that we've come a long, long way from the days when servers were time-consuming and tedious to set up, and when applications were risky to deploy and difficult to troubleshoot. We've solved the core problems of IT at this point, and future innovations will not need to be as earth-shattering as those that arrived earlier in the history of IT.
About the author
Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, "For Fun and Profit: A History of the Free and Open Source Software Revolution," was published by MIT Press.