There was a time when the IT world was neatly divided into distinct realms, each dominated by a different operating system: Linux, Windows, OS X, Solaris, the BSDs and so on. That time is over. Today, virtualization, containers, the cloud and other innovations have blurred the technical boundaries between different operating systems and the software that runs on them.
That raises the question: Do IT pros still need to specialize in one operating system or another? Or are our skill sets OS-agnostic?
Let's consider those questions in the context of today's IT "post-OS" world.
How Operating Systems Used to Matter
If you spent any time managing servers or workstations prior to about a decade ago, you probably specialized in one type of operating system. That made sense; the skills required to administer, say, a Windows Server 2003 system were quite different from those associated with managing Red Hat Enterprise Linux (RHEL).
Not only did each system have a totally different file system, access control framework, software management tools and networking configuration, but most of the programs that you'd install on one system couldn't run on the other. On Windows, you'd be dealing with IIS Web servers, whereas on RHEL you'd most likely be working with Apache if you had to host websites.
The programming languages associated with the respective systems varied, too. If you wrote applications for Windows, you probably worked with .NET. Linux developers were more likely to specialize in languages like PHP, Python or C. Even though these languages were cross-platform, each of them tended to be much more popular within one operating system community than another.
The same sorts of differences held true no matter which operating systems you were comparing. The Windows vs. Linux debate was the sharpest divide, but folks who specialized in one of those systems rarely had deep expertise in other operating systems, like OS X or Solaris--which, despite being Unix-like systems, had relatively little in common with Linux when you got down to package managers, system configuration tools, and so on.
How Operating Systems Became Less Important
Fast forward to the present, and a series of technological changes that began about a decade ago have made the distinction between different operating system platforms much less important for IT professionals.
The most obvious change is the widespread adoption of virtualization. Virtualization tools have erased the distinctions between operating systems by making it possible not only to host one type of OS on another, but also to distribute pre-built operating system images easily. In an age when you can download and spin up a pre-configured virtual disk image using VMware, KVM or whatever hypervisor you prefer, knowing the ins and outs of the operating system you have to run is less important because you don't have to do much configuration yourself.
The cloud has made operating systems less important in obvious ways, too. In the cloud, with just a few clicks you can spin up a virtual server running pretty much any flavor of mainstream operating system you choose. You can also often install whichever applications you need using pre-configured scripts. Plus, newer cloud technologies, such as serverless computing, make the operating system disappear entirely, at least from the user's perspective.
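The serverless point is easy to see in code. Below is a minimal sketch of an AWS Lambda-style function handler in Python; the handler name and the event shape are illustrative assumptions, not a specific deployment. Notice that the code interacts only with an event payload and a context object -- the operating system underneath never appears.

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: the function sees only an event
    payload and a context object. The underlying operating system is
    managed entirely by the cloud provider and is invisible here."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

From the developer's perspective, there is nothing OS-specific to configure, patch or secure -- which is exactly why serverless platforms make the operating system "disappear" for the user.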
Indeed, today, having expertise in one type of cloud platform--such as AWS, Azure or Rackspace--is arguably more important than being an expert in a specific type of operating system.
The operating system story is somewhat more complicated with Docker containers. Originally, Docker ran only on Linux. That was one key distinction between Docker containers and traditional virtualization. However, this distinction is now less important because Docker's Moby project and LinuxKit tool have made it possible to run a containerized application on basically any type of operating system. In addition, Docker now works natively on certain versions of Windows.
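Docker's portability is visible in how little an image definition says about the host. The sketch below (the base image tag and script name are illustrative assumptions) builds and runs the same way whether the machine running Docker is a Linux server, a Mac laptop or a Windows workstation:

```dockerfile
# Illustrative image definition: nothing here depends on the host OS.
FROM python:3.11-slim

WORKDIR /app
COPY app.py .

# The container always sees the same Linux userland, regardless of
# whether the host running Docker is Linux, macOS or Windows.
CMD ["python", "app.py"]
```

The host's job is reduced to running the container runtime; everything the application sees is defined inside the image.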
Where OS Skills Still Matter: Mobile, Workstations and Security
By and large, then, specialization in one type of operating system is much less important today than it was a decade ago. Understanding how to manage popular cloud platforms and virtualization tools is more useful for IT pros than knowing how to compile a Linux kernel from source or resurrect an ailing NTFS file system.
That said, there are still reasons why you might want to invest in gaining expertise in certain operating systems. One is that the distinction among different platforms remains important in the mobile world. If you are a mobile app developer, or you administer mobile software, you'll find that the divide between iOS and Android remains stark. These operating systems are designed quite differently, their management tools are quite dissimilar, and the programming languages used on each system are different in most cases.
Operating system differences still matter on desktops, too. Mac acolytes might be loath to admit it, but the fact is that Windows continues to dominate workstations in most businesses. If your job involves administering workstations, you'll probably need to learn a lot more about Windows in particular than you would if you handle servers or cloud-based applications.
Finally, acquiring expertise in one type of system is helpful if IT security is a major part of your job. Many security breaches remain OS-specific. Understanding the nuances of various operating systems--how they handle security updates, which tools are available to help mitigate the risk of buffer overflow attacks, how you can lock down access control for users and file systems, and so on--is still essential if your job is to help defend against cyberattacks.
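The access-control point can be illustrated with a short sketch. The helper below is hypothetical (the function name and command strings are assumptions for illustration), but it shows why lockdown work is still OS-specific: POSIX systems express file restrictions as permission bits set with chmod, while Windows uses ACLs managed with tools like icacls.

```python
import platform
import stat

def lockdown_command(path, system=None):
    """Return an illustrative command for restricting a file to its
    owner. Hypothetical helper: chmod-style permission bits on POSIX
    systems, an ACL reset via icacls on Windows."""
    system = system or platform.system()
    if system == "Windows":
        # Strip inherited ACL entries, then grant the owner full control.
        return f'icacls "{path}" /inheritance:r /grant:r "%USERNAME%:F"'
    # Owner read/write only (mode 0o600) on Linux, macOS and other Unix-likes.
    mode = stat.S_IRUSR | stat.S_IWUSR
    return f"chmod {mode:o} {path}"
```

Even for a task this small, the two code paths share nothing -- a reminder that hardening a Windows fleet and hardening a Linux fleet remain distinct skills.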
For most of us, becoming gurus in one type of operating system or another is no longer as important as it once was. But it still matters a lot in certain lines of IT work. If you want to pursue specific types of positions, taking the time to teach yourself the ins and outs of particular operating systems might be essential.