
Should You Use Docker Containers on Windows? Maybe, Maybe Not

Here are some recommendations for determining when it makes sense to run Docker containers on Windows--and when it doesn't.

Docker containers, which were once a Linux-only technology, now work on Windows, too. Should you use Windows containers? What are the benefits of running Docker containers on Windows, and when should you not use containers on Windows? Keep reading for some tips.

Containers on Windows

Docker containers are a way of running applications inside isolated, portable environments. Containers don't totally isolate applications from the host system the way a virtual machine does, but they isolate the application enough that the host operating system's configuration doesn't affect how the application operates (for the most part). Plus, you can optionally run Windows containers inside a dedicated Hyper-V instance to get full isolation.
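
To make the distinction concrete, here is a minimal sketch of launching the same Windows base image under the default process isolation and then under Hyper-V isolation. The image tag is an assumption; pick one that matches your host's Windows version.

    # Default process isolation: the container shares the host kernel
    docker run --rm mcr.microsoft.com/windows/servercore:ltsc2022 cmd /c ver

    # Hyper-V isolation: the container runs inside its own lightweight VM
    docker run --rm --isolation=hyperv mcr.microsoft.com/windows/servercore:ltsc2022 cmd /c ver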

Docker became available for Linux in 2013. Starting in 2016, however, Microsoft and Docker partnered to bring the same framework to Windows 10 and Windows Server 2016 and later. (Other versions of Windows are not supported.)

Reasons to use Docker on Windows

Of course, all of this means that you don't have to be a Linux shop to take advantage of Docker. That's good news for Windows admins, because containers on Windows offer several significant benefits.

Maintaining parity between testing and production

For software developers and testers, ensuring that the conditions under which you write and test an application are the same as those under which it is deployed into production is an age-old challenge.

Containers help to address this problem because they package configuration variables inside the container environment. As a result, conditions on the host server don't impact the way a containerized application behaves, at least in general. Things like hardware resources on the host server still matter, but software variables, such as which service pack you have installed, don't affect the application.
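
As a rough sketch of how that packaging works, a Windows Dockerfile like the one below pins the base operating system image and bakes the application and its configuration into the image, so development, test, and production all start from the same bits. The paths and executable name here are hypothetical.

    # Pin a specific Windows Server Core release as the base image
    FROM mcr.microsoft.com/windows/servercore:ltsc2022

    # Copy the application and its configuration files into the image
    COPY app/ C:/app/

    # The same entry point runs in every environment the image is deployed to
    CMD ["C:\\app\\myapp.exe"]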

Running different versions of the same application

What if you want to run multiple versions of the same application on the same host? Traditionally, doing so would be messy, to say the least, because you'd have to find a way to allow multiple versions of the application to exist in the Windows file system, while also allowing their processes to run at the same time--neither of which Windows expects you to do.

But with containers, it becomes easy to have different versions of the same application running side-by-side, without making a mess. You simply package each application into a container and run it. The application data stays neatly within the container, instead of on the host file system. The application processes are also isolated via the container.
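
For example, assuming a hypothetical image named myapp that is published with two version tags, you could run both releases at once and simply publish them on different host ports:

    # Version 1.0, published on host port 8080
    docker run -d --name myapp-v1 -p 8080:80 myapp:1.0

    # Version 2.0, running alongside it on host port 8081
    docker run -d --name myapp-v2 -p 8081:80 myapp:2.0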

Using the same toolset on Windows and Linux

Maybe your organization has both Windows and Linux servers. In that case, Docker can come in handy as a way to standardize your toolset for deploying applications. With a few exceptions, the same Docker commands work on both Windows and Linux, meaning that you can use the same scripts and processes to deploy containers into both environments.

The caveat here is that Windows containers can't run directly on Linux, and vice versa. In other words, you can't take a container created for Linux, drop it onto a Windows server and expect it to work. But you can at least use the same tooling to deploy containers into each type of environment, thereby eliminating the need to maintain different deployment tools for your Linux and Windows servers.
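
As a simple illustration, a deployment sequence like the one below (the registry and image name are placeholders) can be run verbatim on either a Windows or a Linux Docker host, provided the image it references was built for that platform:

    # The same pull-and-run commands work on Windows and Linux hosts
    docker pull registry.example.com/myapp:latest
    docker run -d --restart unless-stopped registry.example.com/myapp:latest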

Security

As noted above, you can optionally run containers on Windows in what Microsoft calls Hyper-V isolation mode. With this approach, each container runs inside a Hyper-V virtual machine instance.

The main advantage of Hyper-V mode is that your containers get total isolation from the host server. As a result, security problems within one container won't affect other containers. In this respect, Hyper-V containers offer a solution for increasing the security of your Windows application deployments.
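
Besides passing --isolation=hyperv to individual docker run commands, you can make Hyper-V isolation the default for the host. One way to do this, shown here as a sketch you should check against your Docker version's documentation, is through the daemon configuration file (typically C:\ProgramData\docker\config\daemon.json on Windows):

    {
      "exec-opts": ["isolation=hyperv"]
    }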

Reasons Not to Use Containers on Windows

While Docker on Windows offers several advantages, it's not the best solution for every use case. Following are some considerations that can make containers a poor fit for Windows.

Accessing GUIs is somewhat tricky

Docker was designed first and foremost as a way to deploy applications that don't have a graphical interface, and are instead controlled solely through the command line. (Given Docker's origins as a Linux-only technology, this makes sense; most Linux server apps don't have graphical frontends.) Docker makes it easy to interact with your containerized applications using the CLI, but it doesn't provide a native way to access a containerized application's graphical interface, if the application has one.

There are workarounds. On Windows, you could use RDP, VNC or a similar protocol to export the graphical interface of a containerized application over the network, and access it that way. But this requires some extra setup, and it can increase your attack surface from a security perspective, so it's not ideal.
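
As a rough sketch of the RDP approach (the image name is hypothetical, and the container would need to run its own remote-desktop listener), the workaround amounts to publishing the remote-display port and pointing an RDP client at it:

    # Map the container's RDP port to an unused port on the host
    docker run -d --name gui-app -p 13389:3389 my-gui-app:latest

    # Connect from the host with the built-in RDP client
    mstsc /v:localhost:13389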

This is not to say you should never run an application with a graphical interface inside a container. But before you do so, make sure that the benefits of containerizing that application are worth the extra work it will take to make the interface accessible.

Not all Windows versions supported

As noted above, containers work only on Windows 10 and Windows Server 2016 and later. Microsoft has given no indication that it plans to extend container support to older versions of its operating systems.

Thus, if you have non-supported versions of Windows in your shop, containers may not be a good fit for your needs, because you won't be able to achieve the universal standardization and "deploy anywhere" functionality that containers are designed to offer.

Resource overhead and performance

You pay a price--in the form of resource overhead--to isolate applications inside containers. In other words, you have to devote a certain amount of system resources to running Docker. That leaves fewer resources available to your applications, and can degrade performance.

To be sure, the performance loss of containers is minimal in most cases, compared to what you'd face if you ran an application inside a virtual machine. (I'm unaware of any solid performance benchmarks for Windows containers, but IBM has a great study for Linux containers, the findings of which can generally be extrapolated to Windows.) But the overhead is real, and that can be a reason to avoid Docker on Windows if you want to squeeze as much performance out of your applications as possible.

Note, too, that application performance takes a greater hit when you run containers on Windows in Hyper-V mode, because that entails running both Docker and a virtual machine. The virtual machine instance is designed to have a minimal footprint, but it still consumes more resources than Docker alone.
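
If you want to gauge that overhead on your own workloads, Docker's built-in stats command gives a quick snapshot of what each running container is consuming:

    # One-time snapshot of CPU, memory and I/O usage per container
    docker stats --no-stream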

Conclusion

For Windows, as for Linux, Docker containers offer new opportunities for deploying applications more easily and more nimbly. But there are some definite trade-offs to using Docker, such as the relative difficulty of accessing an application's graphical interface and possible performance drawbacks. Windows containers are certainly worth a look, but they're not the perfect fit for every Windows application deployment.
