Server Core was introduced in Windows Server 2008 for a very specific reason. Each Windows version introduces new features that offer a richer OS than before, but these changes also make the OS bigger, with a larger footprint, a larger potential attack surface, more to patch, and more associated reboots. In addition, the Windows Server and Windows client OSs share a large amount of code, including components that a server might not need (e.g., media services, Internet Explorer [IE], and the graphical shell, Explorer.exe). This shared code adds even more bulk to the server OS, although it does give administrators a consistent experience between client and server. But when a server is a domain controller (DC), a file server, or a simple web server, all the extra features and even the nice GUI are unneeded; they just add to what must be patched. Enter Server Core as the preferred platform for infrastructure servers.
After logging on to Server Core in Windows Server 2008, the administrator would see a command-prompt window that was used to manage the server. This was a shift in management for many Windows administrators, who were used to a graphical experience. Because Server Core in Windows Server 2008 was the minimal OS needed to run the key infrastructure roles, it omitted many components typically found in Windows Server, including some that required frequent patching. Compared with a full installation, Server Core required, on average, half the number of patches and needed to be rebooted far less often. The Server Core installation also had a smaller disk footprint and somewhat lower CPU and memory requirements. But the real benefit of using Server Core was the reduction in patching and the resulting improvement in server availability.
You might think, then, that Server Core would have been widely adopted. After all, who wouldn't want to halve the number of patches needed and reduce the associated management? But the other two focus areas for Windows Server 2008—namely Server Manager and Windows PowerShell—give some clues as to why very few companies adopted Server Core.
With Windows Server 2008, Microsoft introduced Server Manager as the new way to manage servers: a role-focused tool that provided an intuitive administration experience. However, Server Manager couldn't manage a remote machine and thus couldn't be used with Server Core. Administrators were therefore unable to use the new management direction with Server Core and instead had to fall back on the legacy Microsoft Management Console (MMC) tools and the command line.

Also with Windows Server 2008, Microsoft pushed PowerShell as the direction for managing all things Microsoft. PowerShell was built on the .NET Framework, and because .NET was monolithic (one large component that couldn't be broken apart) and relied on components that weren't available in Server Core, it was impossible to provide the .NET Framework for Server Core. PowerShell was therefore unavailable on Server Core. In addition, many third-party management agents (such as those for backup, monitoring, and malware-protection products) didn't work correctly on Server Core because of dependencies on components missing from that installation.
See the main article: "Windows Server 2012 Installation Options"