You may have heard about the latest flare-up in the tech world, when analysts at Gartner last week claimed that Windows is collapsing under its own weight and needs to be rebuilt from scratch. I found this argument and the ensuing online debate amusing for several reasons. First, I suggested a similar Windows overhaul years ago--long-time UPDATE readers may recall a little gem titled "Maybe It's Time for a New Platform" (see URL below) from April 2002--though that assertion was made in the wake of the enormous security vulnerabilities that had dogged Windows over the previous year. More important, I think Microsoft has made significant architectural changes to Windows in the years since, obviating my 2002 argument and, quite frankly, making the latest Gartner analysis even less compelling.
I've written a lengthy blog post about this issue (see the URL at the bottom of this article), so I won't rehash the details here. Instead, I'd like to focus this conversation where it belongs, with you, the people who deploy, manage, and support Windows every day. And I'd like to deconstruct Gartner's argument in light of your needs. Tell me if I'm getting this right, please.
Gartner says that Windows' failings shine through in three key areas: legacy support (i.e., application compatibility), increased complexity, and hardware compatibility. Paradoxically, the Gartner analysts then suggested that because of evolving demands, Microsoft would be better off creating different versions of the Windows kernel to address different needs--a move that would actually increase complexity while harming compatibility. OK, fine. How does this match what's happening in the real world?
With regard to compatibility, I'd argue that Microsoft's current strategy--the drawn-out obsolescence of aging technologies--has been a key driver of Microsoft's success. I don't believe many consumers are asking that today's Windows run a 20-year-old version of dBASE. But consumers aren't Microsoft's core market; enterprises are. And what enterprises want, among other things, are clearly defined schedules and backward compatibility. This is a market that doesn't easily part with something that just works. Microsoft has responded to that.
Microsoft is in a uniquely unwinnable position here. No one is asking for new versions of Windows that do less than previous versions. And yet when the company takes big steps forward, as it did with Windows Vista, analysts complain that Windows is "bloated," as if that term actually meant anything in a world where gigabytes of RAM and ginormous hard drives are almost given away in cereal boxes. In the server world, virtualization is taking off because servers are underutilized. But when Microsoft makes a desktop OS that does more, people complain because it uses more resources. They call it bloated.
Now I get that businesses are going to stick with Windows XP for years to come simply because it runs on the now-low-end hardware that they've been using forever, and that makes plenty of sense. But does anyone with Vista experience actually believe that this system is more complex to deploy, manage, and use than XP? Different, sure. More capable, absolutely. But more complex? Really?
This one is perhaps the biggest Vista myth: Everyone's heard about the massive compatibility problems that have dogged Microsoft's latest OS. And sure enough, there have been some compatibility issues, just as there were with XP and every major Windows version before it. But here's a reality check, using actual Microsoft support statistics: By the time Vista had been on the market for 100 days, more than 96 percent of all hardware still on the market was compatible with the new OS. And that percentage has only improved since then. That's far better than the situation with XP, and, it should be noted, Vista suffered from just one-third as many security problems as XP did in its first 100 days. I certainly don't recall any Universal Plug and Play (UPnP)-type disasters dogging Vista over the past year.
When Gartner argued that Microsoft should create different versions of the Windows kernel to address different applications, it apparently forgot or didn't realize that Windows CE, Windows Mobile, desktop and server Windows, Windows Embedded, and other Windows versions all address different markets and all have different kernels. (And I'm guessing it offered no comparable criticism of Apple's decision to use its OS X kernel in, gasp, a smartphone.) The analysts also apparently overlooked the poor reception Microsoft got when it announced almost 20 different versions of Vista, all designed for different parts of the market. As far as I'm concerned, this is a marketing issue, not a technology discussion. Customers won't care what's going on underneath as long as it works. Vista's different product editions don't work because the differences between them are too arbitrary and there are too many choices. Microsoft uses different Windows kernels for different products because sometimes it makes sense to do so. Sometimes it doesn't.
OK, I don't want this to turn into an ideological debate about Vista. But really, is anyone who implements Microsoft technologies honestly asking for the company to start over from scratch? I think the question is rendered moot by a number of concerns, and I think Gartner is full of hot air. Please tell me if I've got it all wrong.
The Great Windows Collapse of 2011 (Paul's SuperSite Blog) http://community.winsupersite.com/blogs/paul/archive/2008/04/12/the-great-windows-collapse-of-2011.aspx
Maybe It's Time for a New Platform (UPDATE, 2002)