So before I rip into this one--and honestly, how could I do otherwise, given how wrongheaded it is?--I would like to at least take a moment to note that I generally enjoy Randall Stross. This one, however, took me by surprise, and I had to resist the urge to toss aside the Kindle (on which I read it this morning) and jump online ("someone's wrong on the Internet!"). But seriously. This is just idiotic. I'm sorry, but it is.
Beginning as a thin veneer for older software code, [Windows] has become an obese monolith built on an ancient frame. Adding features, plugging security holes, fixing bugs, fixing the fixes that never worked properly, all while maintaining compatibility with older software and hardware — is there anything Windows doesn’t try to do?
The best solution to the multiple woes of Windows is starting over. Completely. Now.
Vista is the equivalent, at a minimum, of Windows version 12 — preceded by 1.0, 2.0, 3.0, 3.1, NT, 95, NT 4.0, 98, 2000, ME, XP.
Except, of course, that it isn't.
Windows Vista is the latest in a line of NT-based OSes that includes Windows NT (versions 3.1, 3.5, 3.51, and 4.0), Windows 2000 (5.0), and Windows XP (5.1). (There are server derivatives as well, but whatever.) The Windows 1.0, 2.0, 3.0, 3.1, 95, 98, and Me releases he mentions are completely different products with different code bases.
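To make the counting concrete, the actual NT lineage can be laid out as data (a minimal sketch; the product-name/version pairs come from the list above, plus the fact that Vista reports itself internally as NT 6.0):

```python
# Internal NT kernel versions behind each consumer-facing product name.
# This list stops at the releases named above; server variants are omitted.
nt_lineage = [
    ("Windows NT 3.1",  (3, 1)),
    ("Windows NT 3.5",  (3, 5)),
    ("Windows NT 3.51", (3, 51)),
    ("Windows NT 4.0",  (4, 0)),
    ("Windows 2000",    (5, 0)),
    ("Windows XP",      (5, 1)),
    ("Windows Vista",   (6, 0)),
]

# The internal version numbers increase monotonically -- and nowhere in the
# actual NT line is there anything resembling a "version 12."
versions = [v for _, v in nt_lineage]
assert versions == sorted(versions)
print(versions[-1])  # Vista's internal kernel version
```

Marketing names aside, Vista is NT 6.0, not "Windows 12"; the pre-NT products simply don't belong in the same sequence.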
But the assumption here, of course, is that OS X and Linux, both based on UNIX systems that actually pre-date the original version of NT, are somehow "newer" or "fresher" and, equally illogically, somehow "better." UNIX is older than NT. And NT is a descendant of DEC's VMS: Dave Cutler, VMS's architect, went on to design NT at Microsoft. Let's leave the architectural discussions to the experts and at least agree that all three--Vista/Server 2008 (i.e. "Windows"), UNIX/Linux, and UNIX/OS X--are modern, scalable, and capable OSes. Because they are.
After six years of development, the longest interval between versions in the previous 22-year history of Windows, and long enough to permit Apple to bring out three new versions of Mac OS X, Vista was introduced to consumers in January 2007.
And here we have the second bit of iCabal BS that Stross passes off as "fact." Actually, Microsoft shipped a wide range of OSes between XP (2001) and Vista (2006). In fact, it shipped more OS releases than Apple did during this same time period. These include Windows XP Tablet PC Edition (two versions), Windows XP Media Center Edition (four versions), Windows XP Service Pack 2 (SP2, a free gimme to users to make up for security issues), and two versions of Windows Server, among many others. If you're going to make Panther and Tiger seem like "new versions" of Mac OS X, then you need to include the Tablet PC and Media Center Editions on the Windows side too. Certainly, the Windows OSes were more impressive from a functional-improvement standpoint. Geesh.
Sticking with that same core architecture [between Vista and Windows 7] is the problem, not the solution. In April, Michael A. Silver and Neil MacDonald, analysts at Gartner, the research firm, presented a talk titled “Windows Is Collapsing.” Their argument isn’t that Windows will cease to function but that the accumulated complexity, as Microsoft tries to support 20 years of legacies, prevents timely delivery of advances. “The situation is untenable,” their joint presentation says. “Windows must change radically.”
As he notes, this talk was presented way back in April. And it was immediately debunked as utter claptrap. I wrote two responses to this talk, a blog post and an article called Is Windows Broken?, that pretty much sum up why those two clowns at Gartner don't know their microkernel from their microwave popcorn. (Neither does Stross, apparently. This vaguely saddens me.)
Some software engineers within Microsoft seem to be in full agreement, talking in public of work that began in 2003 to design a new operating system from scratch. They believe that problems like security vulnerabilities and system crashes can be fixed only by abandoning system design orthodoxy, formed in the 1960s and ’70s, that was built into Windows.
Um. What? He's referring to a Microsoft research project called Singularity that has absolutely nothing to do with Windows. What a weird stretch to make.
And BTW: That "orthodoxy"? It's older in UNIX. And thus in OS X and Linux as well.
If Microsoft thinks it is too late to actually use Singularity or something like it, the company should take heart from Apple’s willingness to brave the wrath of its users when, in 2001, it introduced Mac OS X. It was based on a modern microkernel design, which runs a very small set of essential services that make the system less vulnerable to crashes. But the change forced Mac users to buy new versions of all their existing Mac applications if they were to run speedily on the new system. It has paid off in countless ways, though...
Sure it did. And like Stross, I'm sure, I recall how OS X couldn't even play DVD movies when it first arrived. Developing a new system--even one based on older technologies like the Mach microkernel and the UNIX derivative FreeBSD--is a huge undertaking. But when you only have a tiny chunk of the market, as Apple did and does, you can take big steps like that. There is absolutely zero evidence that OS X today is any faster, smaller, more secure, or less buggy than Windows Vista.
A monolithic operating system like Windows perpetuates an obsolete design.
More BS. Windows is not monolithic except in the most pedantic sense (i.e. it does not employ a microkernel, a concept that dates back to the mid-1980s). In fact, Windows is highly componentized at a very deep level, thanks to work that occurred over several years and several Windows versions and that made Windows, get this, much less monolithic than it used to be.
We don’t need to load up our machines with bloated layers we won’t use. We need what Mr. Silver and Mr. MacDonald speak of as a “just enough” operating system. Additional functionality, appropriate to a given task, can be loaded as needed.
We have this today. It's called Windows. Maybe you've heard of it.
What Microsoft has done, however, is arbitrarily decide which software components are included in the desktop versions of Windows and which you can add and remove. (Windows Server is far more malleable; witness the Server Core variant of Windows Server 2008 as the obvious example. There is absolutely nothing like Server Core on the Mac OS X side, Mr. Stross. Indeed, all Apple lets you remove are some international languages and printer drivers, and then only if you perform a clean install of the OS.)
Avadis Tevanian, who worked on microkernel research as a Ph.D. student at Carnegie-Mellon, then on the Next operating system, followed by nine years at Apple where he oversaw the transition to Mac OS X, recalled how the decision was made when Apple’s market share was stuck at 3 percent and the company was losing money.
I guess not much has changed on the OS side. Yes, Apple is making money hand over fist, but its Mac OS X is still "stuck" with 3 percent market share, according to the very latest figures.
Microsoft should move its researchers into the heart of its systems development team. Windows OS X, a just-enough operating system built from scratch, is a product likely to be crucial to its future, too.
I am freaked to be saying this, but you, sir, know absolutely nothing about either Windows or Mac OS X and shouldn't be giving this kind of advice. Shame on you for publishing such a story. Microsoft is right now working on further componentization of Windows ("MinWin"), a project that could very well result in the type of "just-enough" OS that, no, Apple doesn't have today either. But even today's Windows versions (Vista and Server 2008) are architecturally quite different from--i.e. "superior" to--what you've described.