The resistance to Web services and subscription software in general, and to Microsoft .NET specifically, never seems to end. Why are computer users so stuck in today's outdated computing model that we can't see the benefits of moving toward a distributed, interconnected environment in which the whole is greater than the sum of its parts? What the heck happened here?
Let's step back a bit. In the early 1980s, Apple Computer pioneered the use of simple graphical OSs with the Lisa computer, which was quickly overshadowed by the even simpler Macintosh. The early Macs had tiny 9" black-and-white screens and shipped with so little RAM that the underlying OS was all but useless. But these machines triggered a renaissance of sorts, one that we're still unwittingly participating in today, almost 2 decades later.
That renaissance is so-called desktop computing, in which users employ a mouse and keyboard to interact with a PC that displays an interface based loosely on a physical desktop. Thinking of this bizarre desktop metaphor—which we blindly accept as if it actually makes sense—always reminds me of early personal information manager (PIM) applications, which graphically resembled our old day planners, complete with on-screen tabs and ring binders. Or the early phone-dialer applications, which included graphical telephone handsets and on-screen phone-style touch pads that users could click with the mouse. We roll our eyes at products like these today. Why doesn't the equally outdated PC desktop elicit the same response?
To be fair, products such as the Mac and Windows have at least attempted to abstract the desktop. Users interact with an interface called a desktop, as well as with virtual file folders and documents, but these entities don't graphically resemble their real-world counterparts (and we can be grateful for that: some UIs actually have attempted to duplicate actual desks graphically). When the desktop-based UI went mainstream in the 1980s, most users' machines had, at best, dual 3.5" disk drives that offered less than a megabyte of disk space; the elite few had tiny hard disks. In either case, the desktop metaphor made sense because of the limited computing resources available.
Flash-forward to 2002, and the computing power we have at our fingertips has increased exponentially. However, the UI we use remains largely unchanged. Sure, it's prettier. But we still use the outdated desktop (although Windows XP goes a long way toward melding a task-based interface with the desktop). We won't make any significant improvements in productivity until we leave the desktop metaphor behind, and the reason we must do so has less to do with increased PC storage space than with the world outside our office window.
We live in an interconnected world in which more resources exist outside our PCs than on their hard disks. The problem we face is that the supporting structure needed to bring all the elements of our interconnected world together doesn't exist yet. We can get breaking news, up-to-the-second weather reports, and the latest sports scores at the click of a mouse. We can publish and subscribe to online electronic calendars and access our desktop files remotely. And we can interact with people on the other side of the globe, in real time, by using everything from text-based chat applications to graphically rich multimedia games with blazingly fast action. The elements are all there.
And yet, mindlessly, we boot up our PCs every day, load Windows, and launch desktop applications, just the way we did in 1985. The OS and the applications we run offer small "hooks," Band-Aid-like add-ons that provide some interaction with services on the Web. And we've taken baby steps, such as Microsoft's failed Active Desktop and Internet Explorer Channels, to try to shoehorn the new paradigm into the desktop metaphor. But like the text-based DOS applications of the late 1980s that incorporated mouse support, those products were too little, too late.
That situation mirrors where we are with .NET. The .NET technologies offer a new way of working, a new way of accessing information, and a new way of computing. The situation is confusing and different, and it scares people, although many can't explain why. As I've discussed in .NET UPDATE before, however, the biggest problem is that Microsoft and the companies that support .NET technologies have done little to demonstrate why and how .NET can make life better. They've been touting the future for 2 years, with little to show for it.
My hope for .NET is that Microsoft will deliver something tangible and useful, as Apple did with the first Mac—something that makes people question why they're stuck in the past. Because if .NET does just the same old thing, it will be nothing more than the next phase of desktop computing. But I believe .NET is more important than that. Unfortunately, Microsoft has produced too little in the way of concrete innovation to back up my statement. I still feel like a .NET apologist.
Maybe everything will change with Longhorn, the next Windows release, due in 2005. I've heard whispers about amazing full-motion video-based UIs, database-backed storage schemes, and more pervasive hooks into cyberspace. But, exciting as all that sounds, Longhorn still looks a lot like a desktop OS to me. I hope Microsoft is thinking beyond the desktop. Windows needs to be a Web services consumer above all, an OS that doesn't get in the way of our data, wherever it might be stored.
Or maybe we'll simply glide along with our desktop and folders, while Microsoft adds more .NET hooks until Windows becomes an unwieldy Frankenstein's monster of functionality, with parts culled from various unrelated technologies and fused together in a way that sort of works but doesn't necessarily make sense. I hope that's not where .NET is headed; I really do.