A Stack of Magazines 100 Issues High

A leisurely look at the history of NT

Windows & .NET Magazine is 100 issues old! The magazine has changed its name a couple of times, but then so has Windows NT. I first heard about NT back in 1989—when IBM and Microsoft had been collaborating on OSs since 1981 and seemed as if they always would. At a technical briefing for geek journalists, Microsoft was showing us early betas of OS/2 2.0, the company's first serious shot at an OS that exploited the capabilities of Intel's 80386 chip. Multitasking. A good GUI. A lot of memory. We ooohed and aaahed. But then Microsoft started telling us about the even more amazing OS/2 3.0, which it was developing concurrently.

What we didn't know then was that OS/2 3.0 wasn't actually a variety of OS/2 but rather the working name for a new OS that had its roots in Digital's VMS OS. The system's primary architect, Dave Cutler, came not from a PC background but from the minicomputer arena. Microsoft was using the OS/2 name primarily for marketing continuity, but OS/2 3.0 received the Windows NT moniker when IBM and Microsoft parted their decade-long OS alliance in 1991.

Forgotten Goals
NT has become a huge success, but as I look back, I realize that almost none of Microsoft's original intentions for NT turned out to matter—or they disappeared altogether. For example, NT was supposed to be a multipersonality, architecture-independent, microkernel-based OS. At the time, the term multipersonality described the notion that an OS could run software from multiple platforms. NT would feature subsystems that permitted the OS to run both Windows and non-Windows software directly or with minimal porting. The first version of NT had three subsystems. The first was the Win32 subsystem, which ran native 32-bit Windows programs. That subsystem has survived through Windows 9x, NT, Windows 2000, Windows Me, Windows XP, and Windows Server 2003, but it has grown and matured over the past 10 years. The Win32 subsystem also supports the NT Virtual DOS Machines (NTVDMs) that administrators still use to run DOS or 16-bit Windows programs.

You might remember the other two subsystems, OS/2 and POSIX. Microsoft designed the OS/2 subsystem to run OS/2 applications—or more specifically, OS/2 1.x applications. The intention was that you could run applications built for LAN Manager—an OS/2 1.3-based product—on NT (or port them to NT). The POSIX subsystem was a UNIX application environment that, as far as I know, could never run any UNIX programs out of the box but instead let you port UNIX programs to NT. This functionality received much press at the time of NT's release, but I don't recall any UNIX programs actually being ported to NT's POSIX subsystem, except for UNIX tools (e.g., chown) that have shipped with resource kits for years. Despite all the hoopla over NT's sparkling personalities, they're gone now—OS/2 disappeared in Win2K and POSIX in XP.

NT also promised to be an extremely reliable, architecture-independent OS—a huge idea that required PC developers to change the way they thought. The OSs of the 1980s changed by acquiring more bells and whistles (e.g., memory, multitasking, GUIs), but reliability wasn't a chief concern. In the mid-'80s, mainframe computers could run for weeks, months, and even years without a reboot. In contrast, the idea of stabilizing a PC-based computer so that it required only one reboot per day was considered the stuff of dreams. All the OS/2 developers that I talked with agreed that reliability was important, but their PC experience seemed to lead them to set modest reliability goals. A popular goal was that if a PC didn't crash more than a couple of times a week, it was fine.

Cutler expected OSs to run and run and run—and NT was no exception. Bob Muglia, a member of the NT team, said to me in 1992, "Once you get your server shaken down properly, you should never have to reboot it—or we've failed." Muglia was exaggerating, but today I know of many NT-based systems (including Windows 2003, XP, and Win2K systems) that need a reboot only for hotfix installations, server moves, or service pack applications. Obviously, not all NT-based systems are so reliable, but try installing LAN Manager 2.0 and see how long you can keep it running on your system.

To push NT to the level the team wanted—improving the OS's reliability, for example—Cutler needed to yank his developers out of their PC-oriented, Intel-centric mindset. To do so, rather than build the OS around the 386 architecture, he told the developers that NT's design should permit implementation on virtually any 32-bit processor. What better way to break developers of Intel habits than to build the first version of NT not on a 386 but on a MIPS R4000 chip—a powerful but completely Intel-incompatible microprocessor? An intriguing aspect of that first demonstration of OS/2 3.0 was that Microsoft used a MIPS-based computer. I remember thinking, "Perhaps the world is changing. Maybe someday soon we'll debate whether to run NT on an Intel box, a MIPS machine, or Macintosh hardware!"

That debate never occurred. Whereas NT 4.0 supported four mutually incompatible architectures—Digital Alpha, Intel x86, PowerPC, and MIPS—by the advent of Win2K (aka NT 5.0), the roster of processors that NT supported had shrunk to one: the Intel x86.

Fairly soon, you might again have the ability to choose among alluring architecture options. NT already supports the Itanium architecture, which admittedly isn't exactly taking the world by storm. Windows 2003 will almost certainly support (although not in its initial release) AMD's Opteron chip, an Itanium challenger that might offer an attractive price/performance ratio.

Why NT Succeeded
As you know, NT succeeded, but not because of how well or badly its intended multiple personalities or range of supported processors worked. NT has done well for several reasons: It's a Microsoft product, it's always had a wide variety of functionality "in the box," and it's pretty reliable for a PC-based OS.

When NT shipped as a shrink-wrapped product in 1993, I installed NT 3.1 Advanced Server on a computer that had been running LAN Manager. LAN Manager, which in my experience never ran particularly well, hadn't seen much use on my system. My expectations for NT 3.1 were fairly low—I figured that, like LAN Manager, NT would crash in interesting and amusing ways, perhaps generating a good story or two for my classes, consulting, and columns. In short, I came to scoff, but I stayed to play.

NT's Setup program ran smoothly, installing a good, basic file-and-print server on my former LAN Manager computer. In the coming months, I found that NT could run for weeks at a time without requiring a reboot. The OS's built-in support for the NetBIOS programming interface and the NetBEUI protocol worked smoothly with my company's Windows for Workgroups client software. And NT was fairly fast, producing snappier network response than LAN Manager had.

Even more impressive was the array of stuff that NT included free. At the time, our main corporate network ran Novell NetWare 3.11, so we were accustomed to scrambling for cash whenever we wanted NetWare to, for example, support remote access. In contrast, NT 3.1 featured built-in remote access, support for both the NetBEUI and IPX protocols, a NetWare client, and a bunch of other cool stuff. You didn't need to pony up to Microsoft every time you added a new user: For a flat $1500, you got server software that permitted an unlimited number of user connections. As time went by, NT continued to add desirable features—for example, TCP/IP, a Web server, and firewall software (in Windows 2003)—to its OS at marginal cost.

Much of the reason that Microsoft gave all that away was that, at the time, the company wasn't yet at the top of the network server software game. Giving away features for which other vendors charged or for which you needed third-party software grabbed the attention of cheap network administrators like me. Clearly, Microsoft intended many of those free items only to set the hook; by NT 3.5, Microsoft was charging by the client.

The Myth of Security
Success came slowly to NT, and as it did, the myth of NT security rose to prominence. Many proponents of NT 4.0 and NT 3.51 loved to brag about the OS's inherent security and freedom from attacks. At the time, I had a suspicion about the reason for NT's security: So few people used NT that hackers didn't bother trying to compromise it. As NT's popularity increased, administrators gradually realized that the OS contained security holes through which you could drive an asteroid. However, this part of the NT story seems to be moving toward a happy ending. Today, Microsoft is doing a much better job at securing NT.

As Windows & .NET Magazine begins producing its next 100 issues, I think our chosen OS will see some good and—I fear—bad changes. On the good side, I think Microsoft mostly understands the importance of security. New developments such as Palladium could make security ubiquitous and, more important, simple—many systems today are insecure not because securing them is impossible but because securing them is too much trouble. On the bad side, Redmond is in the driver's seat now.

Perhaps the most pernicious example of Microsoft's supremacy is its licensing. Sometimes, you feel as if you need a law degree, an IQ of 250, insider information, and astonishing patience to understand the minutiae of Microsoft's software licensing. But everybody knows why the company's licensing structure constantly changes: to maximize revenue. The question of whether that's good or bad is a question about the future, and this column has been a look at the past. For the answer, check back with us at issue 200.
