
NT vs. Linux

Three benefits of the people's OS

New OSs fascinate me. I love to try them out, hoping to find one that offers reliability, flexibility, and compatibility. In the past 20 years, I've worked with numerous OSs, including Digital Research's CP/M and its successors CP/M-86 and MP/M; TRSDOS on Radio Shack's TRS-80 systems; the Atari 520ST's TOS, with its Digital Research GEM GUI; the Amiga's AmigaOS; the University of California at San Diego's UCSD P-System (an early Java-like approach to machine independence); IGC's VM/386, Microsoft Windows/386, and Quarterdeck's DesQ (three attempts to build multitasking DOS OSs); Microsoft OS/2; IBM OS/2; Banyan VINES; Novell NetWare; and, of course, Windows NT. None of these OSs has provided all the functionality I need. Some have fallen short because of technological limitations, some because of poor market acceptance, and some for both reasons.

The latest high-profile OS is Linux. Linus Torvalds created this UNIX-like system when he was a college student in 1991, and thousands of other programmers have since expanded the OS. One of the reasons Linux has gained so much attention in the past 2 years is that so many people have donated their time to work on the OS (i.e., few Linux developers are receiving pay for their efforts).

I've recently learned a lot about Linux because I'm writing a book for NT experts that explains what Linux is, what the OS can do, and how it can make NT administrators' jobs easier. Some of Linux's strengths that I've discovered include its ability to run as a server without requiring a GUI, its robust built-in tools, and its remote-controllable nature.

Optional GUI
NT's GUI simplifies users' interactions with computers and makes NT easier for new administrators to learn than older network operating systems (NOSs) such as NetWare 2.x and 3.x. However, GUIs sap a computer's resources, stealing memory and CPU power that the system could devote to running server applications. I've often wished that NT were a 32-bit OS that you could easily boot to a DOS-like command-line interface. I'd love the option to start up the GUI when I needed an administrative tool and shut it down when the server was simply performing server functions. With the GUI out of the way, that RAM and CPU power would make the NT machine faster and more stable, and the server would be a more efficient domain controller or WINS, DNS, or DHCP server. Unfortunately, NT's GUI is tightly integrated with the OS.

In contrast, Linux's GUI isn't tightly bound to the OS. You can boot Linux to the command line and not run a GUI. This feature is one of the OS's greatest strengths because if you avoid the GUI, you can run the OS on fairly slim hardware. For example, a 100MHz Pentium processor with 32MB of RAM runs fine as a DNS or Web server on Linux.
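
To see how simple that choice is, look at /etc/inittab on a SysV-style distribution such as Red Hat: the initdefault line decides whether the machine boots to a plain text console or to a graphical login. The excerpt below is only a sketch (runlevel numbering can vary from one distribution to another); runlevel 3 gives you a multiuser text console, and you can still bring up the GUI on demand with startx and drop it again when you're finished.

   # /etc/inittab (excerpt) - boot to the text console, not the GUI
   id:3:initdefault:

   # When you want the GUI, start it by hand from the console:
   #   startx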

Another strength of an OS that boots to a command prompt is reliability. The lack of a GUI results in fewer moving parts, which means fewer things to go wrong. In NT, a bad graphical display driver can prevent the OS from booting. Because Linux can run without a GUI, this problem doesn't occur.

Yet another of Linux's benefits over NT is that because you can do most of your administrative work in Linux from the command line, you can easily script common administrative tasks. Putting mouse clicks into a batch file is difficult, if not impossible, so NT administrators are constantly looking for command-line analogs to the common GUI administration tools.
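
As a sketch of what that scriptability buys you, here's a short Python script that bulk-creates user accounts by driving the standard useradd command in a loop. The account names are invented, and you'd run something like this as root; the same idea works equally well as a plain shell script.

   # Sketch: bulk-create Linux accounts by scripting a command-line tool.
   # The user names below are placeholders; run this as root.
   import subprocess

   new_users = ["jsmith", "mbrown", "tlee"]

   for name in new_users:
       # useradd -m creates the account and its home directory
       subprocess.run(["useradd", "-m", name], check=True)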

If you want your system to boot to a GUI shell, the setup programs in Linux's Red Hat and Caldera flavors make the task simple. If you use the GUI, set your screen resolution to the maximum available. Red Hat's default GUI, called GNOME, includes a font model that looks awful at 640 x 480, acceptable at 800 x 600, and good at higher resolutions. You need a lot of hardware to get good performance from a Linux GUI system—the minimum I recommend for smooth operation is a Pentium II processor with 64MB of RAM. I've heard that Linux isn't as silicon-hungry as NT is, but perhaps GNOME is more resource-intensive than other Linux GUIs are. (Of course, you can probably run GNOME on a 100MHz system with 32MB of RAM, just as you can run Windows 2000—Win2K—on such a system, but I doubt that you'd enjoy doing so.)
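
If you do run the GUI and want a resolution higher than 640 x 480, the setting lives in the X server's configuration file (on the Red Hat and Caldera releases I've used, /etc/X11/XF86Config). The fragment below is only a sketch: the device and monitor names must match the names you gave those sections elsewhere in the file, and the Modes line lists the resolutions X will use, in order of preference.

   Section "Screen"
       Driver     "svga"
       Device     "My Video Card"    # must match the name in your Device section
       Monitor    "My Monitor"       # must match the name in your Monitor section
       Subsection "Display"
           Depth   16
           Modes   "1024x768" "800x600"
       EndSubsection
   EndSection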

Robust Built-In Tools
Early PC-based NOSs tended to contain a small, focused set of tools, and many firms tried to nickel-and-dime customers who wanted to build full-featured networks. For example, 15 years ago, 3Com sold PC networking software as separate print and file server products. Similarly, NetWare 3.x didn't initially include a remote access tool—you needed to buy a separate asynchronous gateway module.

NT 3.1 stood out because it bundled several modules as standard equipment, including a dial-up module. Microsoft has continued this trend, adding a Web server, an HTML editor, a DNS server, and other pieces.

Linux's standard tools go far beyond NT's. Linux includes an Internet mail server module, a multitude of IP routing protocols, a high-powered graphical paint and draw program, a Samba module that lets the OS attach to an NT file server or appear as one, and a basic firewall tool. Furthermore, these tools are reliable because Linux developers ported their code from UNIX programs that millions of people have used for many years. For example, Linux's DNS server module comes from the Berkeley Internet Name Domain (BIND) code that UNIX systems have used in various forms since the mid-1980s to run the worldwide DNS hierarchy.
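
To show how little work the Samba piece takes, here's a sketch of an smb.conf that makes a Linux box appear to NT clients as a file server with one writable share. The workgroup, machine, and path names are placeholders, and the file usually lives in /etc (as /etc/smb.conf or /etc/samba/smb.conf, depending on the distribution).

   [global]
      ; the NT domain or workgroup the server should appear in
      workgroup = MYDOMAIN
      ; the name NT clients will see in Network Neighborhood
      netbios name = LINUXBOX
      security = user

   [public]
      ; a simple writable share
      path = /home/public
      read only = no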

Remote Control
The difficulty of remotely administering an NT server has always irritated me. Clever administrators have learned tricks such as using Remote Command (RCMD) with regini or regedit. However, in NT, the remote administration experience is quite different from the local administration experience. Each type of administration requires you to learn a separate set of tools. This requirement makes sense because PC OSs have always been tightly bound to the local keyboard and display. In fact, until recently, most PCs weren't networked and therefore didn't need to interact with other keyboards or screens.

Linux's UNIX roots come in handy in the remote-control arena. The original UNIX machines were expensive minicomputers with terminals attached through serial ports. The only difference between a local and remote connection was that local connections tended to be faster (4800bps to 19,200bps) than dial-up connections (110bps, 300bps, or 1200bps). But the software to communicate through a serial port is the same whether the port connects to a terminal directly or through a pair of modems and a phone line. Even now, after UNIX has gained a GUI, starting a graphical session remains as easy on a distant machine as it is on a local machine (assuming you have permissions to start a session on the remote machine). Thus, if you need to administer a Linux machine in another country, you need only telnet to the machine. But if you need to administer an NT server in another country, you must travel to the machine.
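
Here's a sketch of what that looks like in practice: running a graphical tool on a distant Linux server but displaying it on the machine at your desk. The host names are placeholders; you allow the server to draw on your local X display, telnet in, point the remote session's DISPLAY variable back at your desktop, and launch the program.

   # On the machine at your desk, which is running an X server:
   xhost +server.example.com
   telnet server.example.com

   # Once you've logged in on the distant server:
   export DISPLAY=mydesk.example.com:0
   xterm &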

Not Without Its Faults
Despite Linux's benefits over NT, the OS isn't all wine (WINE, another free Linux tool, is a subsystem that lets you run Windows programs under Linux) and roses. In my next column, I'll discuss some of Linux's weaknesses.
