
The UNIX Perspective

UNIX Still Has a Better Battle Plan

Rumors of UNIX's impending death are greatly exaggerated. Although some equate column inches with vitality, that one-dimensional measure ignores the reality that often "no news is good news." Just as crime and heroism (and even "hype") are the stuff of headlines, little space is given to the uneventful.

Although there are many "niche players," the computer industry has become stratified into three layers. The high end includes the traditional mainframe market, long dominated by IBM. The low end--such as it is--contains a myriad of PCs, most of which are successors to IBM's original PC.

In between is the realm of "enterprise" servers and workstations. These systems run on hardware ranging from large PCs to massively parallel multicomputers. There is no dominant vendor in this realm, but UNIX has become the operating system of choice for users with large numbers of networked systems. Windows NT and its siblings have become contenders in a subrealm that might be called "local area servers" or "departmental application servers."

The Desktop
Where standard equipment for a white-collar office worker was once a typewriter and a telephone, the former has been replaced by a PC. In most cases, that PC runs one version or another of Microsoft Windows, but a sizable segment of the user community does not. Millions of older systems run MS-DOS or early Windows versions and cannot be upgraded economically to meet the requirements of current versions. Millions of other users prefer OS/2 or have chosen Macintosh systems or "true" workstations, which usually run UNIX. These preferences are the last bastions of individuality left in some organizations, which explains why attempts to mandate standard PC configurations often meet with strong resistance, especially among technical professionals.

The long and the short of it is that the desktop will remain a mixed and sometimes contentious environment for years to come. There is no indication that Microsoft's dominance will weaken anytime soon, but it is also clear that a significant number of people have determined that for technical or personal reasons they will not "do Windows."

A rapidly growing segment of students and computer professionals is now using Linux, a free UNIX implementation that includes Windows application programming interface (API) emulation along with "unbranded" implementations of several open systems technologies. It would be unwise to ignore the Linux crowd: They will be the ones making the computer-buying decisions in another decade.

UNIX
AT&T's Bell Laboratories was already among the largest minicomputer users 30 years ago. Bell researchers Ken Thompson and Dennis Ritchie envisioned an operating system that would provide large-system capabilities on minicomputers for individual researchers. They had been involved in the AT&T/GE/MIT development of the Multiplexed Information and Computing Service (Multics) and thus included Multics features in their "UNIX" system. Where Multics had been developed in PL/I, Ritchie designed the C language, which was used to write more than 90% of the UNIX code. UNIX spread rapidly within Bell Labs, and AT&T soon made it available to educational institutions on very attractive licensing terms. Over time, UNIX has become the de facto standard minicomputer operating system within academic and research communities everywhere.

Open Systems Technologies
The open systems movement arose to guarantee the portability of applications from one platform to another. Because conversions between mainframe platforms--and even between dissimilar PC platforms--are expensive propositions, large users in the government, academic, and commercial sectors demanded an alternative: at the least, APIs that are open, interoperable, and portable.

Early on, the definition of open systems varied from vendor to vendor, sometimes to incredible extremes. Even now, there are those who tout the Windows APIs as open, ignoring the significant fact that they are dictated by a single company without any public consultation with users or system vendors.

With users demanding an end to all this foolishness, the definition of "open systems" has become much clearer, and the various participants in what had been called "The UNIX Wars" have come to terms. There might never be a single definition that satisfies both the purists and the pragmatists of the open systems community; however, several basic principles apply to any definition of "open":

  • The API definition must be in the public domain or available to anyone at a modest price.
  • The API definition cannot be unilaterally changed or influenced by any one vendor.
  • The vendor must not become a "single point of failure."
  • Reference implementations must exist on distinctly different hardware platforms.
  • Exhaustive tests must exist to prove conformance of any given implementation.

The Windows APIs violate the first three points. These APIs have undergone incompatible changes during each of their last three incarnations, and there is no indication that the future will be more disciplined.

Because of these incompatibilities, Windows upgrades have given the appearance of "churning" the user base to force application upgrades. That's merely a bad situation at the single-system level, but forcing a group of networked systems to upgrade simultaneously is unacceptable.

The third point has to do with application survivability in the event of the failure--or breakup--of the vendor on which the application depends. I don't expect anything drastic to happen to Microsoft anytime soon, but the basic point remains: You shouldn't put all your eggs in one basket.

Other measures of open systems have been suggested, but the five listed protect the basic interests of the user community. Those interests center on software portability and interoperability across a wide variety of hardware platforms.

The user community plays an important role in this process; it's not driven solely by the vendors. This user involvement reflects a fundamental difference between the open systems part of the marketplace and the mainframe and desktop arenas. Unlike the other two, where a single company has become dominant, there is actual competition among open software vendors.

The intent of open systems is to specify and make available a set of common APIs. These APIs should work exactly the same way on a wide variety of systems, some of which run operating systems other than UNIX. Applications that adhere to the standard APIs should compile and operate correctly on any conforming system.
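
To make that portability promise concrete, here is a minimal sketch in C (the file name and the calls chosen are my own illustration, not drawn from any particular standard's example code). It restricts itself to ISO C and POSIX interfaces, so it should compile and run unchanged with any conforming compiler and operating system:

    /* portable.c -- a minimal sketch of source-level portability.
     * Uses only ISO C and POSIX interfaces, so any conforming system
     * should build and run it unchanged:
     *     cc -o portable portable.c && ./portable
     */
    #include <stdio.h>
    #include <unistd.h>    /* POSIX: gethostname(), getpid() */

    int main(void)
    {
        char host[256];

        if (gethostname(host, sizeof(host)) != 0) {
            perror("gethostname");
            return 1;
        }
        host[sizeof(host) - 1] = '\0';   /* ensure termination if truncated */
        printf("Running as process %ld on host %s\n",
               (long)getpid(), host);
        return 0;
    }

Nothing in the program cares whether the host is a workstation, an enterprise server, or a massively parallel machine; the API, not the platform, is the contract.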

This discussion of open systems is not specifically about UNIX. Other open technologies take precedence over the operating systems on which they are implemented. Two of these technologies are the Motif extensions to the X11 Window System and the Distributed Computing Environment (DCE). DCE provides a uniform and securable client/server infrastructure to applications running on everything from IBM mainframes to dozens of UNIX and other enterprise server systems to Windows NT and Macintosh PCs. Both technologies were specified and implemented under the guidance of the Open Software Foundation (OSF).

OSF was created to guide the development of open technologies that can be included in products from multiple vendors. Its sponsors and members in North America, Europe, and Asia--a remarkable cross-section of the computer industry and its major users--have agreed upon a common UNIX interface specification and conformance testing.

A recent positive development has been the new bond between X/Open, which sets standards, and OSF, which concentrates on implementations and basic research. OSF helps its sponsors pool their resources to produce provably compatible open technologies at a fraction of what it would cost each sponsor to develop them in parallel. With infrastructure products such as DCE, it's important to have coverage across as wide a spectrum of platforms as possible. Even Windows has DCE client support, which was engineered from the open specification.

The move to open systems has proved beneficial to many users. Organizations that have large networks of heterogeneous systems, including major financial institutions, academic/research communities, and federal/state governments, have received the greatest benefit. Companies can develop new applications knowing that they can be deployed on a wide variety of systems and that no expensive conversion will be necessary when they are moved from one system to another.

One multinational bank, described by representatives as typical of its class of users, keeps the bulk of its customer information in gigantic mainframe databases at several central sites. Over time, its traditional transaction-processing (TP) applications have been "distributed" so that the mainframes perform only database management (at thousands of transactions per second).

A network of UNIX systems executes the actual banking transactions and supports the user interface while providing message routing and security functions. Small branch offices might use only one or two personal workstations for everything. Larger branches and regional hubs use dedicated systems for the applications and support a variety of personal computers and dumb terminals at the teller windows and bank officers' desks.

Bank representatives said the bank gets its greatest value from its DCE-based client/server applications. In some cases, the bank will offer a product in a limited area, supported by DCE application code installed in the local office or regional center. If the product is successful, it can easily be installed across the network. With close attention to detail, conversion costs are almost nil.

Furthermore, multinational corporations are often faced with business realities that force them to use specific platforms in various countries. Again, the universality of UNIX and the other open systems technologies makes the choice of hardware only a secondary concern.

Choosing the Right System
I don't know how many times a friend has asked my advice on how to buy a system. My first question is always, "What will you use it for?" My second question is, "What will you really use it for?" With a little luck, the answers will resemble one another. Desktop personal computers are rendered obsolete within two years these days. For anything else, my third question is, "In three years, what will you be using it for?"

Many people select home computers for all the wrong reasons. They tend to focus on the latest fad and overlook important features--or the lack thereof. I am convinced that exactly the same problem occurs when selecting systems for the workplace. Here are my suggestions for a few common cases.

I won't argue the case of the stand-alone application system: It runs what it runs. Likewise, many users' needs are fully met by personal productivity tools, such as word processors and spreadsheets. Unless a specific application dictates the system selection, these users would be better served (and more productive in the long run) by trying out competing systems (preferably not limited to any one architecture) and choosing whichever best fits their work style.

Technical users, including programmers, need a system that is compatible with their major applications (or target platforms), which may well be an enterprise server operating system. Again, the application drives the decision. In my own case, the critical application is the set of tools that my team uses for software development: That justified the UNIX workstation in my office. My workstation also runs our word processor of choice and "groupware," both of which are UNIX versions of popular applications.

Following the concept of fitting the tool to its use, it makes sense to examine "client" systems next. Local systems that are used primarily as clients to remote servers should be optimized for that use. The network introduces a number of new issues, not the least of which is the security--in all senses of the word--of the connections between clients and servers. There are no savings to be had by skimping on the client/server interface.

DCE provides a rich and robust API, is actively (and compatibly) being enhanced, and is available on a wide variety of platforms. The guarantee of DCE is that you don't have to use the same platform for both client and server. You can choose to deploy clients on several different platforms and still use common servers. Because DCE client software exists for both UNIX and Windows systems, I'll call this point a "wash" at the client level, if there are no secondary considerations.
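
As a sketch of what that platform independence looks like to a programmer, consider the C fragment below. The interface and operation names (debit_account and so on) are hypothetical, invented for illustration; in a real DCE application, the IDL compiler generates the client stub, and the stand-in stub here exists only so the sketch compiles and runs on its own:

    /* rpc_client_sketch.c -- illustrative only; all names are hypothetical.
     * In a real DCE application, an IDL compiler generates a client stub
     * for debit_account() that marshals the arguments, locates a server
     * through the directory service, authenticates both parties, and
     * ships the call across the network.  The calling code reads the
     * same whether the server runs on UNIX, a mainframe, or Windows NT.
     */
    #include <stdio.h>

    /* Stand-in for the generated client stub, so the sketch runs without
     * a DCE runtime.  A real stub would perform the remote call. */
    static long debit_account(long account_id, long amount, long *new_balance)
    {
        (void)account_id;               /* unused in the stand-in */
        *new_balance = 1000L - amount;  /* pretend the server answered */
        return 0L;                      /* 0 == success in this sketch */
    }

    int main(void)
    {
        long balance = 0;
        long status = debit_account(1042L, 250L, &balance);

        if (status != 0) {
            fprintf(stderr, "debit_account failed: %ld\n", status);
            return 1;
        }
        printf("New balance: %ld\n", balance);
        return 0;
    }

The point of the sketch is that nothing in the caller names a platform: The same client source can be rebuilt against stubs that reach a UNIX server today and a different server platform tomorrow.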

For what I will call local--or departmental--servers, it makes sense to minimize the differences between client and server architectures. In the case of file servers, the server should be chosen to provide service to all the clients on its local network. In this situation, I would expect an NT server to be preferred for use with NT clients: Support becomes that much easier.

A heterogeneous set of clients might dictate a different outcome, however. File servers are usually busy enough that they are dedicated to that task and aren't expected to also run user applications.

Classic enterprise-server applications, which use (at least) multiuser servers, are across the bridge from single-user PCs. Many of these applications fall into the "bet your business" category, including the day-to-day applications that keep products flowing out and money or raw materials flowing in. No company, large or small, can afford outages on these systems.

Nor can companies afford the damage that can occur when "spoofs" are perpetrated by unauthorized visitors. DCE provides mutual authentication within the security domain--and between domains--which is not matched by any widely distributed proprietary system.

For this class of system, applications written to open APIs are the only guarantee of portability later. These applications have life spans that often exceed five years, and they require stable APIs that will be safe from incompatible changes when the underlying software is upgraded.

Use of the open systems APIs also ensures that server applications can be replicated on additional platforms whenever and wherever they're needed. This is where open systems perform exactly as intended. There are improvements yet to be made, but all the essentials exist now, including high-performance TP for those applications that require it.

It is as enterprise servers that the open systems technologies in general--and UNIX in particular--earn their keep. Although the promise of open systems was once derided as a pipe dream, it is now reality and will remain so into the next millennium.

Having moved beyond the factional disputes of a few years ago, vendors and users of open systems now speak with a common voice. The various open systems technology offerings continue to mature and respond to the evolving needs of their users.

Surrounded but Not Replaced
Minicomputer vendors established their presence a decade ago by using their systems to "surround" traditional mainframes. This strategy was successful largely because it allowed the two environments to complement one another. The strengths of each worked synergistically.

Desktop systems now surround enterprise/local-server systems. That's not necessarily a bad thing, and it provides new opportunities for synergy. Open systems and UNIX both solve an important set of problems for their users. Although they have some solutions and users in common, each has its own strengths outside common ground. As the movement continues from isolated user systems to networks of systems, the need for and advantages of truly open systems technologies will become more evident.
