Intergraph is the largest Windows NT development site in the world; Digital Equipment is number 2; Microsoft is number 3! Intergraph is a US-based multinational company that has been providing both hardware and software products for the technical market for more than 20 years. Several years ago, the company undertook the significant task of converting its line of UNIX-based software solutions to the NT environment.
Originally, the company provided software solutions based on Digital minicomputers and Intergraph's own intelligent graphics terminals. Later, it developed RISC-based graphics workstations running UNIX. During this RISC/UNIX development phase, Intergraph created and marketed about 800 technical applications, including packages for Computer-Aided Design/Manufacturing/Engineering (CAD/CAM/CAE), as well as vertical-market packages for civil engineering, government, mapping, imaging, and publishing.
As a pioneer in RISC-based graphics workstations, the company created a proprietary UNIX-based GUI and application programming interface (API) environment called Environ V. Environ V is a typical windowing system: it supports drawing graphics, responds to mouse and keyboard input, and so on. It also includes extensive support for Intergraph's own bit-mapped and scalable-font technologies. A number of more complex development-tool layers exist on top of Environ V, including tools for building user interfaces (I/Forms) and tools for network file management (NFM).
After evaluating various open-systems models and doing some development in that direction, Intergraph learned that Windows NT had entered beta testing. When it appeared that NT had the characteristics and facilities the company was looking for, Intergraph decided to investigate.
The first task was to verify that NT had the necessary resources to support large, complex applications. Historically, Intergraph applications had pushed the limits of the UNIX environment, making extensive use of client/server communications, multitasking, remote file access, SQL database access, and interprocess communication (IPC), while relying on UNIX file systems. In addition, Intergraph required device-driver support for graphics displays with 2D and 3D hardware acceleration, graphics tablets, optical jukebox storage, various LAN and WAN technologies, and more.
The Test Case
After verifying that Windows NT was a capable environment, Intergraph began a project to port NT to its RISC Clipper processor. About this time, Intel released the Pentium processor, which appeared to have enough integer and floating-point performance to support Intergraph's technical applications, especially in systems supporting symmetric multiprocessing (SMP) arrangements of high-performance Pentium (P54C) processors. This enabled the company to support PC compatibles with medium- to high-end graphics displays running the Intel version of NT. (Intergraph plans to support some non-Intel NT platforms in the future.)
I spoke with a number of Intergraph's key developers about porting software from UNIX to NT. In general, they consider the NT platform to be stable and superior to UNIX in many respects, including the availability of development tools (especially Visual C++). They did, however, experience two problem areas:
- Differences in the semantics of socket behavior
- Lack of a UNIX-like fork mechanism
The following sections cover some of the issues these developers faced in porting from UNIX to NT and present some of their personal observations.
1) POSIX Subsystem
At first, the existence of a POSIX subsystem for NT appeared to vastly simplify the conversion task. In practice, however, POSIX applications are restricted to text-style interfaces that run in an NT command-line window and cannot include Win32 system calls. Likewise, Win32-based applications cannot issue most POSIX calls. This makes the POSIX subsystem suitable only for simple applications or for server programs that have little or no user interaction.
2) Portability Layers
Matching UNIX features to those in the NT environment met with mixed results. Some of the UNIX facilities that Intergraph applications depended upon exist in similar forms (e.g., socket-based IPC), while others could be built on top of existing NT facilities (e.g., I/Forms, which was rewritten as Shamrock). Some resources could be converted into NT equivalents; for example, Intergraph's proprietary fonts were turned into TrueType fonts. NT also offered new facilities, such as threads, which were adopted where appropriate.
Many Intergraph applications were written in the scripting languages of middle-level products, such as MicroStation (MDL) or Database Access (DBA). In these cases, once the underlying product was ported, the applications could be ported to NT with remarkably little effort. In particular, Intergraph's third-party vendors got an almost-free ride onto NT because most of their applications are written in these scripting languages.
Intergraph developers also found that although several of the NT system facilities are supposed to be exact analogs of UNIX facilities (e.g., creating other processes and the socket-style interface), they are different enough to cause problems. In addition, Intergraph's I/Forms product used a look-and-feel that didn't comply with the strict NT GUI standards.
Because of these disparities, several of the Intergraph developers would have preferred total rewrites to porting the existing code. However, the time required to create clean implementations that complied 100% with the NT GUI standards would have been considerably longer. Given the number and sizes of the products involved, porting was considered an acceptable compromise; new development, however, will comply more fully with NT standards.
As one developer put it: "Because Intergraph had ported to so many versions of UNIX, we had developed portability layers which took care of the variations. These layers allowed us to port to NT quickly, but they also isolated us from features specific to NT. The biggest issue was in the GUI area where Intergraph had a large investment in forms for screen layout. Since NT seemed \[to be\] lacking in a highly productive Forms Builder, \[we\] decided to port the existing UNIX utilities. The effect was a rapid transition but a resulting product which fell short of the Windows look-and-feel. A UNIX vendor moving to NT should make a commitment to strict compliance with the NT GUI style.
"If we had known in advance about some of the pitfalls we would encounter, it would have been obvious that we should rewrite the code almost from scratch. However, since our base platform changed while we were in development--due to all those NT beta releases--and we had a tight schedule to meet, we didn't have the luxury of learning more about NT before we had to start programming with it. That was probably the source of many of our difficulties."
3) Sockets IPC
Sockets refers to an IPC mechanism originally designed at the University of California at Berkeley, which can be thought of as a variant of named pipes. Basically, one process "publishes" an outgoing socket, and another process--on the same, or another machine--connects to that socket. Once the connection is established, either process can exchange information reliably and in synchronization with the other process.
Windows NT and various implementations of TCP/IP under Windows 3.1 implement an IPC scheme called Windows Sockets (WinSock) that interoperates fully with UNIX sockets on the wire: a WinSock program can connect to and exchange data with a UNIX peer. The WinSock API, however, is similar to--but not exactly the same as--the UNIX sockets API. In particular, once a UNIX socket connection has been established, you get back a file descriptor that works with the same I/O calls you would use to read and write files. In WinSock, you can use only the specific procedures provided in its API to read and write data. Furthermore, socket inheritance by a created process in NT is slightly different from socket inheritance by a forked process in UNIX. These two issues required a fair amount of recoding of IPC routines.
According to one of the development-team members: "The hardest part of moving from UNIX to NT was finding the 'hidden differences' between UNIX and NT. In a lot of cases, we had to rewrite some code because, while NT looked the same on the surface, the underlying mechanisms were completely different from UNIX's. The WinSock API was the source of a lot of these misunderstandings. The library calls were the same, but, for instance, sockets are not inherited over a Create Process, as they are for an exec(), and sockets cannot be used with function calls that use file descriptors (i.e., read, write, close)."
4) Child Processes
In UNIX, you use the fork function to create a child process, and there are several subtle variations on exactly how to do it. NT has a similar mechanism, CreateProcess. However, the exact semantics differ from UNIX's: how the call works, how much of the parent's environment the child inherits, and which of the parent's facilities carry over to the child. The differences necessitated numerous changes. In the words of one of the developers: "My biggest wish for an NT feature is a plain and simple fork()."
5) File Systems
In UNIX, there are two primary file systems: the System V file system (S51K) and the Berkeley Fast File System (FFS). Both are fairly versatile and robust and support a rich set of attributes and permissions. For CD-ROM, both UNIX and NT support the CD-ROM File System (CDFS) standard. The NT File System (NTFS), supported by NT, and even the High-Performance File System (HPFS) from OS/2 are in the same class as S51K and FFS but are different enough to cause a few problems in porting complex software. In contrast, the File Allocation Table (FAT) file system, used by DOS and supported by NT, is considerably less sophisticated and robust. Currently, if you have a dual-boot system--DOS and NT--only FAT file systems are visible to DOS, so many users configure all or part of their disks in this format. (Even DOS applications running under NT have limitations in accessing NTFS features.)
One of the developers recalled: "We had to decide if we should be bound to NTFS. We decided to adopt a policy of file-system independence, meaning that the products had to install and run on either FAT or NTFS. This turned out to be a good decision since many users will want to run on FAT to allow them to boot DOS when needed. It also turned out that most of the security features in NTFS are available without impacting the application source. If you use NTFS, you pick up the security features automatically."
6) NT Development Tools
Under UNIX, development tools range from primitive to sophisticated, but few, if any, have the polish expected in the PC market. With the potential rewards so much greater in the PC arena than in the UNIX world, there is intense competition and innovation in PC-oriented compilers, application frameworks, and debugging tools. Much of that polish has carried over into NT.
One powerful feature of the Microsoft development tools for DOS and NT is the Microsoft Foundation Class library (MFC), a set of C++ class wrappers for Windows API calls that helps move you into a more object-oriented paradigm and hides most of the differences between the Win16 and Win32 APIs. (You can build applications for either environment from a single source file by changing the Nmake file.) However, as nice as MFC is, the current release still has a long way to go in terms of threads and internationalization.
As one developer observed: "We found that Visual C++ is the leading-edge language, but it was decoupled from some NT activities at the outset. For example, it's not thread-safe, and it's not i18N-enabled. \[The i18N standard deals with internationalization.\] This lack of clean integration has caused some problems."
Another developer commented: "It is impossible to compile anything complex under NT from the command line (just try to do the equivalent of cc asdf.c). An Nmake file is a must. And Nmake, despite its name, is not Make. We found it impossible to port UNIX Make files."
7) Threads

Windows NT is designed to support threads (lightweight processes). Threads let you organize a single program into multiple, mostly independent pieces--although there are ways for them to exchange information and synchronize when required. On an SMP machine, the threads can be distributed across more than one processor. Only a few UNIX variants, such as SCO SMP, have a similar capability.
One of the developers summed it up this way: "We found threading to be a very daunting task on large, layered applications. You must distinguish between thread-safe and threaded. It's difficult to build on foundations which aren't 100% thread-safe. It also \[requires a major redesign for\] applications to become threaded. Without threading, \[however,\] some applications get little or no benefit from multiprocessors."
8) Local vs. International
Localization and internationalization are two approaches to making software available in multiple languages, such as English, German, and Japanese. The problems involved in supporting many Asian languages are considerably more complex than those for European languages. For one thing, the character glyphs are more complex. For another, Asian languages have more than 256 characters--hence, more than eight bits are required to represent one. A character-encoding standard called Unicode addresses many of these problems and supports the character sets of most European languages as well. (Unicode uses 16 bits per character.)
Localization requires modifying your source code to change all English text to the new language; you typically end up with one copy of the source code for each language. Internationalization requires developing your program in a generalized way, so that all language-specific material is isolated in a resource file. To move your application to a new language, you then create only a new resource file; the original source code and design remain unchanged. If done properly, all text-related code supports full 16-bit Unicode, which lets you create resource files for Asian languages. The common shorthand for internationalization is i18N ("i", 18 letters, "n").
With Intergraph's extensive international sales and operations, internationalization is a high priority. Unfortunately, it is only beginning to be a high priority for Microsoft. According to one of the Intergraph developers: "We tried to use Unicode but found a lack of commitment from Microsoft and other vendors, so we backed off. Microsoft is still a localization company, but it is trying to move to internationalized code. It still doesn't have a consistent i18N story across its operating systems. MFC is not internationalized and still has invalid string assumptions. Internationalization was extremely difficult."
9) Object-Oriented Design
Intergraph developers have been working with object-oriented design techniques for about eight years. One of the development layers they created for UNIX was called Object Manager. A sophisticated, high-end CAD package called Engineering Modeling System (EMS) and several other advanced products were built using this technology. Rather than port this layer, Intergraph embarked on an ambitious project to create an entirely new object-oriented development layer for NT and the more object-oriented Cairo. The developers on this project have found that the object-oriented tools and environment available under NT are significantly better than those that were available when they created Object Manager under UNIX.
As one developer observed: "The primary difference is that Microsoft is much further along the path to true objects \[being\] built in as part of the operating system than any of the UNIX vendors are. OLE connections are here now--and \[they\] work." Another commented: "The foundation tools provided in the NT world are much better object-development tools than \[those\] generally or widely available under UNIX. \[The NT tools\] offer new ways of programming which are more productive than anything available under UNIX."
No Pain, No Gain
As you can see, porting applications from UNIX to NT is not as straightforward as it may appear, and it's certainly more difficult than porting an application from one UNIX implementation to another. Virtually every step in the process results in new discoveries and requires new decisions. However, once you work through the difficulties, you'll find that porting applications to NT opens up new possibilities for hardware platforms, high-end performance, enterprise-wide integration, and market penetration. In short, the gain is definitely worth the pain.
|Intergraph * 800-291-9909|