
NT News Analysis - 01 Oct 1998

Last August, a group of ethical hackers who call themselves the Cult of the Dead Cow (cDc) released Back Orifice (BO), a supposed remote administration tool for Windows. With BO, hackers can remotely control any Windows 95 or Win98 computer across a TCP/IP connection. As the cDc Web site puts it, "BO gives its user more control of the remote Windows machine than the user at the keyboard of the remote machine has."

Any action that you can perform at the local console, hackers can perform remotely, including editing the Registry and executing applications. In fact, BO provides more detailed process control than the local console, giving hackers the ability to spawn and kill processes at will. Hackers can also access any resource that you can access, including network resources.

The BO executable, which is only 120KB, is easy to use yet hard to detect. Hackers can attach BO to any executable, including self-extracting ZIP files. BO will install itself and then remove its installation files. The program will launch each time you boot the host computer, but it won't appear in the Task or Close Program list. As a result, if you download a self-extracting ZIP file that contains BO, you won't even know of BO's existence on your system.

You might be thinking to yourself, "But for this program to hurt me, a hacker would have to know the IP address of the infected machine. How would the hacker get that address?" If you've downloaded a file from the Internet, you've left an IP address trail. Even if you haven't, BO has yet another feature: It accepts third-party plug-ins.

One of the first plug-ins to appear is Butt Trumpet. (Although the names Butt Trumpet and BO are juvenile, they don't lessen the seriousness of the problems these tools can cause.) Butt Trumpet makes the infected system send an email through a Simple Mail Transfer Protocol (SMTP) server to a preset address, revealing the system's IP address. SMTP email headers commonly include the route a message took from the sender's IP address to the machine that receives it. An account set up on an anonymous remailer or a Web-based mail host (such as Hotmail) can ensure the hacker's anonymity. Once hackers establish BO on one machine in a network, they can propagate BO throughout the network with little trouble.
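To make the header trail concrete, here is a minimal sketch (not part of BO or Butt Trumpet) of how an originating IP address can be pulled from a standard SMTP Received header; the header text and address below are hypothetical, and real headers vary from one mail server to the next.

```python
import re

# A hypothetical Received header of the kind an SMTP relay prepends to a message.
# The bracketed address is the IP of the machine that handed off the message --
# in this scenario, the BO-infected host.
received_header = (
    "Received: from victim-pc (dialup42.example-isp.net [203.0.113.42]) "
    "by mail.example.com with SMTP; Thu, 1 Oct 1998 12:34:56 -0400"
)

# Pull the first bracketed dotted-quad address out of the header.
match = re.search(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]", received_header)
if match:
    print("Originating IP address:", match.group(1))  # 203.0.113.42
```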

One scary feature of this tool is that it requires no technical skills to use. Anyone who can create a self-extracting ZIP file can create a BO attack. Hackers needed technical skills to exploit most of the security holes previously found in Windows.

At press time, Microsoft was downplaying the significance of the potential problems that BO can cause. Microsoft states, "Back Orifice does not pose a threat to users of Windows 95 or Windows 98 who follow reasonable and safe Internet computing practices, such as not installing software from unknown and untrusted sources.... There is no threat to customers of Windows NT Workstation or Windows NT Server; the program does not run on the Windows NT platform. The author[s] of Back Orifice do not directly claim that their product poses any threat to Windows NT, even though it seems to be implied." You can read Microsoft's full response at http://www.microsoft.com/security/bulletins/ms98-010.htm.

At least one vendor, Fresh Software, has released a product it claims automatically detects and removes BO. For more information about the product, AntiGen 1.0, go to http://www.arez.com/fs.


Yet Another Xeon Delay
In the past 6 months, I've reported so many different Xeon delays in NT News Analysis that I've assigned the phenomenon an acronym: YAXD. Pronounced yawks-dee, this acronym stands for Yet Another Xeon Delay.

This time around, Intel is delaying the 450MHz version of the NX chipset until early 1999. This critical release supports 4-way symmetric multiprocessing (SMP) with the new 2MB Level 2 cache version of the Xeon CPU and up to 8GB of RAM.

The 450NX delay comes on the heels of a similar setback with the 400MHz version of the same NX chipset. That delay cost server vendors dearly. Major players, such as Compaq Computer, are only now bringing their 4-way, 400MHz (with 512KB to 1MB Level 2 cache) NX-based systems to market, even though the vendors announced the systems last spring.

Intel is denying the rumor that the delay is a result of a bug in the chipset. Instead, Intel claims that the delay is the result of conducting more extensive compatibility tests. According to company officials, Intel wants to test as many configuration permutations as possible because of the 450NX platform's likely popularity.

Chipset delays of this magnitude tend to have a ripple effect through an entire product line. Many industry analysts are predicting that a delay in the 450NX chipset might affect the introduction of Profusion, the long-awaited 8-way SMP platform. Although Intel officials deny a possible Profusion delay, at least one OEM customer claims to have heard that Intel will not release Profusion in the fourth quarter of this year as originally planned.

One introduction that the 450NX delay won't affect is the 450MHz (2MB Level 2 cache) Xeon CPU for workstations. According to Intel, the new CPU and its supporting chipset are on track to debut in systems by year's end.

Xeon will be the newest addition to an already crowded Pentium II processor family. With so many models to choose from and with high-end performance separated solely by clock speed and cache size, many corporate customers are renewing their interest in the low-end of the P6 market (Celeron and its derivatives). This renewed interest translates into margin erosion for Intel. Unless Intel finds a way to drive high-end PC sales, Intel risks stalling the very market it is trying to rev up with its multichip strategy. (For information about workstation PC pricing, see "FTC Is Helping Keep Alpha Alive," page 40, and "Workstation vs. High-End PC Hardware," page 44.)

Although this news is bad for Intel and its OEM partners, it is good for consumers. It might mean lower prices for today's hottest systems.

NT Workstation on the March
Don't look now, but is that Windows NT Workstation 4.0 running on the PC next to you? If you're working in one of the several hundred Large Organization (LORG) companies that Microsoft surveyed, chances are that it is. According to Microsoft, NT Workstation 4.0 is now in use on 25 percent of all PCs in organizations with more than 2500 desktops (Microsoft's definition of a LORG).

Much of this market penetration has occurred in the past 12 months. The installed base of NT Workstation seats has grown by more than 125 percent, or 15 million units, in just a year. According to Microsoft, 79 percent of the LORGs have migrated or are planning to migrate to NT Workstation.

Vendors, too, are getting into the NT action. Microsoft reports that 14.6 percent of all new OEM systems ship with NT Workstation, compared with 5.4 percent last year. And about 10 percent of all new notebook systems and 20 percent of new business PCs ship with NT Workstation preinstalled.

What has been fueling the NT Workstation fire? Performance. According to independent benchmarks Microsoft commissioned and the National Software Testing Laboratories conducted, NT Workstation is as much as 30 percent faster than Windows 98 when running 32-bit applications on a 64MB system. On a 32MB system, NT Workstation is still 10 percent to 15 percent faster.

The biggest hurdle to further market penetration is perception. Many customers still believe that NT is either too big for their systems or too expensive to deploy. However, new research data dispels both myths. Of all the new PCs shipped in the first half of 1998, 67 percent had 32MB or more of RAM installed, which is more than enough to run NT effectively. And although upgrading to NT Workstation is more difficult than upgrading to Windows 98, data from Technology Business Research shows that, once NT is installed, an end-to-end NT environment can save companies up to $483 in support costs per calendar year and reduce Help desk calls by 29 percent. Other debunked myths include a lack of device support (6500 devices now support NT, an increase of 63 percent in 2 years), a lack of mobile computing support (all major OEM vendors now ship notebook solutions based on NT Workstation), and the inability to run legacy 16-bit applications (the top 145 line-of-business application vendors now offer NT solutions).

Don't be too surprised if the next time you press Ctrl+Alt+Del at a corporate PC, the Windows NT Security dialog box greets you. NT Workstation is an idea whose time has come.


In My Opinion: Microsoft Needs to Clean Up Its Bad Boy Image
"Windows NT everywhere!" is the rallying cry on the Microsoft campus. Microsoft is pushing its NT-centric vision of the future as NT takes on new and different challenges. From thin-clients (see NT News Analysis: "NT 5.0 to Include Terminal Server," September 1998) to embedded devices (see NT News Analysis: "Embedded NT to the Rescue?" August 1998), vendors are adapting NT to service some decidedly nontraditional computing scenarios. Simply put, NT is becoming the new common denominator among enterprise computing platforms.

However, many in the industry are beginning to question whether a ubiquitous NT is a good thing. Skeptics' concerns have less to do with NT's technical merits (most customers believe NT is architecturally sound or that it soon will be) than with Microsoft's maturity as a vendor. Simply put, some in the IS community don't trust Microsoft. Buggy service packs, slipstreamed upgrades, and other IS-unfriendly support policies and actions are giving Microsoft the reputation of being a backdoor rebel—a company that's willing to further its agenda at the expense of others.

What does Microsoft need to do to clean up its bad boy image? For starters, it can erect a wall between its marketing groups and its service and support groups. Too many so-called maintenance releases are thinly veiled updates designed to bypass formal testing. For example, if you combined all the minor tweaks and major DLL updates from Service Pack 3 (SP3), various hotfixes, and the Option Pack, you'd have the makings of a major upgrade—NT 4.5. By releasing the code piecemeal, Microsoft effectively avoids the painful and often time-consuming regression testing that enterprise customers demand.

By circumventing testing, Microsoft is alienating the people it needs to be courting: the IS decision makers and field implementation technicians who must make all the unstable code work together. "It's like building a sand castle during a rainstorm," said one frazzled integrator. "Just when you think you've got the system stable, along comes a new technology torrent to undermine the foundation and topple the spires."

With NT adoption rapidly increasing and with NT 5.0 being radically re-engineered, the situation will probably get worse before it gets better. I hope Microsoft will take a cue from its enterprise partner IBM and work to stabilize NT once the operating system (OS) reaches critical mass on the new NT 5.0 kernel. After all, network computers (NCs) and archrival Netscape no longer seem to be a threat, so Microsoft can slow down a bit and work on improving its image.

Informix Gets Serious About NT
Although Informix has sold Windows NT versions of its Informix Dynamic Server for years and offers more than 25 DataBlades for NT, the industry persists in viewing this database company as a UNIX-based system supplier. (DataBlades extend Dynamic Server by supporting nontraditional data types, such as audio and video.) To shatter this misperception and gain more share in the NT market, Informix is getting serious about promoting and supporting its NT-based online transaction processing (OLTP) systems.

Informix's more aggressive promotion of its NT-based OLTP systems began in June, when the company started offering free licenses for Dynamic Server to developers. Informix's strategy for more exposure worked: In a month, more than 2000 developers qualified for the free download. (For more information about the free license, go to http://www.informix.com/informix/products/freent.htm.)

Informix then began more aggressively supporting NT by releasing new versions of Dynamic Server aimed at NT niches. For example, in July, Informix released a new high-end version of Dynamic Server for NT that supports Microsoft Cluster Server. In August, Informix released Informix Dynamic Server Personal Edition, a single-user database server for NT, Windows 95, and Win98 desktops. Informix plans to offer NT versions of Decision Frontier Solution Suite and INFORMIX-Enterprise Command Center 4.0 when it releases these products in October.

Informix's customers might be largely behind the company's renewed interest in NT. Visits to the NT corner of the Informix Developer Network (IDN) are increasing, as are small-business customers' requests for more NT support. (For more information about IDN, go to http://www.informix.com/idn.)

Although customers want to see more NT-based products, Informix faces strongly entrenched competition, mainly from Oracle and Microsoft. For example, INFORMIX-Enterprise Command Center 4.0 will have to compete with Oracle's and Microsoft's Enterprise Managers. How committed Informix really is to the low-margin NT market remains to be seen, given that the company's stated strategies are to continue to focus on its traditional strength in the high-end OLTP market and to offer high-end Web content management and data warehousing solutions.


FTC Is Helping Keep Alpha Alive
When Compaq Computer acquired Digital Equipment earlier this year, many analysts predicted a quick death for Digital's much-maligned Alpha CPU platform. However, the Federal Trade Commission's (FTC's) intervention and Compaq's renewed interest in the ultra high-end workstation market are keeping the highly regarded RISC processor alive—at least for now.

Compaq couldn't dump Alpha if it tried. The consent decree that the FTC imposed on Intel as part of Intel's patent infringement settlement with Digital compels Intel to help Digital (or its subsequent parent company) continue the Alpha product line, even if Intel must produce Alpha chips directly. The agreement also calls for Advanced Micro Devices (AMD) and IBM to produce the chips as a hedge against Intel gaining a monopoly over yet another CPU architecture. To date, neither AMD nor IBM has committed to developing the chip. However, rumors are circulating that IBM is close to signing a deal with Compaq to produce a copper-based version of the chip.

What seems more certain is that the bulk of Alpha chip production will move offshore. Both Intel and Compaq are negotiating with Korean semiconductor maker Samsung Electronics to shift responsibility for volume fabrication to facilities in Asia. Samsung has been producing the Alpha chip in small quantities for years, supplying much of the Alpha-powered OEM channel. In June, Samsung entered into a joint partnership with Compaq to promote the technology and solidify the chip's image with the public.

Compaq plans to position its forthcoming high-performance Alpha-based workstations to compete primarily with other high-end graphics workstations. For example, the new XP series workstation will be priced in the $5000 to $10,000 range, despite the fact that Compaq will use off-the-shelf components throughout much of the system. Compaq hopes that the inclusion of its proprietary Highly Parallel System Architecture (HPSA) and PowerStorm graphics subsystem will differentiate the XP series workstation from its lower-end Pentium II- and Merced-based systems.

However, Compaq's new SP line of Xeon workstations will also sport HPSA and PowerStorm, so how well Compaq can execute such a strategy remains to be seen. Other workstation vendors, such as Intel, are having increasing difficulties differentiating their midrange offerings from their top-of-the-line systems. (For an example, see "Workstation vs. High-End PC Hardware," page 44.) Compaq will likely encounter the same difficulties.

If you're thinking about purchasing an Alpha-based system, you might be wondering whether the purchase is prudent. With all the competing products and with the FTC playing chaperone, the Alpha platform will probably remain viable for some time to come. However, perhaps a more relevant question is: Why do you want an Alpha? With Xeon-based workstations just around the corner and with Compaq and other vendors migrating their proprietary performance subsystems to both platforms, waiting for Xeon might be more prudent than buying an Alpha.


Cubix Targets MetaFrame with Load-Balancing Software
A new competitor has emerged to challenge Citrix Systems' dominance in the thin-client/server computing space. Cubix, developer of the Balanced Cluster Service (BCS) for Windows NT Server 4.0, Terminal Server Edition, is preparing to release the first version of its load-balancing software.

Positioned as an alternative to Citrix Systems' MetaFrame product suite, BCS provides basic load-balancing services across two or more Terminal Servers. It uses the standard Windows NT service architecture and works with Terminal Server's native Remote Desktop Protocol (RDP). More important, BCS's price tag is thousands of dollars less than an equivalent MetaFrame configuration.

For example, at press time, the Citrix Load Balancing Option Pack costs $1495 per server and requires that you install MetaFrame ($4995 per server) and Terminal Server ($1129 for 10 users). In contrast, BCS costs $695 per server, plus the cost of Terminal Server and the Cubix management console ($995 per managed cluster). So for a typical two-node cluster, the price difference between the two offerings is more than $10,000.
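As a rough check on that figure, the following back-of-the-envelope calculation totals both stacks for a two-node cluster using the list prices quoted above; the assumption that each node needs its own 10-user Terminal Server license is mine, not a quoted configuration.

```python
# Back-of-the-envelope cost comparison for a two-node Terminal Server cluster,
# using the press-time list prices quoted above. Per-node 10-user Terminal
# Server licensing is an assumption made for illustration.
NODES = 2
TERMINAL_SERVER = 1129        # per server, 10 users
METAFRAME = 4995              # per server
CITRIX_LOAD_BALANCING = 1495  # per server
BCS = 695                     # per server
CUBIX_CONSOLE = 995           # per managed cluster

citrix_total = NODES * (TERMINAL_SERVER + METAFRAME + CITRIX_LOAD_BALANCING)
cubix_total = NODES * (TERMINAL_SERVER + BCS) + CUBIX_CONSOLE

print("Citrix stack:", citrix_total)                # 15238
print("Cubix stack: ", cubix_total)                 # 4643
print("Difference:  ", citrix_total - cubix_total)  # 10595 -- "more than $10,000"
```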

To be fair, MetaFrame does more than just balance loads. It has other noteworthy features, including support for session shadowing via the Independent Computing Architecture (ICA) protocol and the ability to connect non-Windows clients. However, Microsoft's recent actions are negating the advantages of these features. Microsoft has recently committed to reproducing all of ICA's functionality under RDP. And the ability to connect users running non-Microsoft platforms (especially Web-based users via Java or a plug-in) has become less attractive because of Microsoft's new requirement that you must purchase an NT Workstation license for every connecting client.

The only real advantage that Citrix Systems has is MetaFrame's track record. Citrix Load Balancing Option Pack is a mature technology that users have field tested on WinFrame 1.7 for more than a year. Cubix is a newcomer and doesn't have the brand recognition or customer success stories that Citrix can call on to drive sales. Still, the significant price difference between the two products might prove to be a strong incentive.

New Moon Targets Office 97
How do you drive interest in a radical new technology without giving away the family jewels? Produce a scaled-down version and offer it at a bargain-basement price. That's the formula that distributed computing startup New Moon Software is banking on. The company recently released Liftoff for Office, a limited version of its revolutionary Liftoff API redirection technology for Windows NT.

New Moon's Liftoff technology redirects Win32 GUI API calls from an application executing on the Liftoff server to an NT or Windows 95 PC running the company's proprietary Liftoff client component. The result of this redirection technology is a thin-client type of relationship, with the bulk of the application code executing on the server. The client handles only the GUI API calls.
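To give a feel for what API redirection means in practice, here is a deliberately simplified sketch; it is not New Moon's code or protocol, just a toy illustration in which an "application" on one end serializes its GUI calls and a lightweight client on the other end performs the presentation.

```python
import json
import socket
import threading
import time

def presentation_client(conn):
    """Receives serialized 'GUI' calls and performs the presentation (here, printing)."""
    buf = b""
    while True:
        data = conn.recv(4096)
        if not data:
            break
        buf += data
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            call = json.loads(line)
            print(f"[client] {call['api']}{tuple(call['args'])}")

class RedirectedGui:
    """Server-side proxy: instead of drawing locally, ship each call to the client."""
    def __init__(self, conn):
        self.conn = conn
    def call(self, api, *args):
        self.conn.sendall(json.dumps({"api": api, "args": args}).encode() + b"\n")

if __name__ == "__main__":
    server_end, client_end = socket.socketpair()
    threading.Thread(target=presentation_client, args=(client_end,), daemon=True).start()

    # The "application" logic runs here, on the server; only presentation crosses the wire.
    gui = RedirectedGui(server_end)
    gui.call("CreateWindow", "Report.doc", 640, 480)
    gui.call("DrawText", "Quarterly totals", 10, 20)
    server_end.close()

    time.sleep(0.2)  # give the client thread a moment to drain the socket before exit
```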

The full-scale Liftoff product supports several dozen commercially available Win32 applications and provides dynamic load balancing of a Liftoff server cluster. Liftoff for Office supports only Office 97 suite components and Internet Explorer (IE) and doesn't provide load balancing. So why would anyone buy Liftoff for Office? With a suggested retail price of $995 for unlimited use, Liftoff for Office lets organizations test the technology without blowing their budgets. (At press time, the full-scale Liftoff product retails for $4995 for 10 users.)

Although the Liftoff products are competing with thin-client Windows technologies (such as Windows NT Server 4.0, Terminal Server Edition and Citrix Systems' MetaFrame), New Moon is trying to distance Liftoff from thin-client solutions. "We're deliberately repositioning Liftoff as more of a network-centric computing solution," said Frank Mara, New Moon's vice president of marketing.

According to Mara, New Moon envisions Liftoff evolving into a kind of universal client integration platform that seamlessly integrates disparate back-end data sources at the client desktop, without requiring the customer to install and maintain different client software packages.


Workstation vs. High-End PC Hardware
What's in a name? Not very much when you are faced with choosing between a hardware vendor's workstation-class PC offerings and its generic high-end value line PCs.

Workstation-class PCs are higher-end PC configurations marketed specifically to power users, graphics artists, and developers. They feature more robust disk subsystems (SCSI-2 or SCSI-3 vs. EIDE) and application-specific video capabilities (such as OpenGL acceleration and real-time 3-D modeling). They typically ship with Windows NT Workstation 4.0 as the default operating system (OS) selection. In fact, most vendors market their workstation-class PC offerings specifically to customers wanting to deploy NT as a client OS.

But many IS customers are beginning to question the value of workstation-class PCs. With Intel slashing prices on its high-end Pentium II processor products at an unprecedented rate, the once predictable PC price-performance curve has been sent into a tailspin. (For more information, see "Yet Another Xeon Delay," page 38, and "FTC Is Helping Keep Alpha Alive," page 40.) CPU price drops that used to take months to materialize now take just weeks. And last quarter's high-end PC is this quarter's mid-range volume desktop. Both trends are putting the squeeze on the workstation-class PC market by forcing most second- and third-tier vendors to discount their value-line PCs to the point where their workstation-class PCs are beginning to look suspiciously overpriced.

Consider, for example, the single-processor PC category, in which the distinction between 400MHz workstation-class PCs and their 400MHz value-line counterparts is getting blurry. At press time, a Dell Precision Workstation 410 with a 400MHz Pentium II processor, 64MB of Error-Correcting Code (ECC) RAM, a 4GB Ultra-Wide SCSI-2 disk, a 3Dlabs PERMEDIA 2 video card, and a 21" monitor costs $4052. That price is typical for workstation-class offerings.

However, Dell's Dimension XPS-400 value system, outfitted with the same disk but a somewhat lesser video card (in terms of OpenGL performance) and monitor (Ultrascan vs. Trinitron), costs $3101. Almost $1000 in savings is a big chunk of change, especially given that all other components (RAM, disk, and CPU) in the two systems are identical. The difference is enough to cause some NT Workstation users to ignore vendors' pitches for a workstation-class PC and to go with the lower-cost solution instead.

So why would someone buy a workstation-class PC? According to Dell, the answer is expandability.

All of Dell's Precision systems are dual-processor capable and can host more than 512MB of RAM. Plus, they ship with OpenGL accelerators that are tuned for NT.

However, this accelerator, the Diamond FireGL 1000 Pro, is a relatively low-end card, even by PC standards. Although the FireGL supports OpenGL rendering at 1 million triangles per second, its general-purpose business graphics benchmark scores take a backseat to those of the newer Intel i740-based Accelerated Graphics Port (AGP) cards. To add insult to injury, you can purchase the FireGL for less than $200. This pricing means you can retrofit the lower-priced Dell Dimension with the Diamond card and still save more than $700.
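Working through the retrofit math with the press-time prices above (and taking the FireGL's "less than $200" price at face value, which is an approximation):

```python
# Rough retrofit math using the press-time prices quoted above.
precision_410 = 4052   # Dell Precision Workstation 410 as configured
dimension_xps = 3101   # Dell Dimension XPS-400 as configured
firegl_card = 200      # "less than $200" taken at face value

price_gap = precision_410 - dimension_xps          # 951 -- "almost $1000"
savings_after_retrofit = price_gap - firegl_card   # 751 -- comfortably over $700

print(price_gap, savings_after_retrofit)
```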

As for multiprocessing and greater RAM expansion, conventional wisdom suggests that few mainstream usage scenarios require such features. Client-side symmetric multiprocessing (SMP) has yet to prove its worth outside of heavy-duty image editing or 3-D modeling. Even the most demanding workstation-class scenarios rarely require more than 384MB of RAM, the practical ceiling for most Intel BX-based systems.

In most cases, workstation-class PCs simply don't offer enough genuine value over their comparably equipped PC counterparts to justify the price difference. Thus, if you want to migrate to a viable NT Workstation platform, you don't need to look beyond the top of your favorite vendor's value line. You'll get what you need to make NT perform well in a business environment. And you can invest the money you save in high-end workstation hardware to equip those users who truly need top-of-the-line performance.

Microsoft to Develop New Programming Standards
Faced with increasing pressure from the developer community to get its API house in order, Microsoft plans to develop three programming standards that center on core API-level disciplines. These standards are part of Microsoft's long-term strategic API vision, Distributed interNet Architecture (DNA). The three programming standards slated for development are COM+ (the next major release of Microsoft's distributed component object model—DCOM—architecture), Forms+ (an umbrella standard for various Microsoft HTML technologies, including XML), and Storage+ (the long awaited unification of the SQL Server, Exchange Server, and Windows NT storage subsystems).

Storage+, in particular, looks interesting. With Storage+, software developers will be able to more easily create applications that execute independently from the underlying data store.

Microsoft has already laid much of the groundwork for this standard in the Object Linking and Embedding Database (OLE DB) technology. Later this year, Microsoft plans to release an OLE DB Partners CD-ROM. This CD-ROM will include storage providers for a variety of popular platforms, including Exchange Server and Lotus Notes.

Will Microsoft's three new standards improve the quality of life for software developers? Probably not. But these standards will provide developers with clearly defined boundaries, and the standards will let them better understand Microsoft's long-term strategic vision when they are selecting APIs.


NT 5.1?
With the August release of Windows NT 5.0 beta 2, it seems as if Microsoft has actually announced NT 5.1. Beta 1 and beta 2 are that different from each other. Most, but not all, of the news is good.

Here's the bad news. Many network administrators want a Dynamic Host Configuration Protocol (DHCP) server that can understudy the main DHCP server, automatically issuing IP addresses from a given scope if the main DHCP server fails. This essential network fault tolerance isn't possible under NT 4.0, and Microsoft promised to fill that gap in NT 5.0. Microsoft has kept its promise, but at a high cost. NT 5.0's DHCP servers will provide fault-tolerant DHCP server services, but only in an NT cluster. NT 5.0, Enterprise Edition's high price tag will exclude many customers from getting the benefits of DHCP fault tolerance.

Here's the good news. NT 5.0 will include many new tools to simplify deploying NT. If you want to deploy NT 4.0 on many machines in a network, you must spend lots of time installing NT individually on each machine, purchase third-party tools to mass deploy NT, or follow the dangerous practice of developing one NT image and propagating it over the network. (This practice can create duplicate security IDs, or SIDs, on different machines.) With NT 5.0, you can use SysPrep to prepare a preconfigured version of NT (a copy of NT that is 99 percent installed, save for the username and machine name). Then, you use a tool such as Ghost Software's GHOST to copy the disk image and a small amount of scripting information to users' machines. The first time users start their machines, the system adjusts the SIDs and personalizes the machines. The machines are ready to use in just a couple of minutes.

Another deployment improvement in NT 5.0 is that you can avoid the two-step process of first installing NT and then applying a service pack atop it. Instead, you simply incorporate the revised service pack files into the i386 installation files. Service packs will also be easier to apply to already installed NT 5.0 systems: They will be easier to push out to desktops and will not require reinstallation every time you change the system. Both of these capabilities are welcome improvements.

Keeping NT up and running will be simpler with NT 5.0. With NT 5.0, you can press F8 at boot-up to activate a number of safe boot modes similar to the ones in Windows 98 and Windows 95. Microsoft also claims that it has fixed 50 NT 4.0 bugs severe enough to cause blue screens and will even include some of these fixes in a future service pack for NT 4.0.

Some of the biggest changes in NT 5.0 concern how the mechanics of Zero Administration for Windows (ZAW) will work. The new ZAW plans call for tools that are far less network bandwidth- and server-intensive than the ones originally outlined more than a year ago. "We went around to clients and presented our vision of where we thought ZAW would go, and our clients told us, 'we're being asked to reduce the number of servers in our organization and you want us to put more in to make ZAW work?' So we rethought how to solve the problem," said Dan Plastina, ZAW Group Program manager.

In its new incarnation, ZAW focuses on a few specific tasks: getting a fresh copy of an operating system (OS) on a new or recycled machine, deploying applications to that machine, and rolling out new versions of applications to that machine. As before, IntelliMirror is the keystone technology for ZAW, but the specifics are changing. IntelliMirror is still a client-side technology that caches data from servers, but IntelliMirror isn't as automated as it was first conceived to be. Now, you must use Explorer's user interface (UI) to specify which files IntelliMirror needs to cache. (For an overview of ZAW and IntelliMirror, see Darren Mar-Elia, "Application Management in NT 5.0," April 1998, and Mark Minasi, "Zero Administration for Windows," December 1997. Information about IntelliMirror, ZAW, and other features in NT 5.0 beta 2 will appear in a future issue of Windows NT Magazine.)

In Microsoft's never-ending quest for the perfect UI, the company has added many features to NT 5.0's Explorer. One feature is My Network Places, a logical alternative to both mapping drives and managing Remote Access Service (RAS) connections. This folder replaces NT 4.0's Network Neighborhood and the dial-up phonebook. Microsoft has also added many of Win98's useful UI features, including greater use of auto-complete and a recently used documents view in the standard File Open dialog box.

Thousands of developers and systems administrators will receive NT 5.0 beta 2. Their input will help Microsoft move toward its next goal: NT 5.0 beta 3. NT 5.0 is clearly still vaporware, a work in progress. But previewing beta 2 and listening to Microsoft officials speak about it gives you a strong feeling that the vapor is starting to condense.
