Server-Based Computing in 2000

Server-based computing in the Windows world has, to date, been a niche technology, and for good reason. Because multiuser Windows has been available only as a separate package—and not a well-supported one—implementing terminal services on a corporate network has been expensive and risky, even before you consider the complexities of server-side administration. However, certain events and continuing trends in 2000 will make server-based computing more widespread. It’s still going to be a niche technology in 2000, but I expect it to occupy a larger niche than it did in 1998 and 1999.

Predictions for 2000
Server-based computing won’t take off explosively this year, but the groundwork that the industry laid and built on in 1999 will encourage the technology's evolution from its niche position into a full-fledged component of many enterprises. In time, server-based computing will become a popular way to deliver applications to certain classes of end users. What follows are my predictions for 2000.

Windows 2000 will encourage the spread of terminal services. We don't need a crystal ball to see this one. The most obvious push toward wider server-based computing use in 2000 is Microsoft's inclusion of terminal services in Windows 2000 (Win2K). Of course, this isn’t the first time that a server OS has included multiuser capabilities; it isn’t even the first time that a variant of Windows has included multiuser capabilities. However, it's the first time that the quintessential standalone OS designed for the standalone PC includes terminal services in the core product, a development that’s significant not only because it reflects Microsoft’s acknowledgement of the technology’s desirability, but because the inclusion reduces the risk and cost of testing terminal services. Instead of buying a separate product to experiment with terminal services, people running Win2K can try it out simply by enabling a service and installing the RDP client on a network client. Even if people aren’t interested in using terminal services at first for supporting applications, they can try out the remote administration mode of terminal services. Look for growth throughout 2000 as people upgrade servers and try out the features in the new OS.
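If you want a quick way to confirm that a server’s terminal services listener is reachable before you bother installing the client, here’s a minimal sketch in Python. The host name is hypothetical; the only hard fact the sketch relies on is that RDP listens on TCP port 3389 by default.

import socket

HOST = "win2k-server"  # hypothetical server name; substitute your own
RDP_PORT = 3389        # default RDP listener port

# This only proves that something is accepting connections on the RDP
# port; it doesn't prove that a full session can be established.
try:
    with socket.create_connection((HOST, RDP_PORT), timeout=5):
        print(f"{HOST} is listening on port {RDP_PORT}")
except OSError as err:
    print(f"Could not reach {HOST}:{RDP_PORT}: {err}")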

Thin clients will become more multifunctional, sophisticated, and smaller. Today, most thin clients are still PCs, not terminals or handheld PCs (H/PCs). Windows terminals (i.e., terminals designed to display a Windows environment; not all Windows terminals run a form of Windows as their local OS) make up just 30 percent of desktop machines in server-based computing environments. One obvious reason for this is that people already have PCs, so even if they plan to use terminal services, they don’t need to buy new client equipment. Also, the sheer complexity of setting up terminal services in a company that doesn’t already use it makes thin-client devices a tough sell. As Wyse’s Jeff Macnaughton explained it, when Wyse started marketing its thin-client devices, the terminals were a hard sell: the company had to explain WinFrame to people who were already familiar with terminals, and it had to explain terminal services to those who were already familiar with Windows. With Windows 2000 Server Terminal Services already on the premises, Macnaughton expects Windows terminals to become an easier sell.

That said, I don’t expect the Windows terminal market to explode in 2000—not in any way that’s going to displace the PC. Although people will use terminal services more widely in 2000 as they move to Win2K servers, the proportion of Windows terminals in use won’t change much, even if the actual numbers rise as International Data Corporation (IDC) expects them to. What will happen to Windows terminals in 2000 is that they’ll become more functionally diverse. As of early 2000, most Windows terminals on the market are pretty much alike. Some have more advanced management capabilities, some have better video or a more powerful CPU, but the basic function of most remains the same: Using a variety of terminal connection types, a Windows terminal lets you connect to a terminal server and remotely manipulate applications and data.

This should change in 2000. In 1999, Microsoft announced a spec for more powerful Windows terminals, called Windows Terminal Services Professional, which uses NT Embedded as the terminal’s local OS and gives you the option of installing a browser (not necessarily Internet Explorer, or IE, incidentally) and Media Player. According to Dave Pollen of Microsoft, the Windows Terminal Services Standard will also change to incorporate a Windows CE-compatible version of IE 4.0 and support Dynamic HTML (DHTML), with the first devices based on this specification released in April. Not all Windows terminals will include local browser support—it's an option, not a requirement—but Microsoft’s inclusion of local applications on the terminal indicates its support for making the devices more like sealed-case PCs. Oddly, such developments clear the way for the terminals to eventually become independent of terminal servers. If terminals have browsers and can connect to the Internet, they can run applications published on the Internet.

The second trend for Windows terminals in 2000 is that they’ll become less visible. Let's face it, many end users think of the monitor as the computer, because that’s what they see. Noting this, terminal manufacturers have experimented with designs that add the terminal capabilities to a device that’s ostensibly a monitor. It's a nice model for environments that don’t have room for separate terminals, but its drawback is that you have to replace the monitor if you replace the terminal, and vice versa. To accommodate both needs, OEMs will find ways to make a terminal in a form factor that lets it occupy no additional room and avoid the problems of the combined devices. The monitors won’t get smaller—no one wants to surf the Web on a watch face—but the terminals themselves will.

MetaFrame will retain its supremacy in the Windows terminal services world. Citrix, which launched multiuser Windows, has ruled the thin-client world for years. However, Win2K includes terminal services, NCD has put together an impressive suite of RDP-capable helper tools, and SCO has released an RDP-supporting Tarantella II product that’s compatible with both UNIX and Windows Terminal Services. Is MetaFrame’s supremacy in jeopardy?

Probably not. Most enterprise users of terminal services use MetaFrame to supplement Windows Terminal Services. (David Friedlander of Giga puts MetaFrame users at 90 percent of Windows NT Server 4.0, Terminal Server Edition (TSE) users; Dave Pollen of Microsoft estimates the proportion at about 70 percent, conceding that most enterprise customers use MetaFrame.) I don't expect the proportion to change much this year. First, MetaFrame has a large installed base. Once you've gone to the trouble and expense of deploying terminal services and have it up and running, why change to a different platform? Even SCO’s 1:1 trade-in deal for MetaFrame licenses will have a hard time outweighing the costs of retraining and retweaking applications. Second, MetaFrame has some features that Windows Terminal Services doesn't: seamless publishing of individual applications, support for server farms (logical groupings of servers that appear as one connection, so users don’t have to know which server is publishing a particular application; see the sketch below), and Web application publishing. And until the non-Windows RDP clients mature, I’d rather use ICA than RDP to connect from non-Windows platforms such as UNIX workstations. Those clients are compatible with Microsoft’s RDP, but they’re incomplete at present, supporting only the RDP 4 feature set and not working as well as ICA.
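To make the server-farm idea concrete, here’s a minimal sketch in Python. It’s purely illustrative (not Citrix’s actual load-balancing logic): a broker maps one published application name to several servers and sends each new user to the least-loaded machine, so users never need to know which server hosts the application.

# Illustrative sketch only; not Citrix's actual load-balancing algorithm.
# One published application name maps to several terminal servers; the
# broker picks the server with the fewest active sessions.
farm = {
    "Word": {"TSE-1": 12, "TSE-2": 7, "TSE-3": 9},  # server: session count
}

def pick_server(app_name):
    servers = farm[app_name]
    return min(servers, key=servers.get)  # least-loaded server

print(f"Connecting to {pick_server('Word')} for published application 'Word'")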

What would hurt MetaFrame would be the inclusion of a suite of tools such as NCD’s ThinPath series in the base Windows product. Although such a suite wouldn’t kill MetaFrame, it would give customers, especially smaller customers, fewer reasons to buy a third-party product. Although not as well-known as Citrix in the terminal services world, NCD has been doing multiuser Windows for years, having licensed the WinFrame technology from Citrix to create WinCenter. Many more people are using MetaFrame, but the NCD tools that I’ve seen, while restricted to PC and NCD ThinStar clients, aren’t bad and would be a valuable addition to Windows Terminal Services if Microsoft bought the technology and made it terminal-independent. It’s a logical suggestion; I seriously doubt that I’m putting any ideas into Microsoft’s or NCD’s heads.

That said, MetaFrame remains the most viable helper option for Windows Terminal Services for the time being. It’s too soon to see how Tarantella II will work in practice. NCD’s ThinPath services, although RDP-compatible (as MetaFrame is not), require PC or NCD ThinStar clients, which reduces their desirability by locking potential customers into a single terminal type (albeit a pretty good terminal). And although Win2K’s terminal services includes more management tools than TSE, it still doesn’t have all the functionality that you need to run terminal services in the enterprise. Citrix made the terminal services market viable, which will encourage competition, but if you use Windows terminal services, you’re probably going to use Citrix, at least until late in 2000.

Application service providers will become more popular for task-oriented applications in the commercial sector but won't replace many desktops. In 1999, interest in the application service provider (ASP) model, through which third parties provide applications to corporate customers via a WAN, began to grow. Right now, the ASP industry is in the hype phase. As Matt Dircks of Citrix puts it, when a new technology appears, there’s initially a lot of interest, then a backlash as people realize that the technology can’t be everything to all people, and finally a return to reality as people implement the technology appropriately. We saw this pattern with terminal services, and we're seeing it now with ASPs.

For ASPs to take off, though, a couple of things have to happen. First, it’s got to become easier to find ASP services. Second, ASPs have to provide true end-to-end service.

The first order of business is to streamline the market a bit, because right now it’s hard for customers and ASPs to find each other. Most ASP customers are small or mid-sized businesses, and the ASPs aren’t all that big either. The market will become streamlined in a couple of ways. First, it’s likely that a good percentage of the initial players won’t last the year, either becoming absorbed into other companies (e.g., Breakaway's absorption of Eggrock) or simply going out of business. Second, the elements of application service provision (independent software vendors (ISVs), application hosting, data storage, and so forth) will continue to create stable partnerships. Even entire ASPs will partner to become accessible to customers through a portal alongside other ASP offerings. That’s all going to make life easier for customers because they’ll have fewer providers to sift through, but it’s also going to make life easier for the ASPs that partner in this way because they'll be able to reach a broader audience.

Second, while they’re partnering to provide applications and data storage, the ASPs will have to remember that if the point of going with an ASP is to reduce the costs of application and network administration, then ASPs—or an ASP partner—will have to make sure that they take care of everything. Right now, that isn’t always the case. I recently asked several ASPs about service: Who does a subscriber call if something goes wrong with the application or the connection? One ASP’s not-atypical answer was that because the applications run on the server, there’s no way that something can go wrong on the client end. If only that were true. As Keith Gaylord of IBM put it, “End to end means delivery to customer premise. However, support that ends at the network doesn’t help the customer when the screen goes black and you have no one to call.” ASPs have to accept responsibility for what happens on the client side of the connection as well as at the server and network portions, especially because the largest pool of potential ASP customers is e-commerce companies and startups that might not have the money or the talent to support a local IT department.

If ASPs are going to take off, they need to follow a full-service model with one point of contact for any problem. In 2000 and beyond, most ASPs are likely to follow a vertical model, wherein many partners provide services (network, application delivery, data storage, user account management, etc.) based on their core competencies, instead of the horizontal model, wherein one entity provides all services. However, as Gaylord puts it, ASPs will have to look like one provider while being a collection of partners.

Long-Term Trends

So what will happen after 2000? Server-based computing has the potential to push three main trends in computing: greater platform independence, more emphasis on the Web as a development platform, and more emphasis on service over equipment, even in the consumer sector.

Increased interoperability. We can already see evidence of this trend. One of the biggest headaches of a diverse computing environment is interoperability. For this reason, many organizations select the one-OS-only model, becoming, for example, a Microsoft shop or a UNIX shop. However, the one-OS-only model has its drawbacks: some applications and versions are available only for a particular OS, or one OS is better at a particular task than the “standard” OS is. Using server-based computing and display protocols means that you can run one set of applications on a local platform but still access other application types, without resorting to slow translation techniques. One MetaFrame customer I know runs powerful modeling software on UNIX workstations but needs to access Microsoft business applications. Rather than give developers two computers, the customer runs the Microsoft applications from a TSE server and publishes them for the developers to access via ICA.

Broader use of the Web browser as a desktop environment. The graphical desktop environment has become so ubiquitous that it’s possible to forget that it doesn’t have to be that way. For years, Windows has been the development platform of choice, but that’s changing. According to The Gartner Group, 30 percent of application development is now for Web-based applications. By 2001, many expect this figure to surpass 50 percent. This prediction doesn’t mean that 50 percent of applications in use will be Web-based, or that the most common productivity applications won’t remain on the desktop. But the increase in Web development indicates growth in task-oriented applications and interfaces to server applications, just as Web email provides an interface to mail-server applications. The rationale is simplicity of interface. I don’t know about the Macintosh world, but I do know that those supporting Windows environments expend an awful lot of energy locking down the desktop. Even if users need access to only a couple of applications, there are still a lot of tools to mess around with. Using a Web interface such as Citrix’s NFuse, the “desktop” can be a Web page with application icons in it.
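As a rough illustration of that idea, here’s a short Python sketch that generates such a page. The application names and URLs are hypothetical, and the markup is deliberately bare; a real product such as NFuse builds its pages from the server farm’s list of published applications.

# Rough illustration: a "desktop" that is nothing more than a Web page
# of links to published applications. Names and URLs are hypothetical.
published_apps = {
    "Word": "http://apps.example.com/launch?app=word",
    "Excel": "http://apps.example.com/launch?app=excel",
}

links = "\n".join(
    f'<p><a href="{url}">{name}</a></p>' for name, url in published_apps.items()
)
page = f"<html><body><h1>My Applications</h1>\n{links}\n</body></html>"

with open("desktop.html", "w") as f:
    f.write(page)  # open desktop.html in any browser to see the "desktop"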

Citrix is affirming its dedication to providing applications via both ICA and the Web, but eventually most people who need applications and not access to a desktop will use Web-published applications, whether the applications were developed for the Web or for Windows and then modified for viewing with a browser. Basically, it’s a change from the PC-as-world outlook to the PC as task-oriented appliance.

Computing as service commodity. Eventually, I expect to see end-user computing—even home-user computing—change its emphasis from the box on the desk to the applications and data available to you, wherever you are. It’s like the old argument about backups: The important item isn't that putty-colored box on your desk, it’s the data inside it. In this new computing model, your computer will be the least important part of the computing experience. Whether you’re on the road, at home, or in the office, you can dial in to your provider and download your data, your personal desktop settings (if you use a desktop), and your applications. This model is the ultimate extension of server-based computing: providing applications and data not only to people using in-house terminal services but to anyone who can connect to a provider.

Sounds crazy, perhaps. But when telephones first came into widespread use, people would have one telephone on a telephone table in the house. If you wanted to talk on the telephone, you sat at the table. As time went on, people got more telephones, so they weren’t restricted to a certain room. Then telephone cords became longer, so you could move about and talk at the same time. Then people got cordless telephones, so they could use the devices without having to be near the telephone connection at all. Finally, mobile telephones became popular, and you can now use a telephone almost anywhere. I expect computing to follow a similar pattern. Server-based computing may take off faster in less-developed countries because of the infrastructure shortage and the smaller number of existing legacy machines to support. Just as countries without extensive land-line networks came to rely on cell phones to provide telephone service, countries and areas without extensive PC networks are more likely to use server-based computing to give people access to their data.

That’s a while from now. Don’t look for application access as part of your ISP’s offerings any time this year, and probably not next. High-speed access isn’t yet sufficiently widespread, and the connections that are available aren’t always reliable enough to serve as the portal to mission-critical applications. Licensing and deployment methods are also an issue; ASPs and ISVs are still working out how best to deploy applications to corporate customers. Eventually, though, I suspect that many ISPs will provide at least some application publishing and data storage services. This approach will make it easier to bring computing to the masses on a truly global scale, instead of restricting access to the most urban areas and to those who are either technologically savvy or have access to people who are.
