
Datacenter Server Technology

The first large-scale enterprise service sneaks by us

As the Lab director for Windows 2000 Magazine, I spend a lot of time talking to product vendors. During a recent phone briefing, as I tried to separate product information from the marketing buzzwords the vendor was tossing at me, I had an epiphany. It came not as a vision of a heavenly chorus but rather as a flashback to an episode of The Muppet Show, in which Professor Bunsen Honeydew explains to the audience the mission of Muppet Labs: "Muppet Labs, Where the Future Is Tomorrow."

The truth of that statement is what makes it both ironic and humorous. But truth doesn't necessarily make something important or even useful. Sometimes, the important things are right in front of you, and it takes an outside event to make you realize the blindingly obvious.

In Search of the Obvious
The blindingly obvious thought I was missing before that phone call might not have been quite as obvious as it seems to me now. In the computer industry, and here at Windows 2000 Magazine, all sorts of services have been a topic of conversation for a while. We publish an online newsletter that highlights application service providers (ASPs), and we've written dozens of articles and commentaries about the future of enterprise services and their importance to our readers. Well, the first large-scale enterprise service from Microsoft has shipped, and nobody even noticed.

I know what you're thinking: "It wasn't an epiphany he had, it was a stroke. What the heck is he talking about?" Let's take a look at what defines an enterprise service. First, you shouldn't be required to personally support it. It should be reliable, with that reliability backed by some level of contractual guarantee; it should support your applications; and it should offer you a combination of performance, reliability, and availability that isn't easily accessible to you as an individual customer.

In September, Microsoft shipped a product that fits these criteria and more. That product is Windows 2000 Datacenter Server. Of course, Datacenter isn't really a shrink-wrapped product. You can't walk into a retail store and buy it. You can't even call the corporate sales rep at your favorite server vendor and have a system shipped to you. You buy Datacenter as a service from your system's vendor.

The Datacenter license agreement lets a customer purchase the software only preinstalled on a limited set of certified server products, but that setup isn't how Microsoft or the Datacenter OEMs picture the future. Customer support options are what will make Datacenter most appealing. Without a support agreement, you get software, a very expensive piece of hardware, and the headache of making sure everything works every time you change your server configuration. With a support agreement, you get a 99.9 percent uptime guarantee (which works out to less than 9 hours of downtime per year), a direct conduit into high-priority system software support from Microsoft, hardware and software replacement guarantees with a response time of no more than 4 hours, and a guarantee that your applications will work.

Although I'm writing this column before Datacenter's release announcement, I'm confident not only that OEMs will make the guarantees I mention (to get Microsoft certification, OEMs have to) but that you'll also see vendors vying to run your Datacenter server at their location. This approach—effectively, a colocated Datacenter server—lets vendors give you the highest level of service and makes Datacenter just another network service for you and your users. Voilà! The first large-scale enterprise service. And it just snuck up on you.

Applying Datacenter Technology
Quite apart from Datacenter's future as a network service, Datacenter contains new technologies that, although useful in the Datacenter product, Microsoft needs to add to the less exalted Win2K Server versions. I'm not talking about 32-processor SMP or 64GB memory support, or even 4-way failover clusters (although that last one would be nice). I'm talking about two completely new features: Winsock Direct and the Process Control Tool.

Winsock Direct for System Area Networks (SANs) provides extensions to the Winsock interface that let any Winsock-compliant application access SANs directly. Therefore, if your company has invested in SANs for storage, backup, or any other reason, any Winsock application running on TCP/IP can transparently access the SAN resources. Although SANs don't generally use TCP/IP for transport, using Winsock Direct as the application interface means that little, if any, custom coding is necessary. Given the level of aggravation we've encountered in the Lab when trying to get SAN systems running, we know that anything that can simplify the process should appeal to anyone planning a SAN implementation. Broader availability of the Winsock Direct technology should also benefit SAN providers wanting to expand their reach.
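To make the point concrete, here's a minimal sketch of the kind of ordinary Winsock client I have in mind; the server address, port, and message are placeholders I've chosen for illustration, not anything specific to Winsock Direct. The point is that, under Winsock Direct, the same unmodified code can be carried over a SAN interconnect.

    #include <winsock2.h>
    #include <string.h>
    #pragma comment(lib, "ws2_32.lib")

    int main(void)
    {
        WSADATA wsa;
        SOCKET s;
        struct sockaddr_in addr;
        const char *msg = "hello";

        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
            return 1;

        /* A plain TCP client: nothing here knows about the SAN.
           The address and port are placeholder values. */
        s = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
        if (s != INVALID_SOCKET) {
            memset(&addr, 0, sizeof(addr));
            addr.sin_family = AF_INET;
            addr.sin_port = htons(5000);
            addr.sin_addr.s_addr = inet_addr("10.0.0.50");

            if (connect(s, (struct sockaddr *)&addr, (int)sizeof(addr)) == 0)
                send(s, msg, (int)strlen(msg), 0);

            closesocket(s);
        }

        WSACleanup();
        return 0;
    }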

How much time have you wasted trying to figure out why a smoothly running server suddenly experiences resource problems, with some unknown application sucking up CPU cycles or memory? Task Manager provides only bare-bones information, and even the process viewer from the Microsoft resource kits gives you only the option of manually killing a process you think might be out of control. Win2K Server adds an extension to the process model called the Job Object, a kernel object you use to gather related processes and manage them as a group. In the lesser server versions, you can use the Job Object API to build applications that manage and manipulate Job Objects.
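To give you a feel for that API, here's a bare-bones sketch; the job name and the choice of Notepad as the child process are arbitrary examples, not anything Datacenter-specific. The program creates a Job Object, starts a process suspended, assigns it to the job, and then lets it run, so the job handle becomes the single point of control for the group.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        STARTUPINFO si = { sizeof(si) };
        PROCESS_INFORMATION pi;
        TCHAR cmd[] = TEXT("notepad.exe");   /* stand-in child process */

        /* Create a named Job Object to group related processes. */
        HANDLE hJob = CreateJobObject(NULL, TEXT("DemoJob"));
        if (hJob == NULL) {
            printf("CreateJobObject failed: %lu\n", GetLastError());
            return 1;
        }

        /* Start the child suspended so it can be added to the job
           before it begins running. */
        if (CreateProcess(NULL, cmd, NULL, NULL, FALSE, CREATE_SUSPENDED,
                          NULL, NULL, &si, &pi)) {
            AssignProcessToJobObject(hJob, pi.hProcess);
            ResumeThread(pi.hThread);
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        }

        CloseHandle(hJob);
        return 0;
    }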

Datacenter adds the Process Control Tool, a Microsoft Management Console (MMC) snap-in that gives you detailed control over Job Objects and lets you assign specific resources to specific applications to fine-tune the server. You can create rules that prevent objects from exceeding specified resource limits. For example, you can stop a poorly coded application that doesn't release heap memory correctly from claiming more and more memory until it brings critical applications to their knees and destroys server performance. You can use the Process Control Tool to perform nine tasks: allocate machine resources, organize processes into process groups, assign processor affinity, assign scheduling priority, assign working set limits, enforce user-CPU time and memory limits, limit the number of active processes, display statistics, and define the rules that the Process Control service will apply dynamically. All of these tasks give you a previously unheard-of level of control over server applications.
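Under the covers, rules like these rest on Job Object limits, so a small sketch shows the sort of thing the tool automates. Assume a job handle like the one created above; the specific numbers here (60 seconds of user-mode CPU time per process, 10 active processes) are made-up values for illustration.

    #include <windows.h>

    /* Apply per-job limits of the kind the Process Control Tool manages:
       cap user-mode CPU time per process and the number of active
       processes in the job. */
    BOOL ApplyJobLimits(HANDLE hJob)
    {
        JOBOBJECT_BASIC_LIMIT_INFORMATION limits;
        ZeroMemory(&limits, sizeof(limits));

        /* 60 seconds of user-mode CPU time per process,
           expressed in 100-nanosecond units. */
        limits.PerProcessUserTimeLimit.QuadPart = 60 * 10000000;

        /* Allow at most 10 simultaneously active processes in the job. */
        limits.ActiveProcessLimit = 10;

        limits.LimitFlags = JOB_OBJECT_LIMIT_PROCESS_TIME |
                            JOB_OBJECT_LIMIT_ACTIVE_PROCESS;

        return SetInformationJobObject(hJob, JobObjectBasicLimitInformation,
                                       &limits, sizeof(limits));
    }

The Process Control Tool's real value, of course, is that it applies rules like these dynamically through the Process Control service rather than requiring you to write and run code yourself.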

Both Winsock Direct and the Process Control Tool clearly need to become standard parts of the Win2K Server product set. Microsoft shouldn't reserve them for the crown jewel of the product line, especially given how limited Datacenter's distribution will be relative to the number of Win2K Server and Win2K Advanced Server systems in production use.
