
Running Them Up the Flagpole

Performance testing these Web server packages was a little simpler, logistically speaking, than analyzing their user-appreciation attributes (although we had problems with some products not working with our test). We designed a test that represents medium traffic on a typical Web server: a system that serves HTML pages, text files, and images rather than supporting video streaming, CGI programs, and so forth. This approach made the tests easier and faster to run and represents how some people will use the packages. Our test characterizes the average performance of each server package and demonstrates the differences in the software.

We ported Silicon Graphics' WebStone 2.0 to Windows NT, tweaked it a little, and ran it on a four-system setup in the Windows NT Magazine Lab, all on an isolated 10Base-T Ethernet network. The setup consisted of a Primary Domain Controller (PDC), the Web server, and two client systems dividing the simulated user access load. We used an NEC RISC-Server 2200 as the PDC, configured with a single 200MHz MIPS R4400, 64MB of RAM, and a 2GB SCSI disk (look for a review of this server in the October issue). On the PDC, we installed NT 4.0 beta 2 running DNS.

The Web server (on which we ran all the Web software packages) was an Intergraph InterServe Web-300, with a 150MHz Pentium Pro CPU, 64MB of RAM, and two 1GB fast SCSI-2 drives (for a review of this server, see Joel Sloss, "Serving with Style," on page 45). This system ran NT Server 3.51 with Service Pack 4.

A Micron Millennia and a Canon Innova Pro 5400ST were the clients. We configured each system with NT Workstation 3.51 and Service Pack 4, and installed Remote Shell and Executive services.

We configured the tests with runs of 20, 25, 30, and 35 client sessions distributed between the two workstation systems. Each run lasted 10 minutes, and we repeated each run three times. The clients retrieved a 5120-byte file (requested 50% of the time), a 500-byte file (35% of the time), and a 51,200-byte file (15% of the time) from the Web server. Each test took two hours to complete.
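For readers curious how such a weighted file mix plays out, the sketch below is a rough Python illustration of a single simulated client, not the actual WebStone client code (WebStone's clients are C programs driven by a file-list configuration). The server name and file paths are hypothetical.

```python
import random
import time
import urllib.request

# Illustrative file mix matching the percentages used in our runs.
# The paths are placeholders, not the files WebStone actually serves.
FILE_MIX = [
    ("/file5k.html", 0.50),   # 5,120-byte page, 50% of requests
    ("/file500.html", 0.35),  # 500-byte page, 35% of requests
    ("/file50k.html", 0.15),  # 51,200-byte page, 15% of requests
]

def pick_path():
    """Choose a file path according to the weighted mix."""
    paths, weights = zip(*FILE_MIX)
    return random.choices(paths, weights=weights, k=1)[0]

def run_client(server, seconds=600):
    """Fetch weighted-random files from the server for one 10-minute run."""
    deadline = time.time() + seconds
    pages = 0
    while time.time() < deadline:
        with urllib.request.urlopen(f"http://{server}{pick_path()}") as resp:
            resp.read()  # pull the whole body, as a real browser would
        pages += 1
    return pages
```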

To ensure that each test ran properly and that no leftover configuration from one package influenced another, we reinstalled NT Server for each run by booting the Web server into NT Workstation and copying a clean version of Server back to the root directory. This way, we restored the Registry and system files to their original state for each package.

We extracted meaningful results from the test by evaluating several factors for each server package: server connection rate (connections per second), server throughput (Mbits per second, or Mbps), average response time (in seconds), average client throughput (Kbits per second, or Kbps), and total number of pages read. Although we attempted to test all 15 Web servers, only 10 worked within our test criteria. Table A shows the statistical results, and Figure A shows the connection rates for the server packages we tested. We relied on server connection rate as the primary measure of each package's capacity under this workload. The other values show what Web users can expect from your system when you run each package.
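As a rough illustration of how these figures relate to the raw request data, the following Python sketch derives them from hypothetical per-request records; the record format, field names, and the even split of throughput across clients are our own simplifications, not WebStone's actual output.

```python
from dataclasses import dataclass

@dataclass
class Request:
    start: float   # seconds since run start
    end: float     # seconds since run start
    nbytes: int    # body bytes transferred

def summarize(requests, run_seconds=600.0, num_clients=20):
    """Derive the reported figures from a list of completed requests."""
    total_bytes = sum(r.nbytes for r in requests)
    conn_rate = len(requests) / run_seconds                   # connections/sec
    server_mbps = total_bytes * 8 / run_seconds / 1_000_000   # Mbits/sec
    avg_response = sum(r.end - r.start for r in requests) / len(requests)
    # Simplification: divide server throughput evenly among the clients.
    client_kbps = server_mbps * 1000 / num_clients            # Kbits/sec
    return {
        "connection_rate": conn_rate,
        "server_throughput_mbps": server_mbps,
        "avg_response_time_s": avg_response,
        "avg_client_throughput_kbps": client_kbps,
        "pages_read": len(requests),
    }
```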

Microsoft's IIS 2.0 edged out Netscape's SuiteSpot (and FastTrack Server 2.0) and Internet Factory's Commerce Builder Pro 1.51. Tight integration with NT contributed to this winning performance.

Design also had a significant impact on how each package performed during testing. Software such as IIS 2.0 had little effect on the Web server's CPU utilization during the test (barely 40% average utilization), whereas other packages, such as Quarterdeck's WebSTAR NT/95, maxed out the CPU. Quarterdeck pulled WebSTAR from the market before we finished the review, which is why we did not include it.
