
GNN1000 Host Adapter vs. a 10/100 Ethernet NIC

Fulfill your need for high-performance networking

Processor and memory speeds have grown along with users' need for speed. Unfortunately, the 10Mbps and 100Mbps network adapters in most Windows NT Server systems can't keep up. Of particular concern are network interconnects, the dedicated links that carry replicated data between two servers. Congested interconnects saddle network servers with higher latency, increased CPU utilization, and slower network response times. GigaNet provides a solution to your congested-interconnect problems: its GNN1000 Host Adapter offers 1Gbps speed and an efficient new OS interface to help clear out your bottlenecks. The GNN1000 increases the speed at which data moves between systems and across the network without increasing latency or CPU utilization.

VIA
In 1998, Compaq, Intel, and Microsoft launched the Virtual Interface Architecture (VIA) initiative with three goals: enabling the creation of high-performance, low-cost NICs; gaining market acceptance for clustering solutions; and creating a technology-independent architecture. These companies hoped the VIA initiative would encourage other vendors to use VIA to create faster network technology and replace the slower Network Driver Interface Specification (NDIS). More than 100 vendors have joined the initiative and are implementing VIA technology in their products.

GigaNet still uses NDIS in its products because VIA-enabled technology is not yet widely available. A benefit of NDIS 4.0 is that it lets you bind multiple NDIS-compliant NICs to one protocol stack, multiple protocols to one NIC, or multiple protocols to multiple NICs. NDIS has worked well in most networking situations; however, increasing data demands have exposed its limitations. NDIS's primary limitation is that all communication must travel through the OS kernel, which in NT's case is quite large. This slow journey through thousands of lines of code hinders network speed.

GigaNet created the GNN1000 as a VIA-enabled host adapter. The GNN1000 uses a VIA-enabled add-on to the adapter's NDIS driver. The VIA technology opens a path directly from the server's applications to the network wire, bypassing the OS kernel. This configuration frees the processor to work on other tasks rather than leaving it tied up with network traffic.

My Test Environment
To test the GNN1000, I used an HP 50" x 36" x 24" cabinet that houses two rack-mounted NetServer LPr servers, a 17" monitor, and eight 4.2GB hard disks. The keyboard, mouse, and monitor connect to one keyboard/video/mouse (KVM) switch, which lets me control both servers from one set of peripherals. Each server contains one 400MHz processor with 128MB of RAM, a 4.2GB hard disk, and a CD-ROM drive. In addition, each system has an HP 10/100 PCI Ethernet card for network communication and a GNN1000.

Because GigaNet created the GNN1000 to target the data-replication market, I installed Vinca's Co-Standby Server 1.02 for NT clustering software on my test system. GigaNet partners with Vinca to provide a high-speed interconnect for Co-Standby Server, which requires a dedicated interconnect for all data replication between the primary and secondary servers. (For my review of Co-Standby Server, see "Clustering Software for Your Network," July 1998.) After I installed Co-Standby Server, I configured four 4.2GB clustered volumes and clustered one IP address between the two systems.

You install and configure the GNN1000 the same way you install other network adapters. After I installed the adapter on my primary and secondary systems, I was ready to begin my tests.

The Tests
I tested the GNN1000's and the HP 10/100 PCI Ethernet card's data-mirroring speed to find out how fast each device replicated a complete data set between the primary system, AQUA-0, and the secondary system, VELVA-0. I also tested the speed at which each device transferred new files between AQUA-0 and VELVA-0.

For the first test, I wrote a batch file that simulated disk activity between AQUA-0 and VELVA-0 across the Vinca link. The batch file instructed AQUA-0 to delete the clustered hard disks' contents and then create 300MB files on each clustered hard disk; Co-Standby Server mirrored every operation to VELVA-0 over the Vinca link as it occurred. I ran the batch file from a command line on AQUA-0.
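A minimal sketch of such a batch file follows. The drive letters E: through H: for the four clustered volumes and the prebuilt 300MB seed file c:\seed.dat are assumptions for illustration; the actual drive letters and file-creation method can differ.

@echo off
rem Minimal sketch of the disk-activity test. The four clustered volumes
rem are assumed to be mounted as E: through H:, and c:\seed.dat is
rem assumed to be a 300MB file prebuilt on the local, nonclustered
rem system disk.

rem Delete the clustered volumes' contents; Co-Standby Server replicates
rem each deletion to the secondary server over the Vinca link.
for %%d in (E F G H) do del /q %%d:\*.*

rem Write a 300MB file to each clustered volume; every write is mirrored
rem across the interconnect as it occurs.
for %%d in (E F G H) do copy /b c:\seed.dat %%d:\test.dat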

To set up the second test, I reinitialized each of the clustered hard disks in the Vinca console. This process mirrored the data from AQUA-0 to VELVA-0. When you reinitialize a hard disk, the primary system deletes the secondary system's data and rewrites it with data from the primary system. I used NT's Performance Monitor on VELVA-0 to measure the devices' performance in both tests.

On Your Mark, Get Set ...
I opened a command prompt on AQUA-0 and ran the batch file to start the first test. The 10/100 NICs served as the network connection, and the GNN1000 adapters were the Vinca link. Performance Monitor gathered data on VELVA-0, and 10 minutes later the test completed. I saved the log file, set up another log with the same settings, and started the log for the second test.

For the next test, I opened the Co-Standby Server console and expanded the Volumes folder. I right-clicked a clustered volume, selected Reinitialize from the volume's menu, and clicked OK. I repeated this process for each clustered volume. The software mirrored each clustered volume's data from AQUA-0 to VELVA-0 via the Vinca link, and Performance Monitor logged the performance data; the mirroring finished within 5 minutes. When the test completed, I stopped the log.

Next, I reconfigured the network settings on AQUA-0 and VELVA-0 so that the two adapters swapped functions. Thus, the 10/100 NICs served as the Vinca link and the GNN1000 adapters served as the network connection. I rebooted AQUA-0 and VELVA-0 and reran the tests.

The Checkered Flag
I tabulated the four logs and analyzed the results of the two tests (Table 1 outlines the results of the first test, and Table 2 outlines the second test's results). I found that using the GNN1000 as your replication link provides substantial performance increases, even if you use NDIS 4.0 drivers.

The GNN1000 costs much more than a standard 10/100 NIC. However, the performance increase and your users' satisfaction are reason enough to purchase the GNN1000.

GNN1000 Host Adapter
Contact:
GigaNet * 978-461-0402
Web: http://www.giganet.com
Price: $799
System Requirements: Windows NT Server or Workstation 4.0, 10MB of hard disk space, PCI specification-compliant video card, available 32-bit or 64-bit PCI slot