
Optimizing Performance over Slow Connections

Fulfill your Web site visitors' performance expectations

Despite optimistic predictions about ubiquitous broadband Internet access, a huge percentage of Internet users still access your Web site at a pathetic 56Kbps. In fact, I'd wager that nearly 50 percent of the user population visits sites at speeds substantially slower than snappy DSL or cable-modem connections. In the late 1990s, the erstwhile Zona Research published an oft-quoted study showing that most users will leave a Web site if their requests aren't satisfied within 8 seconds. That might seem like a long time, but you'd probably be surprised at how many sites fail to fulfill that expectation. To make sure your site isn't one of them, you need to understand the page-design factors that affect performance, develop techniques to improve your site's performance, and regularly monitor and measure that performance.

How Site Design Affects Performance
End-to-end Web site performance is a function of several factors besides how a page is designed. For example, if a typical multi-tiered application infrastructure sits behind your Web servers, the time required for something to appear on the user's desktop will be the sum of the times that each tier (e.g., application server, database) takes to satisfy its portion of the user's request. Most multi-tiered Web applications are composed of many different third-party products (e.g., business logic based on Java 2 Enterprise Edition—J2EE—servers, Active Server Pages—ASP, or Microsoft SQL Server or Oracle databases), but tuning response time for application-server and database tiers is outside the scope of this article. I concentrate on the things you can do in the "presentation" tier (i.e., designing the pages that users see in their Web browsers) to ensure that, as long as the rest of your application is humming, users will have a good experience.

Web pages that contain a lot of objects (e.g., .gif files, frames, style sheets) are slower to display than those with fewer objects. Add Secure Sockets Layer (SSL) and its handshaking process to the mix, and you further increase both the time pages take to download and the processing horsepower needed to serve those pages. In particular, pages with frames are quite slow because the browser treats each frame as a different object that requires a new connection. If you're building a page with performance in mind, less is more: The fewer objects on a page, the quicker that page will arrive at the user's desktop. Fewer objects correspond to fewer TCP connections that the browser must create and maintain. When you can, combine multiple objects that are spatially close together on a Web page into one image file.
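
For example, a navigation bar built from four separate button images costs the browser four requests; redrawing it as one combined image and mapping its clickable regions with a client-side image map cuts that to a single request. The following sketch is hypothetical (the file names and coordinates are only illustrative):

<img src="navbar.gif" width="400" height="50"
   border="0" usemap="#navmap">
<map name="navmap">
   <area shape="rect" coords="0,0,99,49"
      href="home.html">
   <area shape="rect" coords="100,0,199,49"
      href="products.html">
   <area shape="rect" coords="200,0,299,49"
      href="support.html">
   <area shape="rect" coords="300,0,399,49"
      href="contact.html">
</map>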

Object size also can affect performance. TCP/IP packets typically hold 1460 bytes of data in addition to the packet header. So, if a .gif file on your page is 1461 bytes in size, an extra round-trip between the browser and your Web server is required to bring down that extra byte. To minimize round-trips between your Web server and the user's browser, try to limit the size of objects to multiples of 1460 bytes or combine small objects into one object that doesn't exceed that size.

Another aspect of page and site design that affects performance is whether your pages are static or dynamic and whether static objects and pages are set to expire immediately or after an interval. Modern e-commerce sites are serving up increasing percentages of dynamic content. From a user's perspective, having a lot of dynamic content is good. However, from a performance perspective, dynamic content presents some challenges. A benefit of static HTML content is that you can cache it at various points between your Web infrastructure and clients. Reverse proxies, forward proxies, and browser caches can typically handle caching of static content. When a page is cached, clients don't have to get the page from the Web server each time they request it, so pages appear quickly. Dynamic content, because it's just that—dynamic—doesn't benefit from such caching unless you use a product such as FineGround Networks' FineGround Condenser Acceleration Suite 4.0, which uses a caching approach to speed dynamic-content delivery (I discuss the product later). You can, however, use some other techniques and third-party products, which I also discuss later, to optimize delivery of dynamic content. If your dynamic site's performance is suffering, consider one of those approaches.
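
Whether a cache may hold onto an object, and for how long, is spelled out in the object's HTTP response headers. For example, a static logo served with headers like the following (the values are purely illustrative) can be reused by browser and proxy caches for a full day, whereas a response marked to expire immediately forces a fresh trip to the server every time:

HTTP/1.1 200 OK
Content-Type: image/gif
Cache-Control: max-age=86400
Expires: Thu, 15 May 2003 08:00:00 GMT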

Everything I've mentioned so far illustrates the tension between building rich Web content designed to attract viewers and presenting that content speedily to users who visit your site through "bandwidth-challenged" connections. Let's look at some techniques you can use to optimize Web page delivery over slow connections.

Using Compression
One easy solution for speeding delivery of static HTML content won't cost you a dime, yet you might not know about it. Gzip is a standards-based compression format that most Web servers (including IIS, Apache, and Sun Microsystems' Sun ONE Web Server—formerly iPlanet Web Server, Enterprise Edition) support. Gzip lets you send compressed static .html files to the browser, which automatically expands and renders them. Both Microsoft Internet Explorer (IE) and Netscape browsers support gzip decompression natively (for gzip compression to be recognized, IE requires that you enable HTTP 1.1).
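
The negotiation happens in the HTTP headers: the browser advertises the encodings it can decompress, and the server labels the compressed response so the browser knows to expand it before rendering. A simplified exchange (the file name and sizes are only illustrative) looks like this:

GET /default.htm HTTP/1.1
Accept-Encoding: gzip, deflate

HTTP/1.1 200 OK
Content-Encoding: gzip
Content-Length: 36120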

Enabling gzip compression of static HTML files and dynamic ASP-based applications in IIS 5.0 is straightforward. On your Windows 2000 server, click Start, Programs, Administrative Tools, then select the Microsoft Management Console (MMC) Internet Services Manager snap-in. Within the snap-in, right-click the server's name and select Properties. In the Master Properties dialog box, select WWW Service, click Edit, then choose the Service tab. As Figure 1 shows, the HTTP Compression section lets you enable and disable compression for both static and dynamic pages (i.e., .asp files).

If you want to compress only static HTML pages, select Compress static files. To compress ASP pages as well, also select Compress application files. By default, IIS caches compressed static pages in \%systemroot%\IIS Temporary Compressed Files, although you can change that path. You can also tell IIS to limit the size of the temporary folder to prevent cached compressed pages from filling up your disk. Because dynamic files change frequently, compressed ASP pages aren't cached in the temporary folder. After enabling compression, restart IIS for the change to take effect.
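
If you'd rather script the change than click through the MMC, the same switches live in the IIS metabase, and you can flip them with the adsutil.vbs script that ships with IIS. The following sketch assumes a default installation path and the HcDoStaticCompression and HcDoDynamicCompression metabase properties; verify the property names against your IIS documentation before relying on it:

cd %systemdrive%\inetpub\adminscripts
cscript adsutil.vbs set w3svc/filters/compression/parameters/HcDoStaticCompression true
cscript adsutil.vbs set w3svc/filters/compression/parameters/HcDoDynamicCompression true
iisreset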

Figure 2 and Figure 3 illustrate the benefits of gzip compression. Figure 2 shows a Page Detailer trace of a Web page that I downloaded before I implemented compression (I describe the Page Detailer tool later). Note that the main HTML page, which I've selected in the left-hand pane, is 201,058 bytes long and took 1.242 seconds to download. Figure 3 shows a trace of the same page after I enabled compression, as you can tell by the vise icon that appears in the treeview. The main HTML page is much smaller, shrinking by about 165KB compared with the uncompressed version. The trace also shows that the download time for the main HTML page dropped from 1.242 seconds to 0.110 seconds.

One caveat about using gzip is that it adds to your Web servers' CPU utilization. Using compression involves a trade-off: To provide a better user experience for bandwidth-challenged end users, you require your Web servers to do a bit more work to serve up a page. To ensure that your system isn't getting bogged down performing compression, use Performance Monitor or a similar tool to monitor CPU utilization before and after you enable compression.
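
As a hypothetical example, on servers that include the command-line typeperf utility (it ships with Windows Server 2003 and Windows XP; on Windows 2000, Performance Monitor can chart the same counter interactively), the following command logs overall processor utilization every 15 seconds for an hour so that you can compare a "before" run with an "after" run:

typeperf "\Processor(_Total)\% Processor Time" -si 15 -sc 240 -o cpu-before-compression.csv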

Caching in ASP.NET
If you use ASP.NET-based Web applications, you should become familiar with that technology's output-caching feature. ASP.NET provides several ways to cache dynamic pages, such as caching whole pages or page fragments. Output caching is useful for dynamic pages that don't change frequently (i.e., that contain basically the same data regardless of the user) but for which the page generation can be time-consuming. For example, if a page needs to talk to a back-end database to get data that changes only every day or even every hour, caching the results of the first request for that data can save subsequent trips to the database when another user asks for the same information.

You enable output caching within your ASP.NET pages by using page directives—instructions that tell ASP.NET to do something on the server side before serving the page to the client. For example, at the beginning of an .aspx file that you want to cache, you might add the following directive:

<%@ OutputCache Duration="10"
   VaryByParam="none"%>

This directive tells ASP.NET to cache the page for 10 seconds and to serve every request for the page from that one cached copy, regardless of any query strings the user passes with the page request. You can also use the VaryByParam attribute to cache based on the query string that the request passes. For example, suppose a user passes a query string requesting the page to return a set of sales-related database records for a particular geographic region. The page request might look something like

http://www.mycompany.tld/
   getsales.aspx?region=west

Now, suppose you want to cache all page requests for 30 seconds per sales region. You could do so by using the page directive

<%@ OutputCache Duration="30"
   VaryByParam="region"%>
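
ASP.NET can also cache just a fragment of an otherwise dynamic page: you move the slow-changing region into a user control (an .ascx file) and give that control its own OutputCache directive. In the following hypothetical sketch, the control's file name and the 60-second duration are only examples:

<%@ Control Language="C#" %>
<%@ OutputCache Duration="60"
   VaryByParam="none"%>

The hosting page then registers and uses the control, and ASP.NET regenerates the cached fragment at most once a minute:

<%@ Register TagPrefix="uc"
   TagName="SalesSummary"
   Src="SalesSummary.ascx" %>
<uc:SalesSummary runat="server" />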

Output caching provides significant flexibility for caching dynamic Web pages in ASP.NET. To learn more about the other variations of this technology, check out Rob Howard's Microsoft Developer Network (MSDN) Library column Nothin' but ASP.NET, "Page Output Caching, Part 1" (http://msdn.microsoft.com/library/en-us/dnaspnet/html/asp03282002.asp?frame=true) and the output-caching discussion in the Microsoft .NET Framework software development kit (SDK) documentation (http://msdn.microsoft.com/library/en-us/cpguide/html/cpconcachingportionsofaspnetpage.asp?frame=true).

Performance-Measuring Tools and Services
Most Web site performance-measuring tools are fairly quantitative in nature and can't really measure the elusive customer-experience aspect of site performance. However, if you want to compare apples to apples between different sites or measure the differences in performance between two technological approaches, these tools and services fill the bill.

In the Microsoft world, the performance-measuring tool to start with is the Microsoft Web Application Stress (WAS) tool, which is available for download at http://webtool.rte.microsoft.com. WAS is actually a load-testing tool, but it can give you a measure of a site's performance. I find WAS to be fairly limited compared with a full-featured Web load-testing tool such as Mercury Interactive's LoadRunner, but WAS, of course, is free.

Another performance tool that I highly recommend is IBM's Page Detailer utility. Although Page Detailer is part of IBM WebSphere Studio Professional Edition, it can be an extremely useful tool for administrators (e.g., those in heterogeneous shops) who have access to it. You can find an evaluation copy of Page Detailer at http://www.research.ibm.com/pagedetailer. Figure 2 shows an example of the kind of output that Page Detailer generates. The left-hand pane lists each object on a page and the object's size and download time. The right-hand pane presents a Gantt-style chart showing how the browser downloads each object with respect to the others. You can right-click each object on the left and choose Properties to view object-related data such as the HTTP header data and socket information. This information reveals object size, whether objects are being cached, and whether your browsers are efficiently using TCP connections.

Page Detailer can't show you detailed information about objects on a page that's SSL encrypted. This apparent limitation makes sense because you shouldn't be able to view a page's encrypted contents outside of the browser. However, another tool that I rely on gives you this capability. Systems Software Technology's (SST's) TracePlus/Web Detective provides the same size, header, and connection information for SSL-encrypted files that Page Detailer does for unencrypted files, but Web Detective's information is quite a bit more detailed. Web Detective also lets you trace SSL traffic—a useful feature that, unfortunately, works only with IE browsers.

In addition to tools that you can run yourself, several performance service providers available on the Internet can monitor your Web site's performance on an ongoing basis and help you manage service level agreements (SLAs) for your site. The granddaddy of these providers is Keynote Systems. Keynote was in its heyday during the dot-com explosion and continues to provide valuable remote measurement services for sites today. Keynote has installed agents around the United States and internationally that can measure the performance of pages or transactions (which correspond to a series of pages) on your site from a number of geographic locations, thus letting you understand the level of network coverage you provide for your users. Other companies that provide similar services include Mercury Interactive and a Mercury Interactive subsidiary called Freshwater Software.

Performance-Improvement Services
If you measure your site's performance and find it lacking after you've done everything you can to improve its design, you might be ready to get some professional help. Several third-party vendors provide solutions to optimize content delivery. Chief among these are the Content Distribution Network (CDN) providers such as Akamai Technologies. Akamai can improve your Web site's performance by distributing some or all of your Web pages to thousands of servers located close to the Internet's "edge" (i.e., near the user), thus minimizing the network latency that an end user experiences when visiting your site. Typically, vendors like Akamai are in the business of distributing static content such as GIF and JPEG objects rather than dynamic content, but Akamai now delivers dynamic content as well. For more information about CDNs, see Tao Zhou's Windows & .NET Magazine article "Speedy Web Content Delivery with CDNs," http://www.winnetmag.com, InstantDoc ID 21741.

Other companies, such as FineGround and Pivia, also offer solutions for optimizing dynamic-content delivery. FineGround's solution, FineGround Condenser Acceleration Suite 4.0, is software that sits in front of your Web server infrastructure and delivers dynamic pages to users by delivering only the deltas between visits. As a result, you don't need as much bandwidth to serve the content as you would without FineGround Condenser Acceleration Suite. And, because users' browsers download fewer bytes on subsequent visits, they display pages more quickly. Other approaches include those from companies such as BoostWorks, which seeks to improve end-user performance by intelligently compressing your content on the fly as it leaves your Web server infrastructure. BoostWorks then delivers the compressed content to the browser, which uses standard browser technology, such as gzip, to uncompress the content.

Many Ways to Improve Performance
Although quantifiable measurements of customer experiences are difficult to obtain, you can be sure that if your pages take more than 8 seconds to appear, your Web site will lose eyeballs and possibly business. You can't do anything to make DSL or cable modems available to all your Web site's end users, but you have several ways—whether through technology built into your Web servers or through third-party tools or services—to improve the experience of users who surf your site over slow connections.
