Don't Assume It Works: Testing Your Application Before Deployment

I remember taking an Advanced Data Structures class in college, and during the last week of the semester, the computers got so busy that it could take 30 minutes or more just for a program to compile. (This was back in the dark ages when we all connected to the same computer using "dumb" terminals.) I also remember some of my classmates eagerly awaiting the completion of a compilation, saying they'd had only a few errors the last time through and were sure those were gone now. The implication was that as soon as the program compiled, they were practically done. It made me wonder if they did any testing at all. Did they stop at a successful compilation and just assume the program worked?

I was well aware that a compiled program didn't necessarily mean a working program. I still tested my program to make sure it would return the expected results and handle error conditions gracefully. Way back then, we didn't even consider how fast the program performed, as long as we got the right answer. But without testing their programs, my classmates had no guarantee of anything approaching correctness.

Now fast-forward 15 years or so. I had just become an independent consultant after leaving Sybase, and one Monday morning I was called out to Orlando for an emergency tuning job on a Sybase application that had gone live that morning. (Orlando was three hours ahead of my time zone.) I was on a plane that afternoon and onsite at 8 A.M. ET Tuesday morning. The only information I had was that the application had been thoroughly tested but didn't work when deployed into production. It didn't take me long to determine that the developers' definition of "thoroughly tested" was very different from mine. Apparently, they had verified that all the application screens worked perfectly and returned the correct results in the correct boxes, but they hadn't tested the application with more than one operation being performed at a time. So I got my first trip to Disney World because of their lack of a testing protocol.

The main problem they encountered because of a lack of multiuser testing was blocking. They hadn't even considered what would happen when two different clients were trying to update the same data concurrently. (And of course, after deployment, they had situations in which a lot more than two clients were trying to access the same data at the same time.) The second problem was that they had no idea if their physical resources could handle the expected load. Could their disk system and memory cache handle all the data access they were expecting? Did their network have the bandwidth to deal with the client/server communication of the expected number of users? They had no idea because they had never done any testing beyond making sure the application "worked," and for them that meant that it seemed to be returning the right data when one user was running the application.
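You don't need any special tools to see the kind of blocking they ran into; two query windows are enough. Here's a minimal sketch of the scenario, using a hypothetical dbo.Accounts table (substitute any table your own application updates):

-- Session 1: update a row inside a transaction and leave the
-- transaction open, simulating a long-running operation.
BEGIN TRANSACTION;
UPDATE dbo.Accounts
SET Balance = Balance - 100
WHERE AccountID = 42;
-- No COMMIT yet; Session 1 still holds an exclusive lock on the row.

-- Session 2 (a separate connection): this statement blocks,
-- waiting on Session 1's lock, until Session 1 commits or rolls back.
UPDATE dbo.Accounts
SET Balance = Balance + 50
WHERE AccountID = 42;

-- Back in Session 1: releasing the lock lets Session 2 proceed.
COMMIT TRANSACTION;

With only one user running the application, this situation can never arise, which is exactly why their single-user testing didn't catch it.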

Software testing is a huge field, and I'm not going to pretend to be an expert in it. But at least I know it needs to be done. A quick search on Amazon.com returned almost 50 books dealing with just software testing. There are tools available for testing your applications, but I know many people who avoid multiuser testing because the commercial load testing tools are so expensive. The commercial load testing tools for SQL Server let you submit multiple types of queries, specify how many times each should run, and set the delay between subsequent executions. They also keep all kinds of statistics so that you can analyze the results of your tests. These tools are very nice to have, but if you can't afford them, that doesn't mean you should forgo testing altogether.

How much does it cost to have your whole development team pound on your application at once? It might not match the number of users or the load you'll see in production, but it's better than doing no load testing at all. There are also free tools available to help you run multiple queries simultaneously to test concurrency or to generate a load on your hardware resources so you can determine how much your system can handle. Here are two free testing tools that just came to my attention this week.

The first free tool, SQL Load Generator, was written by a program manager at Microsoft to help him generate a load on his system so he could demonstrate the use of SQL Server 2008’s Resource Governor. You can find his tool at http://www.codeplex.com/SqlLoadGenerator.  The second free tool is a query load testing tool called SQLQueryStress, which was written by SQL Server MVP Adam Machanic. You can access SQLQueryStress through his blog at http://sqlblog.com/blogs/adam_machanic/archive/2007/06/28/new-version-of-sqlquerystress-released.aspx.
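Even with no tool at all, a simple T-SQL loop run from several connections at once can stand in for a load generator. Here's a rough sketch; the dbo.Orders query is just a placeholder for a statement your application actually runs:

DECLARE @i int;
SET @i = 1;
WHILE @i <= 1000
BEGIN
    -- Replace with a representative query or update from your application.
    SELECT COUNT(*) FROM dbo.Orders WHERE CustomerID = @i % 100;

    -- A short pause between executions, similar to the delay the
    -- commercial tools let you configure.
    WAITFOR DELAY '00:00:01';

    SET @i = @i + 1;
END;

Have each tester run the script from a separate connection, and you get genuinely concurrent access to the same data rather than one connection running queries in sequence.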

So even if you can't afford full-featured load testing and analysis tools, you can still perform your own load testing. The following are my testing 1, 2, 3 recommendations:
1. Test with real data.
2. Test with real data volumes (see the sketch after this list).
3. Test with a real number of concurrent users.
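For recommendation 2, one simple way to approximate production volume is to seed a test table with a sample of real rows and then repeatedly double it. A rough sketch, using a hypothetical dbo.OrdersTest table:

-- Assumes dbo.OrdersTest has already been seeded with a sample of
-- real production rows; each pass through the loop doubles the row count.
WHILE (SELECT COUNT(*) FROM dbo.OrdersTest) < 1000000
BEGIN
    INSERT INTO dbo.OrdersTest (CustomerID, OrderDate, Amount)
    SELECT CustomerID, OrderDate, Amount
    FROM dbo.OrdersTest;
END;

Duplicated rows aren't as good as a true copy of production data because the data distribution ends up artificially skewed, but they'll at least show you how your queries and hardware behave at realistic table sizes.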

Of course, if you choose not to test your applications, you'd better make sure you're located somewhere a tuning consultant might want to pay a visit.
