I'm trying to create a baseline of SQL Server performance by using information from the Windows 2000 Performance Monitor. Performance Monitor's default refresh rate for sampling information is 15 seconds. Would lowering this default value and sampling more often create an excessive burden on the server?
Answers to performance-tuning questions depend on circumstances, but I almost always use a 1-second sampling interval when running Performance Monitor. Let's see why I normally use a 1-second interval, then explore whether using a longer sampling interval ever makes sense.
You need to consider two fundamental truths when discussing performance tuning. The first is related to the Heisenberg Uncertainty Principle. (My last formal physics class was about 13 years ago, so please excuse the scientific butchering that's about to take place.) The Heisenberg Uncertainty Principle is a rule generally applied to the study of quantum mechanics. In simple terms, the rule says that you can't observe the behavior of something without affecting its behavior. Performance tuning isn't quantum mechanics, but it does conform to this principle in that the simple act of watching the performance of a system affects the system's performance. The impact of watching a system increases as you capture more detailed and thus more helpful information. The trick is to understand the impact you're having and to make sure that impact is reasonable and isn't excessively skewing your results.
Setting Performance Monitor's sampling interval to 1 second will take a heavier toll on the system than using the default, but you'll find that this overhead will rarely have a large impact on the results you're capturing. Performance Monitor isn't a very intrusive application.
The second fundamental truth of performance tuning is that it's hard to fix problems you don't see. If you don't have solid metrics to help you battle the performance gremlin that's been bedeviling your users for weeks, you're in trouble. Unless you have accurate and specific performance numbers to aid in your sleuthing, you're mostly guessing.
If you leave the sampling interval at 15 seconds, you'll likely miss your system's most interesting and important performance characteristics. Computer systems, especially databases, rarely behave in a predictable, linear manner; typical systems exhibit a wide range of performance behavior within any 15-second period. If you miss problems that occur inside that window, you won't be able to tune performance effectively, and you'll end up with misleading baseline statistics.
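To see concretely how a long interval can hide a problem, here's a minimal Python sketch (my own illustration, with made-up utilization numbers, not anything captured from a real server). It simulates one minute of per-second CPU readings containing a brief spike, then compares what 1-second sampling and 15-second sampling would each observe:

```python
# Simulated "true" per-second CPU utilization for one minute:
# a steady 20% load with a 3-second spike to 95% at seconds 40-42.
cpu = [20.0] * 60
for t in range(40, 43):
    cpu[t] = 95.0

# Sampling every second sees every reading, including the spike.
one_second_samples = cpu
print(max(one_second_samples))   # 95.0 -- the spike is captured

# Sampling every 15 seconds observes only seconds 0, 15, 30, and 45.
# Whether a short spike shows up at all depends on where the sample
# points happen to land; here they all miss it.
fifteen_second_samples = [cpu[t] for t in range(0, 60, 15)]
print(max(fifteen_second_samples))   # 20.0 -- the spike is invisible
```

The 15-second trace suggests a perfectly flat 20% load, so the baseline you'd record from it would be misleading; the 1-second trace exposes the spike you actually need to investigate.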