
Three Options for SQL Server Performance Tuning

With the increasing use of the cloud (and the cloud's pay-for-what-you-provision cost structure), SQL Server performance tuning is more important than ever.

Database performance tuning may not be database pros' favorite task, but leaving performance to chance will cost your company more time and money (and, possibly, customers) than ever before. 

Performance tuning is a task that most database professionals, developers and administrators alike, find themselves absorbed in on a regular basis. One of my clients, for example, recently asked me to look into high CPU use in its AWS EC2-hosted SQL Server environment.

For those not familiar with the terminology, an EC2 instance is a dedicated virtual machine hosted in Amazon’s AWS Cloud. EC2 instances come in various compute sizes, and you pay based upon the amount of CPU and RAM that is provisioned (in addition to other costs, of course). Most cloud providers use similar cost structures, where compute and memory are the driving factors for what they charge to provision assets in their environments.

Now that cloud solutions providing SQL as a service are becoming more prevalent, it’s more important than ever to focus on tuning your SQL Servers for optimum performance. Why? Because you pay for your poorly performing code and the effects it has in your databases in terms of real dollars. And that's in addition to the licensing costs already associated with per-core licensing and the possible churn in your customer base if your performance degrades to the point where customers leave for better-performing experiences offered by your competition. 

There is more than one way to troubleshoot high-CPU queries and stored procedures in Microsoft SQL Server, and more than one way to measure the iterative improvements you make during tuning. The options include:

  • Dynamic Management Views
  • Time Statistics
  • SQL Server Management Studio Client Statistics

Let’s look at each one of these possible options so you can choose the best fit for your tuning tasks. We will do so based upon a test case I’ve set up inside my demo environment.  I’ve created a very simple stored procedure in a copy of the SQL Server sample database known as Wide World Importers:

CREATE PROCEDURE dbo.pOption1
AS
BEGIN
    -- Return readings for cold room sensor 1 that fall below the value
    -- returned by the scalar user-defined function dbo.OptimalValue
    SELECT A.RecordedWhen, A.Temperature
    FROM dbo.Temps AS A
    WHERE A.Temperature < dbo.OptimalValue(A.ColdRoomSensorNumber)
        AND A.ColdRoomSensorNumber = 1;
END
GO

I wanted something that would be poorly performing, so I added a user-defined function as a predicate inside the stored procedure. In each of the options below, I’ll show you how you can collect an initial baseline for a stored procedure you’re preparing to tune inside your test environment so you can measure improvements over time throughout your tuning process. For the sake of this article, I’m assuming you have already identified a stored procedure that requires tuning. If you’re looking for insight into how to identify such stored procedures, I recommend an earlier article of mine here.
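The supporting objects aren’t shown here. If you want to follow along, the sketch below is roughly what dbo.Temps and dbo.OptimalValue could look like; the column names mirror Warehouse.ColdRoomTemperatures in the WideWorldImporters sample, but these exact definitions are my assumption rather than part of the original setup:

-- Illustrative only: a table of sensor readings modeled on Warehouse.ColdRoomTemperatures
CREATE TABLE dbo.Temps
(
    ColdRoomTemperatureID bigint IDENTITY(1,1) PRIMARY KEY,
    ColdRoomSensorNumber  int NOT NULL,
    RecordedWhen          datetime2(7) NOT NULL,
    Temperature           decimal(10,2) NOT NULL
);
GO

-- Illustrative only: a scalar user-defined function used as a predicate.
-- A scalar UDF in a WHERE clause is evaluated row by row, which is what
-- makes the stored procedure a deliberately poor performer.
CREATE FUNCTION dbo.OptimalValue (@ColdRoomSensorNumber int)
RETURNS decimal(10,2)
AS
BEGIN
    RETURN
    (
        SELECT AVG(Temperature)
        FROM dbo.Temps
        WHERE ColdRoomSensorNumber = @ColdRoomSensorNumber
    );
END
GO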

Option One: Dynamic Management Views

There are two Dynamic Management Views (DMVs) that are relevant when looking at CPU and duration for SQL Server queries: sys.dm_exec_query_stats and sys.dm_exec_procedure_stats. The former provides insight into ad hoc queries, while the latter provides details on stored procedures that have been executed. Other than that difference, the DMVs are quite similar in design and results. They each return execution information covering the total, last, minimum and maximum metrics for everything from I/O (reads and writes) to CPU and duration. Since the example I’m using is a stored procedure, I’ll be querying results from sys.dm_exec_procedure_stats. The following query provides information on the last time the stored procedure was executed:

SELECT OBJECT_NAME(ePS.object_id, ePS.database_id) AS procedure_name
     , ePS.execution_count
     , ePS.last_worker_time / 1000.0 AS last_worker_time_ms
     , ePS.last_elapsed_time / 1000.0 AS last_elapsed_time_ms
FROM sys.dm_exec_procedure_stats AS ePS
WHERE OBJECT_NAME(ePS.object_id, ePS.database_id) = 'pOption1';

I’ve converted last_worker_time and last_elapsed_time from their collected microsecond values into milliseconds to make them comparable with the other two options in this article. Milliseconds are generally the more common unit of measure for execution and CPU time. The results look like this:

[Query results showing the procedure name, execution_count, last_worker_time_ms and last_elapsed_time_ms for pOption1]

What I like about using the DMVs is that there is so much information available within each one, and you can easily join DMVs together to yield even more details than you can from the next option, the display of time statistics.

This information is also already cached, so you don’t need to execute the query each time to get information, so long as you’ve executed the stored procedure you’re reviewing at least once. Modifying this query to include the average worker time (CPU time) and average elapsed time (the amount of time composed of not just CPU time but also the time spent waiting to get onto the processor through the scheduling process) is as easy as dividing by the execution count.
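A sketch of that modification, using the total columns the DMV already exposes (the column aliases are my own):

SELECT OBJECT_NAME(ePS.object_id, ePS.database_id) AS procedure_name
     , ePS.execution_count
     -- average CPU (worker) time per execution, in milliseconds
     , (ePS.total_worker_time / 1000.0) / ePS.execution_count AS avg_worker_time_ms
     -- average elapsed time per execution, in milliseconds
     , (ePS.total_elapsed_time / 1000.0) / ePS.execution_count AS avg_elapsed_time_ms
FROM sys.dm_exec_procedure_stats AS ePS
WHERE OBJECT_NAME(ePS.object_id, ePS.database_id) = 'pOption1';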

This is the only option out of the three that is also a good diagnostic for making the initial assessment of which procedures are good candidates for tuning: You can build queries against sys.dm_exec_procedure_stats and run them in your production environment without having to run the procedures yourself, either in production or in your testing environments. You can see the behavior of past executions because the metadata that “feeds” these DMVs is cached over time.
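A minimal sketch of that kind of diagnostic query, ranking cached stored procedures by total CPU time (the TOP value is an arbitrary choice of mine):

-- Find the most CPU-hungry stored procedures currently tracked in the plan cache
SELECT TOP (10)
       OBJECT_NAME(ePS.object_id, ePS.database_id) AS procedure_name
     , ePS.execution_count
     , ePS.total_worker_time / 1000.0 AS total_worker_time_ms
     , (ePS.total_worker_time / 1000.0) / ePS.execution_count AS avg_worker_time_ms
FROM sys.dm_exec_procedure_stats AS ePS
ORDER BY ePS.total_worker_time DESC;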

Option Two: Time Statistics

The display of time statistics is enabled/disabled inside of SQL Server Management Studio (SSMS) at a session level by executing the following command:

SET STATISTICS TIME { ON | OFF }

The information is displayed on the Messages tab that is generated when a query is executed in SSMS:
SET STATISTICS TIME ON 
EXEC dbo.pOption1;
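For reference, the output on the Messages tab looks something like the following; the timings shown here are placeholders rather than measurements from my environment:

 SQL Server parse and compile time:
    CPU time = 0 ms, elapsed time = 0 ms.

 SQL Server Execution Times:
    CPU time = 1234 ms,  elapsed time = 1298 ms.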

There are some drawbacks to relying on STATISTICS TIME. The results are only calculated at execution time, and you need to manually note and compare the figures after every change in order to track the iterative impact of your tuning. The final option overcomes that manual bookkeeping, which is why I consider it a better alternative to STATISTICS TIME.

Option Three: Show Client Statistics

This is the most obscure option of the three, which is surprising because the button that enables this feature sits right on the toolbar in SQL Server Management Studio.

You can also access this feature from the menu bar in SSMS under Query\Include Client Statistics or through the keystroke combination of Shift+Alt+S.

Any of these three enabling options will provide an additional tab each time a query is executed inside a query window in SSMS. Each query window is a distinct session, and the client statistics are distinct to each execution of any code in that specific query window. This is what the client statistics look like for the query window corresponding to the same query executed in each of the scenarios above:

[Client Statistics tab comparing Trial 1, Trial 2 and Trial 3, with the CPU time and total execution time rows highlighted]

There are a few things I want to draw your attention to in the example above. I’ve highlighted in yellow the same metrics around CPU time (that is, worker time) and execution time that the other options provide. As with the DMVs, the client statistics are not limited to just CPU time and duration, which makes them an option for tuning more than just CPU.

You may wonder why the performance on Trial 2 is so different, and this is the issue I hinted at earlier: These statistics are captured for any code executed inside the current query window. Trial 1 and Trial 3 are associated with the EXEC call to pOption1, but Trial 2 was the result of code I ran to clean the buffer pool between executions (sketched below). Be aware of this behavior when tuning iteratively; it’s really nice to have the visual of the up and down arrows to see the improvements as you alter the code and re-execute.
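The buffer pool cleanup code itself isn’t shown above; in a test environment it is typically something along these lines (never run this against a production server):

-- Test environments only: flush dirty pages to disk, then drop clean pages
-- so the next trial starts from a cold buffer cache
CHECKPOINT;
DBCC DROPCLEANBUFFERS;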

Conclusion

Just as is the case with most tasks, iterative CPU tuning in SQL Server can be accomplished by using more than one tool. The key is finding the option that works best for you or for the situation you find yourself in at the time.
