Isolating Data Transformation Services (DTS) problems can be difficult when you don't know what occurred during a DTS package's execution. Fortunately, DTS includes built-in package-logging capabilities that provide a valuable source of information for troubleshooting and monitoring package execution. With SQL Server 2000, DTS provides two methods for capturing log information. The first method—and the only method that SQL Server 7.0 supports—logs errors and execution information to a text file. The second method, introduced in SQL Server 2000, logs errors to a database. Both methods provide essentially the same runtime information. However, as their descriptions indicate, their output format differs and, consequently, so does how you manage them. In this article, I explore the text file-based method for capturing DTS package log information, which applies to both SQL Server 2000 and 7.0. Let's look at the information that the DTS package log contains, examine the characteristics of the logging process, then walk through a method for managing text file-based DTS logs that uses VBScript code with a scheduled SQL Server Agent job.
Gathering Essential Information
To enable text file-based package logging, you must provide an Error file filename on the Logging tab in the DTS Package Properties dialog box, which Figure 1 shows. The name Error file is misleading because text file-based logging captures package logging information even when no error has occurred.
The DTS package log provides summary information about package execution as well as information about each step in the package. Figure 2 shows a sample DTS package log. The log's first section provides summary execution information about the entire package. Of the available metrics that the log captures, the ones you'll refer to most frequently include Executed On, Executed By, and several metrics related to execution time. The Executed On entry identifies the physical machine on which the package executed. Knowing where the package executed is crucial when you're trying to troubleshoot portability-related runtime errors, especially when you're moving a package from one server to another. (For more information about DTS package portability, see "DTS on the Move," June 2003, InstantDoc ID 38658.) The next entry, Executed By, identifies which user or process initiated the package. This information is important when you're diagnosing security-related failures because, as "DTS on the Move" explains, DTS packages assume the security context of the user who initiates the package.
The next few entries provide metrics about the package's start time, completion time, and total runtime. These metrics let you capture performance information about the package. Over time, you can use these performance metrics to create a baseline for measuring the effects of changes in data volumes or processing rules.
The remainder of the log provides execution information about each of the package's steps, including the completion status; the start time, finish time, and runtime in seconds for each step; and the progress count. Of these, completion status and execution times are usually the most valuable metrics for isolating package-execution problems. Completion status lets you quickly identify which steps succeeded, failed, or weren't executed at all, whereas the execution times provide a way to identify performance bottlenecks. The progress count tracks the number of records imported for tasks that process data (e.g., the Transform Data Task).
Managing the Log
You need to be aware of some quirks when you use DTS package logs. First, DTS doesn't generate a package log until all the steps in the package are complete. Second, if the log file doesn't exist for a package, DTS creates a new one; but if a log file already exists, DTS appends the new data to the existing file for the package.
These quirks pose two logistical problems. First, because DTS doesn't generate log files until all steps in the package are complete, managing DTS logs from within the package is difficult. Second, because DTS appends data to existing log files, the files can become large, resulting in possible disk-space problems and, more importantly, usability problems. For example, as the file size grows, locating information becomes harder. To get around these problems, I devised a method for managing the log files.
The log-management plan I describe uses one scheduled SQL Server Agent job to execute the VBScript code that Listing 1 shows. The script first consolidates all files from the log-file directory into one master log file in the merge-file directory. Then, the script cleans up the master log file by deleting the old log files and sends an email notification signaling that it has successfully completed. Let's look at each section of the script in more detail.
At callout A in Listing 1, the code declares the local variables and object variables that the script references later. The code at callout B creates the master log file and opens it for appending. To make the master log filename unique, the script concatenates the current system date and time to the filename. By using a unique filename, you can run the log-management script multiple times throughout the day without overwriting any existing files. Note that you need to customize the Universal Naming Convention (UNC) directory reference \\SERVERNAME\SHARE, which is stored in both the dirLogFileDir and dirMergeDir variables, to fit your environment.
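Because Listing 1 isn't reproduced here, the following sketch illustrates the kind of code callout B describes. The variable and filename choices (objFSO, objMasterLog, the DTSMergeLogs subdirectory) are illustrative assumptions, not taken from Listing 1.

```vbscript
' Illustrative sketch of the callout B logic: create a uniquely named
' master log file and open it for appending. Names and paths are
' hypothetical; customize the UNC path for your environment.
Dim objFSO, objMasterLog, dirMergeDir, strFileName
dirMergeDir = "\\SERVERNAME\SHARE\DTSMergeLogs\"

Set objFSO = CreateObject("Scripting.FileSystemObject")

' Make the filename unique by concatenating the current date and time,
' replacing characters that aren't valid in filenames.
strFileName = "DTSMasterLog_" & _
    Replace(Replace(Replace(Now, "/", "-"), ":", "-"), " ", "_") & ".log"

' 8 = ForAppending; True = create the file if it doesn't exist.
Set objMasterLog = objFSO.OpenTextFile(dirMergeDir & strFileName, 8, True)
```

Because each run produces a distinct filename, running the script several times in one day never overwrites an earlier master log.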
At callout C, the script iterates through each file in the package log file directory, reading any available files and appending them to the master log file. If no package log files exist, the script creates a master log file with one entry stating that no package log files were found. After appending the package log files to the master file, the script deletes the package log files. So the next time the package is executed, DTS will automatically create a new log file.
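A hedged sketch of the callout C logic might look like the following; it assumes the objFSO and objMasterLog objects shown earlier and an illustrative log-file directory path.

```vbscript
' Illustrative sketch of the callout C logic: merge each package log
' into the master file, then delete it so DTS starts a fresh log.
Dim objLogFolder, objFile, objStream, blnFound
blnFound = False

Set objLogFolder = objFSO.GetFolder("\\SERVERNAME\SHARE\DTSLogs\")

For Each objFile In objLogFolder.Files
    blnFound = True
    ' 1 = ForReading; append the package log's contents to the master file.
    Set objStream = objFile.OpenAsTextStream(1)
    If Not objStream.AtEndOfStream Then objMasterLog.Write objStream.ReadAll
    objStream.Close
    ' Delete the package log; DTS re-creates it on the next execution.
    objFile.Delete
Next

' If the directory was empty, record that fact in the master log.
If Not blnFound Then objMasterLog.WriteLine "No package log files were found."
```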
At callout D, the log-management script performs some file cleanup by deleting any master log files that are older than 7 days. This cleanup process minimizes the number of master log files and eliminates the need to manually delete old master logs. At callout E, the code uses the CDONTS API to send an email message notifying you that the merge script completed successfully. CDONTS is the object model that the Microsoft IIS SMTP service uses to send mail; it's automatically installed on Windows 2000 Servers running IIS 5.0 and the SMTP service. In addition to customizing the UNC path for the log files, you also need to customize the email addresses stored in the objMail object variable to match your environment.
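The cleanup and notification steps at callouts D and E could be sketched as follows. The addresses and directory path are placeholders you'd replace, and the object names are assumptions rather than the names Listing 1 uses.

```vbscript
' Illustrative sketch of the callout D and E logic. Assumes the objFSO
' object shown earlier; paths and addresses are placeholders.
Dim objMergeFolder, objOldFile, objMail

' Callout D: delete master log files that are older than 7 days.
Set objMergeFolder = objFSO.GetFolder("\\SERVERNAME\SHARE\DTSMergeLogs\")
For Each objOldFile In objMergeFolder.Files
    If DateDiff("d", objOldFile.DateCreated, Now) > 7 Then objOldFile.Delete
Next

' Callout E: send a completion notice through the IIS SMTP service
' by using the CDONTS NewMail object.
Set objMail = CreateObject("CDONTS.NewMail")
objMail.From = "dts@yourdomain.com"   ' customize for your environment
objMail.To = "dba@yourdomain.com"     ' customize for your environment
objMail.Subject = "DTS log merge completed"
objMail.Body = "The DTS log-management script completed successfully."
objMail.Send
```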
Finally, at callout F, the code destroys all object variables that the script created. It's important to remember that when you're using an ActiveX script within a SQL Server Agent job, you must explicitly destroy all object variables by setting their values to Nothing; otherwise, you can't save the job.
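Continuing the sketch above, the cleanup at callout F amounts to a Set ... = Nothing statement for each object the script created (the object names here match the earlier illustrative sketches, not Listing 1):

```vbscript
' Destroy every object variable the script created; a SQL Server Agent
' ActiveX script step can't be saved unless each object is released.
Set objMail = Nothing
Set objMasterLog = Nothing
Set objFSO = Nothing
```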
After you customize the log-management script to fit your environment, the last step is to create the SQL Server Agent job to execute the script. By coordinating the log-management script's schedule with the schedule of the DTS packages that are creating the logs, you can execute the packages multiple times throughout the day but only generate one log file per day. In addition, you can configure the SQL Server Agent job to send an email notification if the job fails. This notification, combined with the email notification built into the log-management script, provides you with an excellent method for monitoring the script's execution.
DTS log files can provide crucial information when you're troubleshooting DTS package errors and performance problems. I hope this article encourages you to use DTS package logging in your troubleshooting efforts and gives you some ideas for managing the size and number of your log files.