SQL Server 2014's Analysis, Migrate, and Report Tool

Determine which tables and stored procedures would benefit from In-Memory OLTP

With SQL Server 2014's new In-Memory OLTP engine, you can load tables and stored procedures in memory, which provides very fast response times. The goal isn't to load all the database tables and stored procedures in memory, but rather just those tables that are crucial to performance and those stored procedures that have complex logical calculations.

Related: Rev Up Application Performance with the In-Memory OLTP Engine

To help you identify which tables and stored procedures will give you the best performance gain after being migrated to In-Memory OLTP, SQL Server 2014 provides the new Analysis, Migrate, and Report (AMR) tool. Built into SQL Server Management Studio (SSMS), the AMR tool consists of the:

  • Transaction performance collector (which collects data about existing tables and stored procedures in order to analyze workloads) and transaction performance analysis reports (which give recommendations about the tables and stored procedures to migrate to In-Memory OLTP based on the collected data)
  • Memory Optimization Advisor (which guides you through the process of migrating a table to a memory-optimized table)
  • Native Compilation Advisor (which helps you identify T-SQL elements that need to be changed before migrating a stored procedure to a natively compiled stored procedure)

The AMR tool leverages the new Transaction Performance Collection Sets to gather information about workloads and the Management Data Warehouse (a relational database) to store the collected data. The Transaction Performance Collection Sets include the:

  • Stored Procedure Usage Analysis collection set (which captures information about stored procedures for a future migration to natively compiled stored procedures)
  • Table Usage Analysis collection set (which captures information about disk-based tables for a future migration to memory-optimized tables)

Before you can use the AMR tool, you need to configure the Management Data Warehouse and the data collection process. After showing you how to do so, I'll demonstrate how to run the transaction performance analysis reports and how to use the two advisors.

Configuring the Management Data Warehouse

To configure the Management Data Warehouse, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Management Data Warehouse. This will launch the Configure Management Data Warehouse Wizard.

After the Welcome page, you'll find yourself on the Select Configuration Task page. On this page, select the option to configure a Management Data Warehouse.

On the Configure Management Data Warehouse Storage page, you need to specify the name of the database that will host the Management Data Warehouse and the name of the server on which that database resides. If the database doesn't exist yet, click the New button to create it.

On the Map Logins and Users page, you'll find the existing logins allowed for the server that will host the Management Data Warehouse. If needed, you can edit the logins or map users to the administrator, reader, and writer roles for the Management Data Warehouse.

On the Complete the Wizard page, you need to verify the Management Data Warehouse configuration. If it's OK, click Finish. When the configuration of the Management Data Warehouse has successfully completed, you should see a page like that in Figure 1.

Figure 1: Verifying Configuration of the Management Data Warehouse

The Management Data Warehouse setup is now finished.

Configuring the Data Collection Process

To configure the data collection process, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Data Collection. This will launch the Configure Data Collection Wizard.

After the Welcome page, you'll find the Setup Data Collection Sets page shown in Figure 2. Besides needing to specify the server and database that will host the Management Data Warehouse, you need to specify the data collector sets. In the list of collection sets, select the Transaction Performance Collection Sets check box so that the data collector will collect statistics for transaction performance issues.

Figure 2: Specifying the Data Collector Sets

If the Management Data Warehouse is located on a different SQL Server instance from the data collector and if SQL Server Agent isn't running under a domain account that has dc_admin permissions on the remote instance, you have to use a SQL Server Agent proxy. If that's the case, be sure to select the Use a SQL Server Agent proxy for remote uploads check box.

Once you're done configuring the Setup Data Collection Sets page, click Finish. When the wizard completes the configuration, you'll have an enabled data collection process that will collect information about all user databases. Note that SQL Server Agent must be started on the instance that will collect the data.
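
If you'd rather script this last step (or verify it afterward), the wizard relies on the data collector's stored procedures in msdb. The following is a minimal sketch that enables the collector and starts the two collection sets. The collection set names are assumed to match what the wizard installs on your instance, so check them in msdb.dbo.syscollector_collection_sets first.

    USE msdb;
    GO
    -- Enable the data collector on the instance (harmless if already enabled)
    EXEC dbo.sp_syscollector_enable_collector;
    GO
    -- Look up and start each transaction performance collection set by name
    DECLARE @id INT;

    SELECT @id = collection_set_id
    FROM dbo.syscollector_collection_sets
    WHERE name = N'Stored Procedure Usage Analysis';
    EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = @id;

    SELECT @id = collection_set_id
    FROM dbo.syscollector_collection_sets
    WHERE name = N'Table Usage Analysis';
    EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = @id;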

In the SQL Server Agent's Jobs folder, you'll see the jobs used to collect data from your workloads and the jobs used to upload the collected data into the Management Data Warehouse. The data collection jobs use the naming convention collection_set_N_collection and the upload jobs use collection_set_N_upload, where N is a number.

By default, the AMR tool collects data from three dynamic management views every 15 minutes for both the Stored Procedure Usage Analysis and Table Usage Analysis collection sets. The upload job runs every 30 minutes for the Stored Procedure Usage Analysis collection set and every 15 minutes for the Table Usage Analysis collection set. If you want to speed up an upload, you can run these jobs manually, as shown below. Uploading the data has a minimal impact on performance.
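
For example, you can list the collector jobs and kick off an upload ahead of schedule with sp_start_job. The job name below is illustrative; the value of N varies from instance to instance, so list the jobs first.

    USE msdb;
    GO
    -- List the data collector jobs and whether they're enabled
    SELECT name, enabled
    FROM dbo.sysjobs
    WHERE name LIKE N'collection_set_%'
    ORDER BY name;
    GO
    -- Run an upload job immediately (substitute a job name returned above)
    EXEC dbo.sp_start_job @job_name = N'collection_set_5_upload';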

Running the Transaction Performance Analysis Reports

To see the recommendations based on the information collected about all your user databases on the workload server, you need to run the transaction performance analysis reports. To access them, right-click your Management Data Warehouse database, select Reports, choose Management Data Warehouse, and click Transaction Performance Analysis. From the Transaction Performance Analysis Overview page, you can run one of three reports, depending on the type of information you need:

  • Recommended Tables Based on Usage
  • Recommended Tables Based on Contention
  • Recommended Stored Procedures Based on Usage

Recommended Tables Based on Usage. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their usage. Figure 3 shows a sample report. On the left side, you can select the database and how many tables you'd like to see from that database. The chart will then show the selected tables. The horizontal axis represents the amount of work needed to migrate a table to In-Memory OLTP. The vertical axis represents the gains you'll achieve after migrating the table. The best candidates for In-Memory OLTP are located in the top right corner. As you can see, they can be easily migrated and will give you the best performance gain.

Figure 3: Determining Which Tables Are the Best Candidates for Migration Based on Usage

You can access a detailed report for a table by clicking its name in the chart. As Figure 4 shows, this report provides the table's access statistics (e.g., lookups, range scans) and contention statistics (e.g., latches, locks), as well as when this information was captured.

Figure 4: Reviewing the Detailed Performance Statistics for a Table

Recommended Tables Based on Contention. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their contention. If you compare the contention analysis report in Figure 5 with the usage analysis report in Figure 3, you'll see that they're very similar.

Figure 5: Determining Which Tables Are the Best Candidates for Migration Based on Contention

You can select the database and how many tables you'd like to see from that database. The resulting chart shows the amount of work needed to migrate the tables (horizontal axis) and the gains you'll achieve after migrating them (vertical axis). In the top right corner, you'll find the best candidates for migration based on contention. You can click a table name in the chart to access a detailed report showing the table's access and contention statistics.

Recommended Stored Procedures Based on Usage. This report shows you which stored procedures are the top candidates for an In-Memory OLTP migration based on their usage (i.e., total CPU time). After selecting the database and how many stored procedures you'd like to see from that database, the resulting chart shows the top candidates for migration, as Figure 6 shows.

Figure 6: Seeing Which Stored Procedures Are the Top Candidates for Migration Based on Usage

If you want to see the detailed usage statistics for a specific stored procedure, you can click its blue bar. Figure 7 shows an example of the report you'll receive.

Figure 7: Reviewing the Detailed Usage Statistics for a Stored Procedure
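
Under the covers, this usage data originates in dynamic management views such as sys.dm_exec_procedure_stats. The following query is only a rough approximation of what the report ranks, not the report's actual logic, but it's a quick way to eyeball the top CPU consumers yourself:

    -- Top 10 cached stored procedures by total CPU time (in microseconds)
    SELECT TOP (10)
           DB_NAME(database_id) AS database_name,
           OBJECT_NAME(object_id, database_id) AS procedure_name,
           execution_count,
           total_worker_time AS total_cpu_time_us
    FROM sys.dm_exec_procedure_stats
    WHERE database_id > 4  -- skip the system databases
    ORDER BY total_worker_time DESC;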

Using the Memory Optimization Advisor

After you know which tables you want to migrate to In-Memory OLTP, you can use the AMR tool's Memory Optimization Advisor to help you with the migration process. To access this advisor, open Object Explorer in SSMS and navigate to the table you want to migrate. Right-click the table and choose Memory Optimization Advisor.

The advisor will launch with the Introduction page, which you can read or skip. Clicking Next brings you to the Migration Optimization Checklist page, where the advisor will check to see if your table can be migrated. If one or more validation items fail, the migration process will stop. If needed, you can generate a report for this analysis. If all you see are green checkmarks, your table doesn't have any features that could prevent the migration process, in which case you can proceed to the next page.

On the Migration Optimization Warnings page, you'll find important information about what isn't supported in memory-optimized tables and other types of issues. The issues listed won't prevent the table from being migrated, but they might cause other objects to fail or behave in an unexpected manner.

If a warning applies to the table you selected for migration, an exclamation point in a yellow triangle will appear next to the warning, as shown in Figure 8.

Figure 8: Reviewing the Migration Optimization Warnings

In this case, the selected table has an unsupported French_CI_AS collation on the indexed column named Person_OnDisk_Name. (Only BIN2 collations are supported for indexes in memory-optimized tables.) Thus, the index collation will need to be changed later in the migration process.

Figure 9: Reviewing the Optimization Options

On the Review Optimization Options page, which Figure 9 shows, you have the option to change the defaults listed for the:

  • Name of the memory-optimized filegroup (only one memory-optimized filegroup is allowed per database)
  • Logical filename
  • Path where the logical file will be saved
  • New name given to the original table (the original table is renamed to prevent naming conflicts)

You can also choose to copy data from the original table to the new memory-optimized table during the migration process, and you can change the durability of the memory-optimized table. By default, its DURABILITY option will be set to SCHEMA_AND_DATA, but you can change it to SCHEMA_ONLY by selecting the Check this box to migrate this table to a memory-optimized table with no data durability option. If you do so, the data will be lost after the SQL Server service is restarted. In other words, only the table's schema is persistent.

Finally, the Review Optimization Options page shows the estimated current memory cost for the memory-optimized table. If there isn't sufficient memory, the migration process might fail.
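
For reference, the filegroup options on this page correspond to DDL along the following lines. This is a sketch only; the database, filegroup, and file names (and the path) are illustrative, and the wizard generates the real statements for you.

    -- Add the single memory-optimized filegroup allowed per database,
    -- plus a container in which the checkpoint file pairs will live
    ALTER DATABASE MyDatabase
        ADD FILEGROUP MyDatabase_mod CONTAINS MEMORY_OPTIMIZED_DATA;

    ALTER DATABASE MyDatabase
        ADD FILE (NAME = N'MyDatabase_mod_file',
                  FILENAME = N'C:\Data\MyDatabase_mod')
        TO FILEGROUP MyDatabase_mod;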

Once you're done with the Review Optimization Options page, you can click Next to go to the Review Primary Key Conversion page. When the migration process begins, it will start by converting the primary key. You can convert it to:

  • A nonclustered hash index, which gives the best performance for point lookups. If you select this option, you also need to specify the bucket count, which should be twice the expected number of rows.
  • A nonclustered index, which gives the best performance for range predicates.

For each index you have in the table being migrated, you'll be presented with a Review Index Conversion page that has been populated with the columns and data types for that index. The options you can configure in the Review Index Conversion page are similar to those in the Review Primary Key Conversion page. In this case, for the indexed column Person_OnDisk_Name with the unsupported French_CI_AS collation, you'd have to select a BIN2 collation for the char data type.
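
Putting these pages together, the table definition the advisor ends up generating looks roughly like the following sketch. The table and index names, column types, and bucket count are illustrative; French_BIN2 stands in for the unsupported French_CI_AS collation.

    CREATE TABLE dbo.Person_InMemory
    (
        -- Hash primary key; the bucket count follows the twice-the-expected-rows
        -- guidance above (here, roughly 1 million rows expected)
        Person_OnDisk_Id INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2000000),
        -- Indexed character columns need a BIN2 collation
        Person_OnDisk_Name VARCHAR(50) COLLATE French_BIN2 NOT NULL
            INDEX ix_Person_OnDisk_Name NONCLUSTERED
    )
    WITH (MEMORY_OPTIMIZED = ON,
          DURABILITY = SCHEMA_AND_DATA); -- SCHEMA_ONLY would persist just the schema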

On the Verify Migration Actions page, you'll see all operations that will be performed to migrate your table to In-Memory OLTP. You have the option to script those operations by clicking the Script button. After verifying all the options, you can click the Migrate button to start the migration process.

Figure 10 shows how the new memory-optimized table appears in SSMS. If you view its properties, you'll see that the Memory optimized property is set to True and that the schema and data are durable for this table.

Figure 10: New Memory-Optimized Table in SSMS

In Figure 10, you can also see how the original table has been renamed.

Using the Native Compilation Advisor

After you know which stored procedures you want to migrate to In-Memory OLTP, you can use the AMR tool's Native Compilation Advisor to help you with their migration. To access this advisor, open Object Explorer in SSMS and navigate to the stored procedure you want to migrate. Right-click the stored procedure and choose Native Compilation Advisor.

After clicking through the Welcome page, you'll be presented with the Stored Procedure Validation page, which will give you warnings if your stored procedure contains some T-SQL elements that aren't supported by native compilation. If the stored procedure is valid, it can become a natively compiled stored procedure without modification. However, the Native Compilation Advisor doesn't migrate stored procedures like the Memory Optimization Advisor migrates tables. You have to do the migration on your own.

If the stored procedure has unsupported T-SQL elements, the validation will fail. To see the details about the unsupported elements, you need to click Next to go to the Stored Procedure Validation Result page, which Figure 11 shows.

Figure 11: Reviewing the Unsupported T-SQL Elements

You have to modify the unsupported elements before you can migrate your stored procedure to a natively compiled stored procedure.
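
Once the unsupported elements are gone, the migration itself boils down to recreating the procedure with the native compilation options. Here's a minimal sketch, assuming the memory-optimized table from the earlier example; the procedure name and body are illustrative.

    CREATE PROCEDURE dbo.usp_GetPersonName
        @PersonId INT
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
        -- The body of a natively compiled procedure is a single atomic block
        SELECT Person_OnDisk_Name
        FROM dbo.Person_InMemory
        WHERE Person_OnDisk_Id = @PersonId;
    END;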

Eliminate the Guesswork

The AMR tool is useful because it eliminates the guesswork in determining which tables and stored procedures would benefit from In-Memory OLTP. After identifying which tables to migrate, you can use the Memory Optimization Advisor to quickly migrate them. Although the Native Compilation Advisor can help you identify the T-SQL elements you need to change before migrating your stored procedure to a natively compiled one, it unfortunately doesn't guide you through the migration process.

Stéphane Savorgnano is a consultant at dbi Services in Switzerland. He's an MCSA for SQL Server 2012 and an MCITP for SQL Server 2008. He has more than 15 years of experience in Microsoft software development and in SQL Server database solutions. He holds a master's degree in Informatics from Mulhouse University.
