SQL Server Magazine UPDATE—brought to you by SQL Server Magazine
THIS ISSUE SPONSORED BY
Living Doc Documentation Solutions
Download a FREE Patch Management Tool from Ecora
SPONSOR: PREMIER BI&DW EVENT: REGISTER NOW & SAVE
TDWI World Conference, New Orleans, 2/9/03 - 2/14/03. Intensive, unbiased training for business intelligence and data warehousing professionals at every level. The complete brochure is available now for download; visit:
December 12, 2002—In this issue:
1. COMMENTARY
- Add Value with Data Mining
2. SQL SERVER NEWS AND VIEWS
- First Yukon Beta Set for First Quarter 2003
- Results of Previous Instant Poll: SSL Encryption
- New Instant Poll: Debugging T-SQL
3. READER CHALLENGE
- December Reader Challenge Winners and January Challenge
4. ANNOUNCEMENTS
- SSMU E-Learning Early Bird Discount and Freebie!
- Learn About SQL Server 2000 Memory Management
5. RESOURCES
- What's New in SQL Server Magazine: Points of (Indexed) Views
- Hot Thread: Delaying DTS Workflow
- Tip: Extracting Data from Multiple Tables
6. HOT RELEASE (ADVERTISEMENT)
- Free SQL Tool from NetIQ
7. NEW AND IMPROVED
- Correct Security Vulnerabilities
- Restore SQL Server Databases
8. CONTACT US
See this section for a list of ways to contact us.
1. COMMENTARY: ADD VALUE WITH DATA MINING
(contributed by Brian Moran, news editor, [email protected])
Have you seriously considered what data mining could do for your company? I've read SQL Server Books Online (BOL), walked through a few demos, and perused some Web sites about data mining, but I've never built an end-to-end data-mining solution. If you're like me, data mining sounds cool, but you're not exactly sure what it is—let alone what to do with it.
I think of data mining as one of two types of data analysis under the umbrella of business intelligence (BI). The most common type of BI today is online analytical processing (OLAP), which is a presentation and data-aggregation technology that lets you visualize and interact with your data in ways that you can't in traditional SQL reporting environments. An effective OLAP tool connected to a world-class OLAP cube lets you browse your data, drill down and around in flexible ways, and ask questions about what the data means. But an OLAP tool doesn't automatically find the valuable but hidden highlights in your data. You still need to know what you're looking for.
To me, the Holy Grail of data mining is the ability to discover information and patterns you didn't know existed in your data so that you can make better business decisions. In data mining, you set pattern-seeking algorithms loose on your data, and the algorithms do the work, bringing to light interesting and important relationships in your data. Once you know about those relationships, an OLAP tool can help you analyze them.
Sounds great! Unfortunately, data-mining technology in today's market tends to be too hard to use and too expensive for most companies. Most serious data-mining environments require users to have a firm foundation in advanced statistical techniques just to make heads or tails of the results. However, Microsoft has taken a different approach to data mining, simplifying the process and making it affordable for the masses.
With SQL Server 7.0 OLAP Services, Microsoft became the first top database vendor to include OLAP technology in the database system itself at no additional cost. In many ways, this approach to delivering data-analysis tools has been responsible for the growth in OLAP awareness over the past few years. Whether or not you use Microsoft OLAP tools, the availability of free OLAP functionality in SQL Server has motivated other vendors to offer more-manageable, more cost-competitive OLAP solutions. With SQL Server 2000, Microsoft added a set of integrated data-mining technologies. Although I don't know many people who are using this technology in production systems today, the release of data-mining technology in SQL Server 2000 Analysis Services was a watershed event—the first time a major vendor made a concerted effort to bring data mining to the masses.
SQL Server's data-mining technology still needs to be easier to use and more functional. I can't share details about new data-mining functionality planned for the Yukon release of SQL Server, but suffice it to say, the SQL Server team is working on some practical and cool enhancements. And a new project I'm working on will give me the chance to dive into real-world data-mining issues and to share those experiences with you.
In the meantime, I encourage you to start soaking up any available information about data mining. This technology won't change the nature of your business overnight, but it will have a profound effect on the way you interact with your data in the future. A great place to start is the 149-page guide "Preparing and Mining Data with Microsoft SQL Server 2000 and Analysis Services," available at http://msdn.microsoft.com/library/default.asp?url=/servers/books/sqlserver/mining.asp. This guide also offers a 35MB download of sample code that you can dig into. SQL Server's in-the-box data-mining functions provide some unique opportunities for database professionals to add significant value to customers and employers—and to add value to their own careers by being among the first people in the market to understand how to apply data mining to a wide range of needs.
LIVING DOC DOCUMENTATION SOLUTIONS
Create all your technical SQL/ASP/.NET documentation automatically with LIVING DOC from Living Address. Save time, money, and energy by skipping the painful manual documentation process with this market-leading solution. Direct the Living Doc software to the desired location, and it will scan a project while running in the background. Living Doc automates the entire documentation process while showing dependencies within .NET, ASP, and SQL projects. It gives development teams the ability to compare 'snapshots,' or documentation scans, to see what has been added, updated, or deleted since the last scan.
2. SQL SERVER NEWS AND VIEWS
(contributed by Paul Thurrott, [email protected])
FIRST YUKON BETA SET FOR FIRST QUARTER 2003
Microsoft is prepping the first beta release of SQL Server 2003 (code-named Yukon) for release in February or March and will use feedback from the beta to determine the final release schedule. The long-awaited Yukon code base will usher in a new era on Microsoft's software-development road map; the company will incorporate the software into other projects, including the Windows Future Storage (WinFS) file system in the next Windows release (code-named Longhorn) and the data store for a future Microsoft Exchange Server release (code-named Kodiak). Microsoft will also issue the next major release of Visual Studio .NET to coincide with Yukon; a minor release (code-named Everett) will ship in early 2003 to coincide with Windows .NET Server (Win.NET Server) 2003.
Microsoft will deliver Yukon beta 1 to only 1,500 testers, although a future beta release will be more widely available. Alpha code is being used in-house and with select close partners, the company said. A key advance in this release is the ability to code stored procedures in any Visual Studio .NET-compatible programming language, thanks to full compatibility with the Microsoft .NET Common Language Runtime (CLR). This feature will make SQL Server more accessible and useful for developers.
Before Microsoft finalizes Yukon, however, the company will ship the final version of SQL Server 2000 64-bit Edition (code-named Liberty). The company will finalize this project in time for the April 2003 release of Win.NET Server, which includes various 64-bit editions.
RESULTS OF PREVIOUS INSTANT POLL: SSL ENCRYPTION
The voting has closed in SQL Server Magazine's nonscientific Instant Poll for the question, "Are you using Secure Sockets Layer (SSL) encryption to protect your data?" Here are the results (+/- 1 percent) from the 221 votes:
- 28% Yes
- 25% No, but I plan to
- 47% No, and I don't plan to
NEW INSTANT POLL: DEBUGGING T-SQL
The next Instant Poll question is "What's your primary method of debugging T-SQL code?" Go to the SQL Server Magazine Web site and submit your vote for 1) Using the Query Analyzer 2000 debugger, 2) Using the Visual Studio T-SQL debugger, 3) Using SQL Server Profiler, 4) Using PRINT statements embedded in the code, or 5) By analyzing the results and rereading the code.
SPONSOR: DOWNLOAD A FREE PATCH MANAGEMENT TOOL FROM ECORA
Looking for a free tool to protect your business-critical IT infrastructure from security vulnerabilities that could result in downtime and lost revenue? Ecora's FREE PatchMeister tool automatically discovers and analyzes security patches on your servers and workstations through the most intuitive user interface on the market. PatchMeister operates from an administrative workstation, scanning multiple systems at once and providing sortable views for easy data manipulation. Download this FREE tool today.
3. READER CHALLENGE
(contributed by SQL Server MVP Umachandar Jayachandran, [email protected])
DECEMBER READER CHALLENGE WINNERS AND JANUARY CHALLENGE
Congratulations to Corey Tracey, senior developer for the SHIPER Project at Sutter Health in California, and Emmanuel Nanchen, IT analyst and programmer at Manpower HR SA in Geneva, Switzerland. Corey won first prize of $100 for the best solution to the December Reader Challenge, "Stalking the Statements." Emmanuel won second prize of $50. You can find a recap of the problem and the solution to the December Reader Challenge at http://www.sqlmag.com/articles/index.cfm?articleid=27258.
Now, test your SQL Server savvy in the January Reader Challenge, "Corrupted Characters" (below). Submit your solution in an email message to [email protected] by December 19. SQL Server MVP Umachandar Jayachandran, a SQL Server Magazine technical editor, will evaluate the responses. We'll announce the winners in an upcoming SQL Server Magazine UPDATE. The first-place winner will receive $100, and the second-place winner will receive $50.
Here's the challenge: Cindy is the systems architect for a company that specializes in data warehousing technology. She needs to automate the execution of several T-SQL script files that are in Unicode format. These scripts populate tables in the warehouse. For example, for reporting purposes, Cindy needs to populate a Country dimension table with data based on ISO 3166 codes and localization information. The following shows a sample script file:
CREATE TABLE #t (
   c VARCHAR(30) NOT NULL
)
INSERT INTO #t VALUES( 'Tëst' )
SELECT c FROM #t
DROP TABLE #t
Cindy is automating the execution of the scripts against several SQL Server 2000 databases by using the OSQL command-line utility. But when she uses OSQL, she notices that the extended characters in the scripts are corrupted upon import. Extended characters, such as letters with a German umlaut (ä), aren't in the standard ASCII character set. Help Cindy identify what's causing the problem and provide her with an efficient solution for executing the scripts successfully, without data corruption or loss.
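For context, here is a hedged sketch of the kind of Country dimension table such a script might populate; the table and column names are invented for this illustration, not taken from the challenge. Localized names are exactly where extended characters appear, which is why the scripts are stored in Unicode format in the first place:

```sql
-- Illustrative only; not the challenge's actual schema.
CREATE TABLE CountryDim (
   CountryCode CHAR(2)       NOT NULL PRIMARY KEY,  -- ISO 3166 alpha-2 code
   CountryName NVARCHAR(100) NOT NULL               -- localized name (Unicode)
)
INSERT INTO CountryDim VALUES( 'CH', N'Schweiz' )
INSERT INTO CountryDim VALUES( 'ES', N'España' )
```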
4. ANNOUNCEMENTS
(brought to you by SQL Server Magazine and its partners)
SSMU E-LEARNING EARLY BIRD DISCOUNT AND FREEBIE!
Register for SQL Server Magazine University's Microsoft Certified Training Course #2072 or #2073 by December 31, and get $50 off. Plus, the first 25 registrants to any 2003 SSMU course get a free Transcender exam! For course details and registration information, mention Priority Code EI322LEB at
LEARN ABOUT SQL SERVER 2000 MEMORY MANAGEMENT
Join Kalen Delaney on Friday, February 7, 2003, for a 1-hour Web seminar, "SQL Server 2000 Data Storage: Memory Management," brought to you by SQL Server Magazine University and Solid Quality Learning. No need to leave your desk; set your browser and dial the phone! Click here for details and to register.
5. RESOURCES
WHAT'S NEW IN SQL SERVER MAGAZINE: POINTS OF (INDEXED) VIEWS
Devising how to process a query is the job of the SQL Server query optimizer. In an ideal world, given two different queries that perform the same task, the optimizer would generate the same execution plan—the optimal plan—for both. However, in many situations, the way you write a query can still dramatically affect the query's performance. Indexed views can significantly improve query performance, especially when you're aggregating data. In some situations, after creating and indexing views, you don't even need to change your original queries that refer to the base tables because the optimizer is smart enough to use those indexes. In "Points of (Indexed) Views," which appears in the December issue of SQL Server Magazine, Itzik Ben-Gan looks at some problems in which using indexed views yields improved performance. This article is available online at
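As a quick sketch of the technique (the table and column names here are invented for illustration, not taken from the article), an indexed view in SQL Server 2000 is a view created WITH SCHEMABINDING and then materialized by a unique clustered index; note that COUNT_BIG(*) is required when the view uses GROUP BY:

```sql
-- Hypothetical base table dbo.Sales(ProductID, Amount); names are illustrative only.
CREATE VIEW dbo.SalesByProduct
WITH SCHEMABINDING
AS
SELECT ProductID,
       SUM(Amount)  AS TotalAmount,
       COUNT_BIG(*) AS SalesCount   -- required in an indexed view with GROUP BY
FROM dbo.Sales
GROUP BY ProductID
GO

-- The unique clustered index materializes and persists the aggregated rows.
CREATE UNIQUE CLUSTERED INDEX IX_SalesByProduct
ON dbo.SalesByProduct (ProductID)
GO
```

On Enterprise Edition, the optimizer can then satisfy an aggregate query against dbo.Sales from this index without touching the base table, even if the query never mentions the view by name.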
HOT THREAD: DELAYING DTS WORKFLOW
Jimfm created a Data Transformation Services (DTS) package that exports data into an Excel spreadsheet, then emails the spreadsheet to him as an attachment. He uses workflow to create a table, open a connection, export the data into Excel, then email the spreadsheet. When he executes each task in the package one at a time in order, the process works perfectly. But if he runs the package as a whole, it fails on the email part, raising the error message "Error sending mail: MAPI Error: Could not open the attached file." The email seems to launch before the Excel file is finished. How can Jimfm delay the workflow so that the email is sent a few seconds after the Excel file is created? Offer your advice and read other users' suggestions on the SQL Server Magazine forums at the following URL:
TIP: EXTRACTING DATA FROM MULTIPLE TABLES
(contributed by Microsoft's SQL Server Development Team, [email protected])
Q. When I have to extract data from multiple tables, which approach is more efficient: creating one stored procedure that uses ADO's NextRecordset method to return all the result sets or creating a stored procedure for each result set?
A. The efficiency of these solutions depends on several factors. Creating one stored procedure that returns all the result sets is effective as long as you use NextRecordset (the ADO Recordset method call that fetches the next set of results) to process the result sets quickly. Failure to process the results quickly can lead to extended locking on the server.
Creating a stored procedure for each result set works well because it permits code reuse. You can let other programs access the data by using each stored procedure as a standalone API without having to worry about coding the other programs to process result sets they might not want. You can still obtain multiple recordsets by combining the calls to each smaller stored procedure in a wrapper stored procedure that contains no logic other than passing parameters to each procedure and error handling. This solution also lets you use smaller stored procedures that compile faster if the need for recompilation arises. Because of the high cost of compilation, we recommend this modular approach.
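A minimal sketch of that modular layout follows; the procedure, table, and column names are hypothetical, not from the tip itself:

```sql
-- Single-purpose procedures that other programs can reuse on their own.
CREATE PROCEDURE dbo.GetCustomer @CustomerID int AS
    SELECT CustomerID, Name FROM dbo.Customers WHERE CustomerID = @CustomerID
GO
CREATE PROCEDURE dbo.GetCustomerOrders @CustomerID int AS
    SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = @CustomerID
GO

-- Thin wrapper: no logic beyond parameter passing. An ADO caller that wants
-- both result sets steps through them with NextRecordset.
CREATE PROCEDURE dbo.GetCustomerDetail @CustomerID int AS
    EXEC dbo.GetCustomer @CustomerID
    EXEC dbo.GetCustomerOrders @CustomerID
GO
```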
Say you're building an application that's going to be used on a network with high network latency between the client and the server—for example, on a network with user sites connected over low bandwidth or multiple router hops. In that case, using the ADO Command object with bound parameters to call each stored procedure individually can be more efficient because binding parameters avoids the metadata chitchat that ADO engages in while it determines which data types are being returned from the server. We tend to use this technique when we're conducting performance benchmarks.
Send your technical questions to [email protected]
6. HOT RELEASE (ADVERTISEMENT)
Quickly and accurately identify and investigate specific SQL Server problems with NetIQ's diagnostic dashboard, SQLcheck. This FREE tool organizes and explains critical information about your database server hardware, its operating system and SQL Server. Download SQLcheck now!
7. NEW AND IMPROVED
(contributed by Carolyn Mader, [email protected])
CORRECT SECURITY VULNERABILITIES
Configuresoft announced Security Update Manager (SUM) 2.0, a module for Enterprise Configuration Manager (ECM) that lets you quickly correct security vulnerabilities on Windows servers and workstations. SUM identifies vulnerable machines and installs patches on target machines. You can also create custom templates to test a group of machines for vulnerabilities. You can create customized user roles for SQL Server, security, or Microsoft Exchange administrators to selectively control access to the product's functionality and configuration data. Pricing for SUM 2.0 starts at $25 per server and $5 per workstation. Pricing for ECM starts at $995 per server and $30 per workstation. SUM 2.0 supports SQL Server 7.0 and later. Contact Configuresoft at 719-447-4600.
RESTORE SQL SERVER DATABASES
Prentice Hall released "SQL Server Backup and Recovery: Tools and Techniques," a book by Frank McBath that details how to protect and restore SQL Server databases. The step-by-step reference covers SQL Server 2000 and gives guidance for Windows XP and Windows 2000 environments. The book contains scripts and sample code, explains how to plan for backup and recovery—including how to determine your time windows—and walks through recovery scenarios and backup strategies. The guide also teaches you how to recover the master database and how to properly use SQL Server's Database Consistency Checker (DBCC) commands. The 320-page book costs $49.99. Contact Prentice Hall at 800-282-0693.
8. CONTACT US
Here's how to reach us with your comments and questions:
- ABOUT THE COMMENTARY — [email protected]
- ABOUT THE NEWSLETTER IN GENERAL — [email protected]
(please mention the newsletter name in the subject line)
- TECHNICAL QUESTIONS — http://www.sqlmag.com/forums
- PRODUCT NEWS — [email protected]
- QUESTIONS ABOUT YOUR SQL SERVER MAGAZINE UPDATE SUBSCRIPTION?
Customer Support — [email protected]
- WANT TO SPONSOR SQL SERVER MAGAZINE UPDATE?
More than 102,000 people read SQL Server Magazine UPDATE every week. Shouldn't they read your marketing message, too? To advertise in SQL Server Magazine UPDATE, contact Beatrice Stonebanks at [email protected] or 800-719-8718.