SQL Server Magazine UPDATE—brought to you by SQL Server Magazine
http://www.sqlmag.com and SQL Server Magazine Connections
THIS ISSUE SPONSORED BY
SQL Server Magazine Connections-Save $200
TDWI - The Data Warehousing Institute
(below NEWS AND VIEWS)
SPONSOR: EBOOK: OPTIMIZE SQL SERVER PERFORMANCE
The free ebook "The Definitive Guide to SQL Server Performance Optimization" is now available for download. This ebook describes, in great detail, SQL Server concepts and techniques that can be leveraged by SQL Server DBAs and application developers to optimize response time. Whether your challenge is index usage, SQL statement tuning, lock resolution or simply understanding the dynamics of a SQL Server database ... this free reference book provides the answers.
April 3, 2003—In this issue:
1. COMMENTARY
- The Devil's in the DeWitt Clause
2. SQL SERVER NEWS AND VIEWS
- Article Explains Workaround for Changing File Growth Value
- Report: Most Users Don't Trust Microsoft
- Results of Previous Instant Poll: SQL Server Experience
- New Instant Poll: DeWitt Clauses
3. ANNOUNCEMENTS
- Catch Up on SQL Server Web Seminars You've Missed
- Check Out the Database Performance Portal Today!
4. RESOURCES
- What's New in SQL Server Magazine: The 64-Bit Question
- Hot Thread: A Workable Backup Scheme
- Tip: Getting Transaction Information from DBCC OPENTRAN
5. HOT RELEASES (ADVERTISEMENTS)
- Storage World Conference 2003, May 5-7, Anaheim
- SQL Server Magazine University e-Learning Center
- Get the Best SQL Server Resource Available
6. NEW AND IMPROVED
- Extend Analysis Services Functionality
- Monitor Your Database
7. CONTACT US
- See this section for a list of ways to contact us.
1. COMMENTARY
THE DEVIL'S IN THE DEWITT CLAUSE
(contributed by Brian Moran, news editor, email@example.com)
In my commentary a few weeks ago ("The Truth About the TPC," InstantDoc ID 38348), I briefly mentioned the DeWitt clause that major database vendors such as Microsoft, Oracle, and Sybase include in their End User License Agreements (EULAs). DeWitt clauses forbid the publication of database benchmarks that the database vendor hasn't sanctioned. Here's the exact clause from the SQL Server EULA:
e. Benchmark Testing. You may not disclose the results of any benchmark test of either the Server Software or Client Software to any third party without Microsoft's prior written approval.
Not surprisingly, most readers who sent me comments about these restrictive clauses think that using them is wrong. After all, most of us believe that consumers should have free access to information about the products they use, even if that information isn't flattering to the product or its vendor. I agree. However, I'd like to play devil's advocate and examine the database vendors' position.
A reader from the vendor community sent me a message that sums up the pro-DeWitt argument: "The reason the DeWitt clause is in the EULA is to prevent the publishing and propagation of poor results that result from either intentional or accidental poor configuration of the database system. Requiring the vendor's agreement to publish a benchmark gives the vendor the opportunity to validate the hardware and software configurations to ensure that they are set up in a way that provides optimal performance of the product."
The reader makes a valid point. Two major problems could arise if DeWitt clauses were stricken from EULAs. The first and most obvious problem is that testers could publish inaccurate or misleading benchmarks. For example, tweaking a configuration parameter or adding or removing an index could have a dramatic effect on test results. It's not easy to be a tuning expert, but it's easy to make tuning mistakes. And a group with an agenda could make a subtle mistake on purpose to put one database in a more favorable light than another.
The second problem is that creating a useful database benchmark is hard. Because benchmarks simply measure the performance of a given workload (i.e., the application that's running), it's almost impossible to use the result of a benchmark to gauge the performance of a different workload. Too many factors are different, even when the applications seem similar. I spend a lot of my professional time conducting performance-tuning audits for clients. In nearly every case, the application—not the database—is what causes performance problems. Although I focus on SQL Server, I'm certain this observation holds true for other major database systems.
Because poor application design can easily change benchmark numbers, a database might be properly tuned for a given benchmark, but that doesn't mean that you can use the benchmark scores to make accurate capacity-planning decisions for your application. In addition, modern application-development environments and middleware have grown increasingly complex. A middleware-layer application might efficiently access data through one database engine but be inefficient in accessing data through another database engine. Benchmarks performed absent a well-defined methodology that accurately reflects database performance (instead of other factors) could degrade into application-design benchmarks, which don't serve consumers if taken out of context.
Database vendors have a vested interest in ensuring that customers don't receive misleading benchmark information. The DeWitt clause lets vendors ensure that misleading benchmark numbers based on poorly tuned or configured systems don't sully their product's image. In some ways, this kind of control also serves the best interests of the customer; the two benchmarking problems above would lead to benchmark results that could cause customers to make misinformed decisions. Is the DeWitt clause good for database vendors? Maybe. However, the question should be, What's best for the consumer? I'd argue that focusing on the consumer is ultimately in the best interest of the vendor anyway. Next week, I'll share reader comments that are anti-DeWitt and explain why I think the benefits of eliminating DeWitt clauses would more than outweigh the drawbacks.
SPONSOR: SQL SERVER MAGAZINE CONNECTIONS-SAVE $200
Register today for SQL Server Magazine Connections, and you'll save $200—plus you'll get FREE access to concurrently run Microsoft ASP.NET Connections and Visual Studio Connections. Attendees will also get a chance to win a brand-new Harley-Davidson motorcycle. Don't miss this exclusive opportunity to learn from and interact with your favorite SQL Server Magazine writers and Microsoft product architects. After hours, unwind at events like "Microsoft Unplugged," where no question is out of line, or march in the Mardi Gras Parade to the House of Blues for a night of fun. Easy online registration is available at:
2. SQL SERVER NEWS AND VIEWS
ARTICLE EXPLAINS WORKAROUND FOR CHANGING FILE GROWTH VALUE
A recent Microsoft article explains how to work around a SQL Server 2000 problem that occurs when you change the file growth value for the tempdb database from fixed increments to a percentage. The article, "PRB: File Growth Value for TempDB is Not Persistent When Changed From Fixed Increments to Percentage," says that when you make the change and restart SQL Server, the file growth value still appears in fixed increments (for example, 80KB). The article explains the problem and the workaround, which involves updating the status column in the master database's sysaltfiles table.
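The workaround amounts to flipping a status bit by hand. Here's a hedged sketch of what that looks like; the 0x100000 bit value and the tempdev logical file name are assumptions drawn from the article's approach and SQL Server 2000 defaults, so verify them against the article before touching a production server:

```sql
-- Sketch of the documented workaround (verify against the Microsoft
-- article before use). Direct system-table updates require the
-- 'allow updates' option to be enabled temporarily.
USE master
GO
EXEC sp_configure 'allow updates', 1
RECONFIGURE WITH OVERRIDE
GO
-- The 0x100000 status bit marks the file's growth value as a
-- percentage rather than a fixed increment (assumed bit value;
-- tempdev is the default logical name of tempdb's data file).
UPDATE sysaltfiles
SET status = status | 0x100000
WHERE dbid = DB_ID('tempdb') AND name = 'tempdev'
GO
-- Always turn ad hoc system-table updates back off when done.
EXEC sp_configure 'allow updates', 0
RECONFIGURE WITH OVERRIDE
GO
```

After restarting SQL Server, check the growth column in sysaltfiles to confirm the value now persists as a percentage.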
REPORT: MOST USERS DON'T TRUST MICROSOFT
(contributed by Paul Thurrott, firstname.lastname@example.org)
A recent Forrester Research survey brings an ugly truth to the forefront: Most IT administrators who work with Microsoft products don't trust the company or believe it can produce secure software. The survey, which polled security experts at $1 billion companies, cites some interesting statistics: Of those polled, 77 percent said they don't trust Microsoft, yet 9 out of 10 still deploy Microsoft software in mission-critical applications. This news should certainly bolster competing platforms such as Linux. But the question remains: What can Microsoft do to reverse this trend, and how much time will pass before the company's fortunes reflect the growing industrywide unease about the quality of its wares?
Fifteen months after the launch of its much-vaunted Trustworthy Computing campaign, Microsoft security still has a black eye, its critics charge, although the company correctly argues that the results of its security initiative aren't immediately obvious. "We understand that achieving the goals of Trustworthy Computing will not be an easy task and that it will take several years, perhaps a decade or more, before systems are trusted the way we envision," a Microsoft spokesperson said. "We are working to address existing security concerns, including patch management. This is only the beginning, and we are confident that customers will continue to see additional progress over time."
Another issue is administrator responsibility. Microsoft had previously patched most of the worst vulnerabilities that attackers exploited in recent years. As the report notes, "Too few firms are taking responsibility for securing their Windows systems"; instead, they blame Microsoft for their woes. The recent SQL Slammer worm is a classic example. The company had issued several fixes for the vulnerability the worm used, and if SQL Server administrators had kept their systems up-to-date, the worm wouldn't have been so devastating. The report states that Microsoft released patches for the last nine "high-profile Windows security holes" an average of 305 days before any attack took place, but administrators often didn't install the updates. In other words, most security snafus are avoidable.
But, as any Windows administrator can tell you, Microsoft's convoluted patch-management system is in dire need of an update—each product the company releases seems to follow its own update regimen. Recent advances in the company's Windows Update and Auto Update software should merge into Microsoft's other products soon and give the company a centralized and automated way to keep all its software updated. In the meantime, administrators are forced to wrestle with the myriad ways they receive bug notifications, install updates, and keep systems running smoothly. And the fact that many patches require system reboots doesn't help.
Looking forward, Windows Server 2003 will be the first big test for Microsoft's security initiative, as the OS will be the first major product the company has shipped since it embraced Trustworthy Computing. However, analysts say that Windows 2003 uptake is expected to be slow for a variety of reasons, including the war with Iraq, the continually stumbling economy, and an impression that the product is just a minor upgrade to Windows 2000. The Yankee Group says that only 12 percent of current Windows Server users plan to upgrade to Windows 2003 this year, down from the 30 percent who upgraded to Win2K Server within the first 12 months of its release.
One of the biggest reasons to upgrade to Windows 2003, however, is better security. Whether selling an upgrade based on its security prowess compared with the previous release is a good idea is debatable, but the first several months of general availability might be telling for Windows 2003. If customers embrace the product and it withstands months of uptime with few or no security vulnerabilities, Microsoft will have gone a long way toward repairing its reputation. But if Windows 2003 suffers the same sort of security embarrassment that Windows XP did with its high-profile (yet low-impact) Universal Plug and Play (UPnP) vulnerability, customers might view the product as more of the same. And more of the same isn't the message that Microsoft is trying to convey.
RESULTS OF PREVIOUS INSTANT POLL: SQL SERVER EXPERIENCE
The voting has closed in SQL Server Magazine's nonscientific Instant Poll for the question, "How long have you worked with SQL Server?" Here are the results (+/- 1 percent) from the 862 votes (deviations from 100 percent are due to rounding errors):
- 16% 1 year or less
- 28% 1-3 years
- 38% 3-6 years
- 13% 6-9 years
- 6% 10 years—I'm a veteran
NEW INSTANT POLL: DEWITT CLAUSES
The next Instant Poll question is "Are you for or against DeWitt clauses, which forbid publication of database benchmarks that the database vendor hasn't sanctioned?" Go to the SQL Server Magazine Web site and submit your vote for 1) I'm for the clauses, 2) I'm against the clauses, 3) I can see both points of view, 4) I need more information to decide, or 5) I don't care about the clauses.
SPONSOR: TDWI - THE DATA WAREHOUSING INSTITUTE
TDWI World Conference - San Francisco, May 11-16, 2003. BI & DW Professionals can take advantage of more than 50 courses taught by highly sought-after industry gurus. Don't miss out on in-depth training, industry trends, and the latest technologies! Hot Topics include: Web Services & XML for Business Intelligence, Evaluating ETL and Data Integration Platforms, Revitalizing the Mature Data Warehouse, Performance & Capacity Management, and more! Register now and save! For a complete brochure, visit:
3. ANNOUNCEMENTS
(brought to you by SQL Server Magazine and its partners)
CATCH UP ON SQL SERVER WEB SEMINARS YOU'VE MISSED
The finest SQL Server instruction is available right at your desktop. The SQL Server Magazine University e-Learning Center has an entire lineup of 1-hour Web seminars that are available live and through online archives. Subscribers to SQL Server Magazine pay only $59. Get all the details now at
CHECK OUT THE DATABASE PERFORMANCE PORTAL TODAY!
SQL Server Magazine and CSA Research have recently introduced the Database Performance Portal. IT professionals use the Performance Portal to conduct client, server, and network scalability studies; perform ad hoc systems health analysis; identify infrastructure bottlenecks; conduct offsite diagnostics; and qualify new hardware purchases. To visit the portal, go to
4. RESOURCES
WHAT'S NEW IN SQL SERVER MAGAZINE: THE 64-BIT QUESTION
Since the release of SQL Server 7.0 in 1998, SQL Server has been on the enterprise fast track, easily clearing the hurdles that hindered the adoption of earlier SQL Server versions in the enterprise. Support for a new scale-out technology called distributed partitioned views boosted SQL Server to the top of the TPC-C rankings for clustered database systems. SQL Server also broke into the TPC-C top 10 for nonclustered systems but fell short of the best scale-up database solutions by IBM and Oracle, which ran on more powerful hardware. Even so, SQL Server has continued to make significant gains in scalability, and the new SQL Server 2000 64-bit Enterprise Edition (formerly code-named Liberty) moves SQL Server even closer to the peaks of enterprise scalability. In his April 2003 SQL Server Magazine article "The 64-Bit Question," Michael Otey looks at SQL Server 64-bit's features and explores when deploying this powerful new platform, expected to be available this month, makes sense. Read the full article online at
HOT THREAD: A WORKABLE BACKUP SCHEME
Mikenelson is working on a SQL Server 7.0 project that has been in staged production for several months and is now moving into full production. His company has relied on daily full backups because the data volume has been low—a situation that will change as the project moves into full production. The database team tried to set up log shipping to a second server but discovered nonlogged operations in many stored procedures (the shop has many stored procedures that use SELECT INTO, which isn't recorded in the transaction log). Had log shipping worked, the team planned to do a full backup every night and transaction log backups every 20 minutes as sources for log shipping. When team members discovered the nonlogged-operations problem, they decided to investigate replication. However, their database schema isn't stable, and each schema change would break replication. Would a backup strategy of a daily full backup plus transaction log backups that fall through to a differential backup upon failure work? Offer your advice and read other users' suggestions on the SQL Server Magazine forums at the following URL:
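For reference, the scheme under discussion would look roughly like the following sketch. The database name and backup paths are hypothetical; the idea is that after a nonlogged operation such as SELECT INTO, a transaction log backup fails, and a differential backup is the fallback that reestablishes a restorable chain:

```sql
-- Nightly full backup (hypothetical database and paths):
BACKUP DATABASE MyDB TO DISK = 'E:\backup\MyDB_full.bak' WITH INIT
-- Every 20 minutes, back up the transaction log:
BACKUP LOG MyDB TO DISK = 'E:\backup\MyDB_log.bak'
-- If the log backup fails because a nonlogged operation (such as
-- SELECT INTO) broke the log chain, fall through to a differential
-- backup so the database remains restorable up to that point:
BACKUP DATABASE MyDB TO DISK = 'E:\backup\MyDB_diff.bak' WITH DIFFERENTIAL
```

The trade-off is that any activity between the last successful log backup and the differential can't be recovered to a point in time, which is exactly the exposure the thread is weighing.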
TIP: GETTING TRANSACTION INFORMATION FROM DBCC OPENTRAN
(contributed by Brian Moran, email@example.com)
Q. How can I identify the longest running transaction in my database? For example, I want to find out which process has held log space in a database for the longest amount of time.
A. The DBCC OPENTRAN command gives you the information you need by displaying information about the oldest active transaction and the oldest distributed and nondistributed replicated transactions within the specified database. The command displays results only if an active transaction exists or if the database contains replication information; when no active transactions exist, it simply displays an informational message. I've found that spending a few hours glancing through all the commands that SQL Server Books Online (BOL) documents is time well spent. You might not memorize all the commands, but you'll likely recall the appropriate ones when you run across a particular problem.
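For example, to check the oldest active transaction in a specific database (the pubs sample database here, standing in for your own):

```sql
-- Report the oldest active transaction, if any, in the pubs database.
-- When an active transaction exists, the output includes its SPID,
-- name, LSN, and start time--enough to find the process holding
-- log space the longest.
DBCC OPENTRAN ('pubs')
```

You can then feed the reported SPID to sp_who or KILL (carefully) if the transaction turns out to be abandoned.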
Send your technical questions to firstname.lastname@example.org.
5. HOT RELEASES (ADVERTISEMENTS)
STORAGE WORLD CONFERENCE 2003, MAY 5-7, ANAHEIM
Attend the key event of the year for the Storage Networking Industry. Hear keynote addresses from leading companies, attend tutorials on strategic issues, view a showcase of the latest technologies, and get certified! Register now:
SQL SERVER MAGAZINE UNIVERSITY E-LEARNING CENTER
SSMU's instructors bring you the finest SQL Server training available! Whether you're advanced or just beginning, you'll find training to meet your needs. Events are delivered online - LIVE via the Internet! Click here now:
GET THE BEST SQL SERVER RESOURCE AVAILABLE
Subscribe now to SQL Server Magazine and receive 12 information-packed issues delivered to your doorstep or your desktop. PLUS, you'll get FREE access to the SQL Server Magazine article archive on the Web!
6. NEW AND IMPROVED
(contributed by Carolyn Mader, email@example.com)
EXTEND ANALYSIS SERVICES FUNCTIONALITY
Panorama Software announced NovaView 3.2, analytical desktop software that exposes the full functionality of SQL Server 7.0 OLAP Services and SQL Server 2000 Analysis Services. Enhancements include budgeting and forecasting write-back functions, drill-through capability, data mining, custom roll-ups for financial analysis and reporting, dimension- and cell-level security, ragged-dimensions support, and server-based named member sets. NovaView features OLAP functionality including reporting and highlighting, trend analysis, and hide and unhide capabilities. For pricing, contact Panorama Software at 805-980-4467.
MONITOR YOUR DATABASE
Pearl Knowledge Solutions announced SQLCentric, a Web-based, SQL Server-centric network database monitoring system that you deploy on your company's intranet. If any SQL Server or SQL agent in your network goes down or if drive space drops too low, SQLCentric will notify you by email or pager. SQLCentric features follow-up messaging, server grouping, and real-time drilldown. The Web interface uses color-coding so that you can click any status light for full details. Pricing is per monitored server. Contact Pearl Knowledge Solutions at 917-499-7622.
7. CONTACT US
Here's how to reach us with your comments and questions:
- ABOUT THE COMMENTARY — firstname.lastname@example.org
- ABOUT THE NEWSLETTER IN GENERAL — email@example.com
(please mention the newsletter name in the subject line)
- TECHNICAL QUESTIONS — http://www.sqlmag.com/forums
- PRODUCT NEWS — firstname.lastname@example.org
- QUESTIONS ABOUT YOUR SQL SERVER MAGAZINE UPDATE SUBSCRIPTION?
Customer Support — email@example.com
- WANT TO SPONSOR SQL SERVER MAGAZINE UPDATE?
More than 102,000 people read SQL Server Magazine UPDATE every week. Shouldn't they read your marketing message, too? To advertise in SQL Server Magazine UPDATE, contact Beatrice Stonebanks at firstname.lastname@example.org or 800-719-8718.
SQL Server Magazine UPDATE is brought to you by SQL Server Magazine, the only magazine completely devoted to helping developers and DBAs master new and emerging SQL Server technologies and issues. Subscribe today.
The SQL Server Magazine Connections conference—loaded with best-practices information from magazine authors and Microsoft product architects—is designed to provide you with the latest SQL Server tools, tips, and real-life examples you need to do your job.
Receive the latest information about the Windows and .NET topics of your choice. Subscribe to our other FREE email newsletters.
Thank you for reading SQL Server Magazine UPDATE.