I'm using transactional replication with SQL Server 2000 Standard Edition Service Pack 4 (SP4) on Windows 2000 Server. The replication worked fine until the night we tried to archive more than 37 million records from one of the replicated tables. We used a single transaction, employing the syntax DELETE WHERE . I think SQL Server hit its limit with the 37 million rows: It returned the error The process could not execute 'sp_replcmds' on . Can you recommend a way to safely perform such a large update?
In SQL Server 2000, the best way to replicate very large set updates is to replicate the execution of a stored procedure that performs the deletion. Instead of replicating 37 million DELETE statements, replication delivers just one procedure call to the subscriber. You can find an explanation of how to use this technique in SQL Server 2000 Books Online (BOL) under Replication, Implementing Replication, Publishing Data and Database Objects, Publishing Stored Procedure Execution.
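As a rough sketch of that technique, the T-SQL below wraps the deletion in a stored procedure and then publishes the procedure's execution with sp_addarticle. The procedure, table, publication, and parameter names are placeholders for illustration; substitute your own objects.

```sql
-- Hypothetical procedure that performs the archive deletion.
-- Table name and cutoff parameter are placeholders.
CREATE PROCEDURE dbo.usp_PurgeOrderHistory
    @CutoffDate datetime
AS
    DELETE FROM dbo.OrderHistory
    WHERE OrderDate < @CutoffDate
GO

-- Publish the procedure's execution rather than the affected rows.
-- With @type = 'proc exec', the subscriber receives one procedure
-- call instead of millions of individual DELETE commands.
EXEC sp_addarticle
    @publication   = 'OrderPub',              -- existing publication (placeholder)
    @article       = 'usp_PurgeOrderHistory',
    @source_object = 'usp_PurgeOrderHistory',
    @type          = 'proc exec'
GO
```

Note that 'proc exec' delivers the call as-is; BOL also describes a 'serializable proc exec' option for procedures executed inside serializable transactions.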
Another option that's available in SQL Server 2000 SP1 and later is a new parameter on the log reader called MaxCmdsInTran. As BOL explains, "MaxCmdsInTran specifies the maximum number of statements grouped into a transaction as the Log Reader writes commands to the distribution database. Using this parameter allows the Log Reader Agent and Distribution Agent to divide large transactions (consisting of many commands) at the Publisher into several smaller transactions when applied at the Subscriber." When you specify this parameter, you can reduce contention at the Distributor and also reduce latency between the Publisher and Subscriber.
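MaxCmdsInTran is supplied on the Log Reader Agent's command line (for example, in the agent's job step). The fragment below is a sketch only; the server and database names are placeholders, and the value of 10000 is an arbitrary illustration you should tune for your workload.

```
logread.exe -Publisher [PubServer] -PublisherDB [SalesDB]
    -Distributor [DistServer] -DistributorSecurityMode 1
    -MaxCmdsInTran 10000
```

With this setting, a single 37-million-row delete at the Publisher would be broken into many smaller transactions as the commands are written to the distribution database and applied at the Subscriber.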
Note that SQL Server 2005 includes a fix that lets it handle such large updates without timing out, although both of the SQL Server 2000 options I've presented here are still available to provide optimal efficiency.
Platform Program Manager
Microsoft Project Team