One of my company's applications is running on Informix. I'm in the process of migrating the Informix data to SQL Server 2000. For testing purposes, I created a package that reads records from Informix and writes them into a SQL Server table. Before I copy the original records, I delete them. Here's my problem: Although the actual data is only about 400MB, after 2 weeks the database has grown to several gigabytes and the log file is huge. After several more days, the copy_records job fails because the hard disk is full. How can I resolve this problem? Is the database large because I'm dropping and recreating the table? Instead of dropping the table, should I use TRUNCATE TABLE to clear the table, then insert the records?
Truncating the table would save space in the log file because SQL Server logs only the page deallocations for a TRUNCATE TABLE, not the deletion of each individual row. However, you still need to make log backups every couple of days to clear out the transaction log; backing up the log truncates the inactive portion so SQL Server can reuse that space. To reclaim the disk space the files have already grabbed and keep your database from ballooning, run DBCC SHRINKDATABASE periodically.
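A minimal T-SQL sketch of this maintenance cycle is below. The database name (StagingDB), table name (copy_records), and backup path are assumptions for illustration; substitute your own names and run the log backup and shrink on the schedule that fits your job.

```sql
-- Assumed names: database StagingDB, staging table copy_records,
-- backup device path D:\Backups. Adjust for your environment.

USE StagingDB

-- Clear the staging table with a minimally logged operation
-- instead of DELETE (which logs every row)
TRUNCATE TABLE copy_records

-- ...the package reloads the Informix rows here...

-- Back up the transaction log, which truncates its inactive portion
BACKUP LOG StagingDB TO DISK = 'D:\Backups\StagingDB_log.bak'

-- Reclaim unused space in the data and log files,
-- leaving 10 percent free space for growth
DBCC SHRINKDATABASE (StagingDB, 10)
```

Note that shrinking and regrowing files repeatedly carries some overhead, so if the load pattern is steady, shrinking once to a reasonable working size and then letting the regular log backups keep the log in check is usually enough.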