The data is dumped and imported fresh every day.
Obviously I don't want my production transaction log getting cluttered up with 70 million rows' worth of log records every day.
Is there any way to avoid logging to the transaction log with an SSIS task?|||
Are you using bulk insert, or are you running 70 million individual INSERT statements? There is a variety of ways to truncate / shrink the transaction log after you are done processing records.
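For example, something along these lines: back up the log to free the space inside it, then shrink the physical file. ImportDB, the logical file name ImportDB_log, and the backup path are placeholders, so adjust them for your environment.

-- Back up the log first so its internal space can be reused.
BACKUP LOG ImportDB TO DISK = N'D:\Backups\ImportDB_log.trn';

USE ImportDB;
-- Shrink the log file back down to roughly 1 GB (target size is in MB).
DBCC SHRINKFILE (ImportDB_log, 1024);

Note that shrinking only reclaims space the log backup has already freed up, so run the backup first.|||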
Greg Van Mullem wrote: Are you using bulk insert, or are you running 70 million individual INSERT statements? There is a variety of ways to truncate / shrink the transaction log after you are done processing records.
I am going to be using bulk inserts.|||
You can either put your database into SIMPLE recovery, which breaks the log backup chain and gives you no point-in-time recovery for that window, and then put it back into FULL mode and take a full backup after you're finished.
With SIMPLE you will only be able to recover from your previous backups up to the point you flipped into SIMPLE mode.
The other option is to use BULK_LOGGED mode. This mode logs only extent allocations for bulk operations. Once the load is complete you should put your database back into FULL mode, at which point the next log backup will include all pages/extents changed during the bulk-logged period.
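A similar sketch for the BULK_LOGGED route (same placeholder names):

ALTER DATABASE ImportDB SET RECOVERY BULK_LOGGED;

-- ... run the daily bulk load here ...

ALTER DATABASE ImportDB SET RECOVERY FULL;
-- This log backup carries the extents changed during the bulk-logged window,
-- so it can be much larger than the log file itself.
BACKUP LOG ImportDB TO DISK = N'D:\Backups\ImportDB_log.trn';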
Either way your transaction log will keep growing until, in SIMPLE mode, a checkpoint occurs and committed transactions are cleared out, or, in BULK_LOGGED/FULL mode, the transaction log is backed up. If you do all the work in one transaction, the transaction log will need to be big enough to hold every change made in that transaction.|||
Have you considered solving this problem with a hardware purchase instead of software tweaks? Buy big disk drives and let the transaction log explode in size. If it's too slow, then buy faster disk drives or servers. This is often the most cost-effective approach these days. More often than not, when I run the numbers on an issue like this, an expensive hardware purchase turns out to be cheaper, faster and easier! Just a thought.
Later,
Greg Van Mullem|||
Use the batch size parameter (BULK INSERT has a BATCHSIZE option, just as the bcp utility does). Each completed batch is committed as its own transaction, so the log space for earlier batches can be reclaimed while the load is still running. If you then define an alert for the database (e.g. log is 75 percent full) and a job which backs up the transaction log, you should have no problems with it; see the sketch below.
You may have to tune the percent value. That's it.
The advantage is that you do not change the recovery model of the database, and your inserts remain part of the log backups.
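A minimal sketch of such a load; the table, file path, and format details are placeholders:

BULK INSERT dbo.DailyTransactions
FROM 'D:\Feeds\daily_dump.csv'
WITH (
    BATCHSIZE = 100000,       -- commit every 100,000 rows as a separate transaction
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK                   -- table lock; one of the conditions for minimal logging
);

With BATCHSIZE set, a failure only rolls back the current batch, and the alert-driven log backup job can truncate the log between batches.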
Regards
Norbert