  1. #1
    Join Date
    Dec 2007
    Posts
    288

    Unanswered: log blowing up on huge insert operation

    We are doing a one-time insert into a database, and the transaction log is growing to 15GB, using ALL of the free space on the drive and blowing up.

    The insert is being done with a third-party ETL tool: it reads our data mart (15 million rows) and then does inserts on one column in selected rows of our SQL Server DB. For each insert into the SQL Server DB, a trigger writes to another table.

    Will simple recovery mode help me here? Bulk-logged?

    any suggestions?

  2. #2
    Join Date
    Jan 2003
    Location
    Massachusetts
    Posts
    5,799
    Provided Answers: 11
    You will likely need to break the 15 million rows into chunks.

    Run a chunk, then back up the log, or switch to simple recovery mode, making sure you have full backups from before and after.
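
    A sketch of that chunk-and-backup loop in T-SQL (all table, column, database, and file names here are hypothetical; tune the batch size to what your log drive can absorb):

    ```sql
    -- Hypothetical names: dbo.Staging holds the 15M source rows,
    -- dbo.Target is the destination table that has the trigger.
    DECLARE @BatchSize int = 50000;
    DECLARE @Rows int = 1;

    WHILE @Rows > 0
    BEGIN
        INSERT INTO dbo.Target (KeyCol, ValueCol)
        SELECT TOP (@BatchSize) s.KeyCol, s.ValueCol
        FROM dbo.Staging AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t
                          WHERE t.KeyCol = s.KeyCol);

        SET @Rows = @@ROWCOUNT;

        -- Under the full recovery model, backing up the log after each
        -- chunk lets the log space be reused instead of growing the file.
        -- (Under simple recovery, skip this; the log truncates on checkpoint.)
        BACKUP LOG MyDatabase TO DISK = 'E:\Backups\MyDatabase_log.bak';
    END
    ```

    Each iteration is its own transaction, so the log only ever has to hold one chunk's worth of changes at a time.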

  3. #3
    Join Date
    Jun 2004
    Location
    Long Island
    Posts
    696
    Can you insert without the trigger, then do what the trigger does via a stored procedure (so it runs in batches as opposed to one record at a time)?

    If you can do this, I would bulk copy the data into your table, specifying the desired batch size, then have a sproc do what the trigger does in batches.
    Last edited by PMASchmed; 06-26-09 at 11:54.
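
    A rough sketch of that bulk-copy-plus-batched-sproc approach, assuming the ETL output can land in a flat file first (file path, terminators, and table names are all hypothetical). BULK INSERT's BATCHSIZE option commits every N rows as a separate transaction, and with the trigger dropped or disabled, the per-row writes disappear:

    ```sql
    -- Load the file in 50,000-row batches; each batch commits
    -- separately, so no single transaction bloats the log.
    BULK INSERT dbo.Target
    FROM 'E:\exports\datamart_rows.dat'      -- hypothetical export file
    WITH (FIELDTERMINATOR = '\t',
          ROWTERMINATOR   = '\n',
          BATCHSIZE       = 50000,
          TABLOCK);                          -- allows minimal logging
                                             -- under bulk-logged/simple

    -- Then do the trigger's work set-based, in batches,
    -- instead of firing once per inserted row:
    DECLARE @Rows int = 1;
    WHILE @Rows > 0
    BEGIN
        INSERT INTO dbo.AuditTable (KeyCol, LoadedAt)  -- hypothetical audit table
        SELECT TOP (50000) t.KeyCol, GETDATE()
        FROM dbo.Target AS t
        WHERE NOT EXISTS (SELECT 1 FROM dbo.AuditTable AS a
                          WHERE a.KeyCol = t.KeyCol);
        SET @Rows = @@ROWCOUNT;
    END
    ```

    The same idea works from the command line with bcp and its batch-size switch; the point is that both the load and the audit writes happen in bounded transactions.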

  4. #4
    Join Date
    Dec 2007
    Posts
    288
    Thank you both. The ETL developer did chunk up the inserts, and it worked great.
