  1. #1
    Join Date
    Mar 2004
    Posts
    15

    Unanswered: Log getting too big

    Hello,

    I have a local database that is refreshed overnight. A DTS batch grabs several views from another database and makes local copies of them. This process writes a huge number of unnecessary entries to the log file, sometimes so many that the entire available disk space is used up, which crashes the DTS batch. This database is mostly static and non-transactional. I wonder if it's possible to disable the logging process altogether for that particular db, but I know it's part of SQL Server so it can't be that simple.

    If I limit the size of the log file, it fills up and the same problem occurs as when the disk space runs out... what should I do???

  2. #2
    Join Date
    Nov 2002
    Location
    Switzerland
    Posts
    524
    Not possible to skip the log
    But possible to clean it (via backup, or via sp_dboption YourDB, trunc, true, ...)
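
    Something along these lines should work on SQL Server 2000 (someDatabase and the backup path are just placeholders here, not your actual names):

    -- option 1: a log backup lets the inactive portion of the log be reused
    backup log someDatabase to disk = 'D:\Backups\someDatabase_log.bak'

    -- option 2: truncate the log automatically at every checkpoint
    exec sp_dboption 'someDatabase', 'trunc. log on chkpt.', 'true'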

  3. #3
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    You can set the recovery model of the database to simple
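
    If it isn't already, something like this should do it (YourDB is just a placeholder):

    -- simple recovery truncates the log at every checkpoint
    alter database YourDB set recovery simple
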
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser

  4. #4
    Join Date
    Jul 2002
    Location
    Village, MD
    Posts
    621
    Originally posted by Enigma
    You can set the recovery model of the database to simple
    ... and keep DTS transactions as short as possible - every checkpoint will clean up the transaction log.
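
    For instance, an Execute SQL Task dropped in between the copy steps could force that explicitly (just a sketch; someDatabase is a placeholder):

    use someDatabase
    -- under simple recovery, a checkpoint marks the inactive log space as reusable
    checkpoint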

  5. #5
    Join Date
    Mar 2004
    Posts
    15
    Recovery model already set to simple...

    added a SQL task at the end of my batch that does this:

    backup log someDatabase with truncate_only;  -- discard the inactive part of the log without backing it up
    dbcc shrinkfile (someDatabase_log);  -- then shrink the physical log file back down

    but my log is still several gigabytes big... maybe I'll add that SQL task after each import task...

    *sigh*

  6. #6
    Join Date
    Feb 2004
    Location
    PAKISTAN
    Posts
    106
    The simple recovery model still logs every transaction and only clears the log at each CHECKPOINT; that's why it gets so big during your DTS run. Try the bulk-logged recovery model: it minimally logs bulk operations such as BCP and SELECT INTO, so during your DTS the log may fill up less than under the simple recovery model. Bulk-logged requires transaction log backups to clear out older records.
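
    Roughly like this (SQL Server 2000 syntax; the database name and backup path are only examples):

    -- bulk operations such as BCP and SELECT INTO are minimally logged
    alter database someDatabase set recovery bulk_logged

    -- bulk-logged still needs periodic log backups to free up log space
    backup log someDatabase to disk = 'D:\Backups\someDatabase_log.bak'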

  7. #7
    Join Date
    Mar 2004
    Posts
    15

    Thumbs up

    I'll give your clever solution a try... it makes sense to me... I'll post a follow-up Monday to say whether everything went OK (or not). For the past 4 weeks my batch has systematically crashed on weekends.

    Thank you all for your time and patience!

    Have a good weekend (6 pm in Europe already...)

  8. #8
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    Anyone want to scream bcp?
    Brett
    8-)

    It's a Great Day for America everybody!

    dbforums Yak Corral
    Radio 'Rita
    dbForums Member List
    I'm Good Once as I ever was

    The physical order of data in a database has no meaning.
