  1. #1
    Join Date
    Sep 2006
    Posts
    13

    Unanswered: help with full transaction log

I have to update a table with about 45 million records, which I'm doing in chunks of 2 to 5 million records by index IDs, but I get this error:
    DB21034E The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned:
    SQL0964C The transaction log for the database is full. SQLSTATE=57011

I stopped all processes connected to my database (db2 "force applications (1111,222)") and increased LOGFILSIZ to 20000 and LOGPRIMARY to 20.
DB2 took the settings - the correct values are displayed in the output of db2 get db cfg for <dbname> | grep -i log - but I still get the same error, even if I try to update a single record in the table.
What else could I do to resolve the issue?
    Thanks a lot, mj

  2. #2
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
That may still not be enough log space for that transaction. A change to the log configuration does not actually take effect until the database is inactive (there are no connections to the database). One thing you could do is run the command without logging.

1. Create a script as follows:
Code:
alter table <tab-name> ACTIVATE NOT LOGGED INITIALLY;
update <tab-name> set .... ;
commit;
2. Run the script with autocommit turned off:
Code:
db2 +c -tvf scriptname.sql > script.out

Be sure to back up the database before you start this, because the transaction is not recoverable (a major problem if the system crashes during the script).
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390
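Since the new log settings only take effect once the database is inactive, one way to force that could look like this (the database name mydb is a placeholder, and forcing applications drops any in-flight work):

```shell
# Force off all remaining connections so the database can go inactive
db2 force applications all
# Deactivate and reactivate so the new LOGFILSIZ/LOGPRIMARY values are picked up
db2 deactivate database mydb
db2 activate database mydb
```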

  3. #3
    Join Date
    Jan 2007
    Location
    Jena, Germany
    Posts
    2,721
    Also, you should verify which settings are currently active with:
    Code:
    $ db2 get db cfg show detail
    The SHOW DETAIL clause will retrieve the current and the pending/not-yet-active settings for each DB CFG parameter.

Note that NLI (NOT LOGGED INITIALLY) comes with a few gotchas. For example, if you have a DB crash and recover from a backup taken before the update, you can't roll forward through the update - it wasn't logged. So you should take a full backup of your database after the update completes.
    Knut Stolze
    IBM DB2 Analytics Accelerator
    IBM Germany Research & Development
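The post-update backup recommended above might look like this (the database name and target path are placeholders):

```shell
# Take a full backup so the unlogged update becomes recoverable again
db2 backup database mydb to /db2/backups
```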

  4. #4
    Join Date
    Mar 2006
    Posts
    104
    Hi,

Just to rule out the obvious, make sure the application really does commit for the 2/3 minutes chunks. We had a few occasions with log-file-full errors that subsequently turned out to be a 'commit' not really done at the end of a 'commit batch'.

    Regards

    Harikumar
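The chunk-and-commit pattern discussed in this thread could be sketched as follows (the database, table, column, and ID ranges are all hypothetical; run with db2 +c if you want the explicit commits to matter, since the CLP autocommits by default):

```shell
db2 connect to mydb
# Update one ID range at a time and commit each chunk,
# so each unit of work stays within the available log space
db2 "UPDATE mytable SET status = 'done' WHERE id BETWEEN 1 AND 2000000"
db2 commit
db2 "UPDATE mytable SET status = 'done' WHERE id BETWEEN 2000001 AND 4000000"
db2 commit
```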

  5. #5
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    Quote Originally Posted by sundaram
    Hi,

    Just to rule out the obvious, make sure the application realy does commit for the 2/3 minutes chunks. We had few occassions with log file full errors and the subsequently turned out to be 'commit' not realy done at end of 'commit batche'.

    Regards

    Harikumar
    I think he means chunks of 2-5 million rows, not minutes.
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390
