  1. #1
    Join Date
    May 2003
    Posts
    5

    Unanswered: DIA3609C Log file was full.

    Hello All,

I am importing an SAP system onto a new server, and I am getting the following error in the process:

    2003-05-07-10.38.33.781000 InstanceB2QC2 Node:000
    PID:1296(db2syscs.exe) TID:1632 Appid:*LOCAL.DB2QC2.030507143801
    data_management sqldEscalateLocks Probe:2 Database:QC2

    -- Lock Count, Target : 181916, 90958
    -- Table (ID) Name : (34;4) SAPR3 .GLPCA
    -- Locks, Request Type : 181912, X
    -- Result (0 = success): 0


    2003-05-07-10.41.10.593000 InstanceB2QC2 Node:000
    PID:1296(db2syscs.exe) TID:1632 Appid:*LOCAL.DB2QC2.030507143801
    data_protection sqlpgrsp Probe:50 Database:QC2

    Log Full -- active log held by appl. handle 126
    End this application by COMMIT, ROLLBACK or FORCE APPLICATION.

    2003-05-07-10.41.10.625000 InstanceB2QC2 Node:000
    PID:1296(db2syscs.exe) TID:1632 Appid:*LOCAL.DB2QC2.030507143801
    data_protection sqlpWriteLR Probe:80 Database:QC2
    DIA3609C Log file was full.

    ZRC=0xFFFFD509

    2003-05-07-10.41.10.625001 InstanceB2QC2 Node:000
    PID:1296(db2syscs.exe) TID:1632 Appid:*LOCAL.DB2QC2.030507143801
    index_manager sqlischd.delkey Probe:1974 Database:QC2
    DIA3609C Log file was full.

    ZRC=0xFFFFD509

    __________________

I increased the number of primary log files and the number of secondary log files to the maximum, and the problem persists.

Thanks for your help

  2. #2
    Join Date
    Nov 2002
    Location
    Delaware
    Posts
    186
You also need to check the size of the log files; DB2 7.x has a limit of 32 GB of total log space. You should also check how large this table is. I know that in our SAP system it's 31 GB. So if I were going to import it into another system, I would use a LOAD command and make it non-recoverable, then back up the system when finished. Your message is not very clear about what you're trying to do.
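
Something along these lines should do it (a sketch only - the table name comes from the error log above, and the input file name is just a placeholder):

    db2 connect to QC2
    db2 "LOAD FROM glpca.ixf OF IXF INSERT INTO SAPR3.GLPCA NONRECOVERABLE"
    db2 connect reset

With NONRECOVERABLE the loaded data is not logged at all, so it cannot fill the transaction log; the trade-off is that the table cannot be rolled forward until you take a backup.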

  3. #3
    Join Date
    May 2003
    Posts
    5
    Hi:
I am doing a "Homogeneous System Copy" to the PCL system from the QC2 system.
The GLPCA table is 55 GB.
I also increased the size of each log to the maximum, but the problem persists.

  4. #4
    Join Date
    Sep 2002
    Posts
    456
Can you explain what is going on in the database when this problem is reported? And what are your settings for the logprimary, logsecond, logfilsiz, locklist, maxlocks, avg_appls, and maxappls database parameters?
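
They can all be pulled in one go from the DB2 command line - a sketch, assuming a Windows server since the log shows db2syscs.exe:

    db2 get db cfg for QC2 | findstr /i "LOGPRIMARY LOGSECOND LOGFILSIZ LOCKLIST MAXLOCKS AVG_APPLS MAXAPPLS"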

    dollar

    Originally posted by jeane
    Hi:
I am doing a "Homogeneous System Copy" to the PCL system from the QC2 system.
The GLPCA table is 55 GB.
I also increased the size of each log to the maximum, but the problem persists.

  5. #5
    Join Date
    May 2003
    Posts
    5
    Originally posted by dollar489
Can you explain what is going on in the database when this problem is reported? And what are your settings for the logprimary, logsecond, logfilsiz, locklist, maxlocks, avg_appls, and maxappls database parameters?

    dollar
Hi:
The only thing running in the database is this process, the database import.
The settings are:
logprimary: 30 originally, later changed to 118
logsecond: 10 (these logs are never activated)
logfilsiz: 4095 originally, later changed to 65535
locklist: 3200
maxlocks: 100
avg_appls: 1
maxappls: 60
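
(For reference, a sketch of how these were changed from the DB2 CLP, assuming the QC2 database from the log; the new values take effect once all applications have disconnected:)

    db2 update db cfg for QC2 using LOGPRIMARY 118
    db2 update db cfg for QC2 using LOGFILSIZ 65535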

Thanks

  6. #6
    Join Date
    Sep 2002
    Posts
    456
Sometimes the data is too big for the log files to handle. What you want to do in that case is specify the COMMITCOUNT parameter as part of your IMPORT command. This parameter tells the database manager to issue an internal COMMIT after every n rows. Specify a value for COMMITCOUNT that suits your workload, e.g. 20000 rows.
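
For example (a sketch - the input file name is a placeholder, and the table name is taken from the error log above):

    db2 "IMPORT FROM glpca.ixf OF IXF COMMITCOUNT 20000 INSERT INTO SAPR3.GLPCA"

Each intermediate COMMIT releases log space and locks, so the whole import no longer has to fit in the active log as a single transaction.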

    Hope it helps.

    dollar

note: if you can't make it work, post your import command and we'll see what we can do.

    Originally posted by jeane
Hi:
The only thing running in the database is this process, the database import.
The settings are:
logprimary: 30 originally, later changed to 118
logsecond: 10 (these logs are never activated)
logfilsiz: 4095 originally, later changed to 65535
locklist: 3200
maxlocks: 100
avg_appls: 1
maxappls: 60

Thanks

  7. #7
    Join Date
    Nov 2002
    Location
    Delaware
    Posts
    186
If you're doing a system copy, then I would recommend that you do a redirected restore. The transport you're doing for the copy is too large. We do redirected restores all the time; they bring back everything you'll need.
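
In outline, something like this (a sketch only - the backup and container paths are placeholders, PCL is the target system named earlier in the thread, and DMS tablespaces take FILE containers with a size instead of a PATH):

    db2 backup db QC2 to D:\backup
    db2 restore db QC2 from D:\backup into PCL redirect
    db2 "set tablespace containers for 0 using (path 'D:\PCL\sapdata1')"
    (repeat SET TABLESPACE CONTAINERS for each tablespace ID, then:)
    db2 restore db QC2 continue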

  8. #8
    Join Date
    May 2003
    Posts
    5
    Originally posted by dollar489
Sometimes the data is too big for the log files to handle. What you want to do in that case is specify the COMMITCOUNT parameter as part of your IMPORT command. This parameter tells the database manager to issue an internal COMMIT after every n rows. Specify a value for COMMITCOUNT that suits your workload, e.g. 20000 rows.

Hope it helps.

dollar

note: if you can't make it work, post your import command and we'll see what we can do.
Dollar:

I thought of this alternative too, but the procedure used for this task is just the execution of an EXE file (setup.exe, one of the standard SAP R/3 tools), and I still have not found where the import command is issued.

Thanks

  9. #9
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
If you have to use an executable given by the vendor:

Get in touch with the vendor for their suggestion on how to handle this ... they will have come across another installation with a similar issue

or

Increase the log files to the maximum possible size and number (I don't have the numbers on top of my head)

or

Maybe there is a config file for the executable where you can specify commit counts ...

    Cheers

    Sathyaram
    Originally posted by jeane
Dollar:

I thought of this alternative too, but the procedure used for this task is just the execution of an EXE file (setup.exe, one of the standard SAP R/3 tools), and I still have not found where the import command is issued.

Thanks

  10. #10
    Join Date
    Apr 2003
    Location
    Florida
    Posts
    79
I'm new, so carefully examine this suggestion - but it worked for my 20 GB table using the LOAD utility.....
    Turn on USEREXIT=YES
    SET LOGSECOND TO -1

    This will dynamically allocate / deallocate as much log space as needed.
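
From the CLP that would be something like this (a sketch - the CLP takes ON/OFF for USEREXIT, the database name is assumed from the log, and the changes take effect once all applications have disconnected):

    db2 update db cfg for QC2 using USEREXIT ON
    db2 update db cfg for QC2 using LOGSECOND -1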

    Might work for you....

  11. #11
    Join Date
    May 2003
    Posts
    5
    Originally posted by Rick-dba
I'm new, so carefully examine this suggestion - but it worked for my 20 GB table using the LOAD utility.....
    Turn on USEREXIT=YES
    SET LOGSECOND TO -1

    This will dynamically allocate / deallocate as much log space as needed.

    Might work for you....
Thanks, I will try again

  12. #12
    Join Date
    Aug 2001
    Location
    UK
    Posts
    4,650
    A quick note ... Rick's suggestion will work only on V8 ...

    Cheers

    Sathyaram

    Originally posted by jeane
Thanks, I will try again
