I have a system running DB2 V9.7 Fixpack 8 on AIX. It is a BI system with 100 primary logs, each 800 MB in size. During peak times of day we cut 2-3 logs a minute for around 20 minutes (these are then archived to TSM without issue); the rest of the day we cut about 1 log every 10 minutes or so. I was hit with the question that there is a concern we are cutting too many logs at peak times and that we should increase the log file size. I am not seeing any performance impact from cutting the logs, and we never consume more than 5 or 6 of our primary logs at any given time. Has anyone seen any numbers that support creating larger log files so we do not cut several logs per minute during peak times? Again, we are not having any issues where we run out of primary logs or have insufficient log buffers; the question is simply whether moving to the next primary log causes a significant performance hit.
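To put numbers on that peak rate (a back-of-the-envelope sketch; the 4 KB page arithmetic follows from LOGFILSIZ's units, and the target switch rate and database name are assumptions):

```sql
-- 2-3 logs/min x 800 MB/log is roughly 1.6-2.4 GB of log data per minute at peak.
-- LOGFILSIZ is specified in 4 KB pages: 800 MB = 800 * 1024 / 4 = 204800 pages.
-- To cut roughly one log per minute instead of 2-3, the file would need to be
-- about 2-3x larger, subject to the release's documented maximum, e.g.
-- (CLP command; database name is hypothetical):
-- db2 UPDATE DB CFG FOR bidb USING LOGFILSIZ 524288   -- ~2 GB per log file
```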
If this is a BI system, are there updates/inserts/deletes, or are you just refreshing the data?
Try to LOAD with NONRECOVERABLE,
or use ALTER TABLE ... ACTIVATE NOT LOGGED INITIALLY, to avoid log usage.
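For reference, a minimal sketch of both suggestions (table, staging, and file names are hypothetical; note that both trade recoverability for reduced log volume):

```sql
-- Bulk refresh without logging the loaded rows. NONRECOVERABLE means the
-- load is not rolled forward, so the table is exposed until the next backup.
LOAD FROM /staging/sales.del OF DEL
  REPLACE INTO sales_fact
  NONRECOVERABLE;

-- Alternatively, suppress logging for DML in the same unit of work.
-- The table must have been created with the NOT LOGGED INITIALLY attribute,
-- and the effect ends at COMMIT. A crash mid-unit-of-work leaves the table
-- unusable, so this is only for rebuildable data.
ALTER TABLE sales_fact ACTIVATE NOT LOGGED INITIALLY;
INSERT INTO sales_fact SELECT * FROM sales_stage;
COMMIT;
```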
Best Regards, Guy Przytula
Database Software Consultant
Good DBAs are not formed in a week or a month. They are created little by little, day by day. Protracted and patient effort is needed to develop good DBAs.
Spoon feeding : To treat (another) in a way that discourages independent thought or action, as by overindulgence.
DB2 UDB LUW Certified V7-V8-V9-V9.7-V10.1-V10.5 DB Admin - Advanced DBA -Dprop..
Information Server Datastage Certified http://www.infocura.be
These are just normal daily inserts/deletes/updates, nothing special. The question that came up is whether there is a significant amount of overhead related to switching logs. I personally do not think this is an issue, but wanted to check whether anyone had heard of it being one. We are not running out of log buffers, primary logs, or anything else; we are just filling a log and then moving to the next, which at peak times can be a few logs per minute.
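For anyone wanting to confirm the same thing on their system, a sketch of how to check actual log space consumption, assuming the SYSIBMADM administrative views are available (they are shipped with 9.7):

```sql
-- How much of the configured log space is actually in use right now,
-- and the high-water mark since activation:
SELECT LOG_UTILIZATION_PERCENT,
       TOTAL_LOG_USED_KB,
       TOTAL_LOG_AVAILABLE_KB,
       TOTAL_LOG_USED_TOP_KB
FROM   SYSIBMADM.LOG_UTILIZATION;
```

If utilization stays low (as in our case, 5-6 of 100 primaries), the log switches themselves are the only thing left to worry about, and we have seen no measurable cost from them.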