How to filter event monitor statements by table?
I have created an event monitor for statements like this:
CREATE EVENT MONITOR event_mon_SQL FOR STATEMENTS WRITE TO TABLE AUTOSTART
SET EVENT MONITOR event_mon_SQL STATE=1
The above event monitor captures all statements executed against all tables in the database. I have huge performance problems when I turn it on: CPU usage goes to 100% and end users report dramatic performance drops.
Actually, I only need to monitor statements against 4 infrequently accessed tables, but for those tables I need all INSERT/UPDATE/DELETE/SELECT statements. Is there any way to filter the event monitor so it captures only statements executed against the 4 tables I need?
Instead of writing the monitor output to a table, write it to a file or a pipe. This reduces the overhead of having the monitor on. You must then format the output, and while doing so you can filter the records.
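As a rough sketch of that approach (the monitor name, output path, and table names are illustrative, not from the original post), you could switch to a file event monitor and then format and filter the output with db2evmon:

```shell
# Turn off and drop the table-based monitor (names are assumptions)
db2 "SET EVENT MONITOR event_mon_SQL STATE = 0"
db2 "DROP EVENT MONITOR event_mon_SQL"

# Recreate it writing to files; the directory must exist and be
# writable by the instance owner. NONBLOCKED favors low overhead
# over guaranteed capture of every event.
db2 "CREATE EVENT MONITOR event_mon_file FOR STATEMENTS
     WRITE TO FILE '/db2/evmon/stmt'
     MAXFILES 10 MAXFILESIZE 1000
     NONBLOCKED
     AUTOSTART"
db2 "SET EVENT MONITOR event_mon_file STATE = 1"

# Later: format the binary event files to text and keep only
# records mentioning the tables of interest (placeholder names).
db2evmon -path /db2/evmon/stmt | grep -iE 'SENSITIVE_TAB1|SENSITIVE_TAB2'
```

Note that a simple grep on statement text is approximate filtering: it runs after collection, so it trims the output you keep, not the cost of capturing every statement.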
Depending on your real needs, an alternative may be to:
a) for static SQL statements, query the catalog tables to get the statements;
b) for dynamic statements, run db2pd for dynamic statements once every n minutes.
In this case you have to filter and do some extra processing on the collected data, but the impact on the system will be low.
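A minimal sketch of both alternatives, assuming a database MYDB and placeholder schema/table names (none of these identifiers come from the original post):

```shell
# (a) Static SQL: find statement text in packages that depend on a
#     given table, via the SYSCAT.STATEMENTS and SYSCAT.PACKAGEDEP
#     catalog views (BTYPE = 'T' marks a table dependency).
db2 "SELECT s.pkgschema, s.pkgname, s.stmtno, s.text
     FROM syscat.statements s
     JOIN syscat.packagedep d
       ON d.pkgschema = s.pkgschema AND d.pkgname = s.pkgname
     WHERE d.btype = 'T'
       AND d.bschema = 'MYSCHEMA'
       AND d.bname IN ('TAB1', 'TAB2', 'TAB3', 'TAB4')"

# (b) Dynamic SQL: dump the dynamic statement cache every 5 minutes,
#     then keep only statements that mention the sensitive tables.
while true; do
  db2pd -db MYDB -dynamic >> /tmp/dynsql.out   # assumed output path
  sleep 300
done
grep -iE 'TAB1|TAB2|TAB3|TAB4' /tmp/dynsql.out
```

One caveat with (b): the dynamic statement cache can be flushed between samples, so statements that enter and leave the cache within the sampling interval may be missed.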
I actually need auditing info for these 4 tables. They contain very sensitive data, so I need to monitor who accesses them. I have looked into the db2audit tool, and it looks very complicated to configure (compared to the event monitor, which I am already familiar with).
Will writing to a file significantly improve performance compared to writing to a table? Has anyone measured the impact of writing monitor output to a file instead of a table?