I have a rather large database that I need to back up, and because of its size that has become rather difficult. Unfortunately I don't really know DB2 at all, which makes this even harder.
Basically the backup image is about 800GB, and they want the database backed up twice weekly with the image moved to remote storage. Right now there is a script that does the DB backup, and then I manually move the image with eseutil to our storage system.

How would you guys handle this?
What I was thinking was to make a script that does the backup, ships the image over when it's done, and then checks the copy to verify it was moved OK. I think I should be able to do this pretty well with PowerShell.

The one thing I saw in the old script was some odd handling of the transaction logs. I was able to make sense of everything else, but I don't feel comfortable that I understand how those are handled.
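Here's the rough shape I have in mind, sketched as a plain shell script for now (the real thing would be PowerShell). The db2 call, paths, and database name are all placeholders, not our actual environment:

```shell
#!/bin/sh
# Rough sketch of the backup -> ship -> verify flow. The db2 invocation,
# paths, and database name below are placeholders.
set -e

BACKUP_DIR=/local/db2backup       # where the backup image lands
REMOTE_DIR=/mnt/remote/db2backup  # remote storage mount

# 1. Take the backup. For a V7 online backup this would be roughly:
#      db2 backup database MYDB online to "$BACKUP_DIR"
#    (left commented out here since it needs a live instance)

# 2. Verify a copy by comparing checksums, so we know the image arrived
#    intact before trusting it (or deleting anything locally).
verify_copy() {
    src_sum=$(sha256sum "$1" | awk '{print $1}')
    dst_sum=$(sha256sum "$2" | awk '{print $1}')
    [ "$src_sum" = "$dst_sum" ]
}

# 3. Ship the image to remote storage, then verify the copy.
ship_and_verify() {
    img="$1"
    cp "$img" "$REMOTE_DIR/"
    verify_copy "$img" "$REMOTE_DIR/$(basename "$img")"
}
```

Does that seem like a sane approach, or am I missing something DB2-specific?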
My guess is that you are doing ONLINE backups; otherwise you would not be messing with the logs. Unfortunately, in V7 you have to deal with the DB2 logs yourself when performing an online backup, otherwise the backup image is worthless. This is to ensure that you can restore and roll forward the database if needed.
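To make that concrete, a restore from an online backup needs those archived logs to roll forward through, which is roughly this two-step sequence (database name, path, and timestamp here are made up for illustration):

```shell
# Restore the image, then replay the archived logs past the backup point.
# Without the logs, the rollforward step has nothing to apply and the
# database cannot be brought to a consistent state.
db2 restore database MYDB from /backup taken at 20080101120000
db2 rollforward database MYDB to end of logs and stop
```

So whatever your old script does with the transaction logs, it is most likely archiving them somewhere safe so that rollforward is possible later. Do not drop that part.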
There are some alternatives (although in V7 I am not sure what is supported anymore), like backing up directly to TSM or another third-party backup solution.
You should look seriously into upgrading DB2. The current version is 9.5, although to get there you will have to upgrade to V8.2 first. V7 is no longer supported by IBM and has not been for a couple of years. V8 and later give you more options with BACKUP, like compressing the image and including the logs in the image.
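For example, on V8.2 or later a single command can produce a compressed image that already contains the logs needed for rollforward, which would simplify your script a lot (database name and path are placeholders):

```shell
# COMPRESS shrinks the image; INCLUDE LOGS bundles the required logs
# into the backup image itself, so there is only one file to ship.
db2 backup database MYDB online to /backup compress include logs
```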
Sadly, DB2 is part of an application and the vendor is a touch behind on that stuff. The new version of their product goes to DB2 9 I think, and we have plans to upgrade their product, but that's a ways out, so in the meantime I have to deal with this :-/