Results 1 to 10 of 10
  1. #1
    Join Date
    Nov 2005
    Location
    Gex - France
    Posts
    15

    Unanswered: how to trigger a job after a logfile switch

    Hi all,

    I have a 10gR2 database in archive mode running on an AIX Unix system.
    Log files are archived in a local directory.
    For disaster recovery purposes, I need to copy the archived log files to a remote host as soon as Oracle has finished creating them.

    I could write a shell script that scans the directory where the archive log files are created. The issue is that it would not detect file creation as soon as it happens, but only when cron launches it.
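    For what it's worth, a minimal sketch of such a scanning script, runnable from cron or in a tight loop (the directory paths and the `.arc` suffix are assumptions, not taken from the thread):

```shell
#!/bin/sh
# One pass: copy any archived log not yet present on the remote side.
# Paths and the .arc extension are hypothetical -- adjust to your setup.
sync_archlogs() {
    src=$1   # local archive destination
    dst=$2   # remote copy target (e.g. an NFS mount of server B)
    for f in "$src"/*.arc; do
        [ -f "$f" ] || continue          # no matching files: skip
        base=`basename "$f"`
        if [ ! -f "$dst/$base" ]; then
            # copy under a temporary name so a half-written file is
            # never mistaken for a complete archived log
            cp "$f" "$dst/.$base.tmp" && mv "$dst/.$base.tmp" "$dst/$base"
        fi
    done
}

sync_archlogs "${ARCH_DIR:-/oracle/arch}" "${REMOTE_DIR:-/mnt/serverB/arch}"
```

    To tighten the detection delay beyond cron's one-minute granularity, the function could be wrapped in a `while :; do sync_archlogs ...; sleep 10; done` loop run under nohup.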

    So, I'm looking for an Oracle solution. Something like a trigger on a system table (not my preferred solution), or any kind of event generated by Oracle that can be handled by the Oracle Scheduler (preferred solution).
    In the submitted Oracle job, I would call a script that copies the file to the remote host.

    Do you have any idea on how I can implement that?

    Thanks in advance,
    Patick

  2. #2
    Join Date
    Jun 2003
    Location
    West Palm Beach, FL
    Posts
    2,713

    Talking Standby database.

    Create standby database on remote server and the logs can be shipped automatically.
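    For reference, redo shipping to a standby is driven by an archive destination parameter along these lines (a sketch; `STBY` is a hypothetical tnsnames.ora alias for the remote instance):

```sql
-- On the primary: ship archived redo to the standby service on server B.
-- STBY is an assumed TNS alias, not from the thread.
ALTER SYSTEM SET log_archive_dest_2 = 'SERVICE=STBY OPTIONAL REOPEN=60' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_state_2 = 'ENABLE' SCOPE=BOTH;
```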
    The person who says it can't be done should not interrupt the person doing it. -- Chinese proverb

  3. #3
    Join Date
    Jul 2003
    Posts
    2,296
    NFS mount the remote host's filesystem and set an additional archivelog location.
    For DR, as mentioned by LKB, you should really set up Data Guard.
    That would solve the whole situation.
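    An additional local archivelog destination on such an NFS mount can be declared along these lines (a sketch; the mount point path is an assumption):

```sql
-- Second archive destination on a hypothetical NFS mount of server B.
-- OPTIONAL keeps the database running if the mount becomes unreachable;
-- REOPEN retries the destination after 300 seconds.
ALTER SYSTEM SET log_archive_dest_2 = 'LOCATION=/mnt/serverB/arch OPTIONAL REOPEN=300' SCOPE=BOTH;
```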
    - The_Duck
    you can lead someone to something but they will never learn anything ...

  4. #4
    Join Date
    Nov 2005
    Location
    Gex - France
    Posts
    15
    Hi LKBrwn and The Duck,

    Thanks for your answers.

    I have already tested a standby database with Data Guard, and it works perfectly. But in this particular case, I'm not allowed to create a database on the remote host.
    So, NFS could be a good solution. But:
    The Duck, have you ever used the NFS solution on production systems?
    What about performance?
    I will set this log_archive_dest_n as OPTIONAL in case of an NFS/network breakdown.
    I'm wondering whether this solution is robust, since I've found a couple of bug reports about NFS in Metalink.
    Is there any risk that, because of a network bottleneck, Oracle will not be able to write some archive log files to the NFS mount point?
    And if, for any reason, an archive log file is not created on the NFS mount point, is there any way to recreate it?

    It is because of all these questions that I do not fully trust the NFS solution and am looking for another one (such as Oracle events). I hope that an alternative will give me more control over how the files are transferred, and let me make sure that each transfer completed successfully.

  5. #5
    Join Date
    Jun 2003
    Location
    West Palm Beach, FL
    Posts
    2,713

    Cool Nfs?

    If archive log is not created on NFS, you could copy it from primary location or recover from RMAN backup.

    And/Or you can use the resync command to do it.
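    A restore of missing archived logs from an RMAN backup might look like this (a sketch; the sequence range and destination path are illustrative, not from the thread):

```rman
# Restore archived logs that were lost from the NFS destination.
# Sequence numbers and the destination path are hypothetical.
RUN {
  SET ARCHIVELOG DESTINATION TO '/mnt/serverB/arch';
  RESTORE ARCHIVELOG FROM SEQUENCE 120 UNTIL SEQUENCE 125;
}
```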

  6. #6
    Join Date
    Jul 2003
    Posts
    2,296
    NFS mounting can be set up to re-mount upon startup.
    If you are having network problems, then the NFS mount will be the least of your worries
    (no one will be able to get to your database).
    I've done a standard NFS mount, which worked fine.
    I've also had admins set up NAS filesystems and present them to multiple RAC nodes so
    they are viewable and writable from all nodes, and used that as my archive destination for a RAC setup.

    BTW - do you do RMAN backups to tape or to disk?
    You could use RMAN archivelog backup in the same way to accomplish your goal(s).
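    An archivelog-only backup run of the kind suggested might look like this (a sketch; the format path is an assumption):

```rman
# Back up only the archived logs, writing the pieces to the remote disk.
# The FORMAT path is hypothetical -- point it at your backup area.
BACKUP ARCHIVELOG ALL
  FORMAT '/mnt/serverB/backup/arch_%d_%s_%p.bkp';
```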
    - The_Duck

  7. #7
    Join Date
    Nov 2005
    Location
    Gex - France
    Posts
    15
    Let me tell you a little bit more about my trouble.

    On server A:
    - There is the database
    - An RMAN backup (using the nocatalog option, i.e. no RMAN catalog database) is run once a day.
    - When done, the backup is copied to server B
    - Last X daily backups are kept on disk

    On server B:
    - The daily database backup is kept on disk, waiting for some other backups (OS, applications…).
    - Backups are then moved to tape.

    Now, let's assume that the last backup was done at midnight.
    At 5pm, a couple of Murphy's laws take out all the disks on server A.
    My customer wants to recover as much data as possible.
    But because of the disk crash, I no longer have any archived log files available for a point-in-time recovery. I can only restore the daily backup downloaded from server B, and my customer will not be happy to lose 9 hours of work.

    So, I'm wondering how to secure the archived log files between two daily backups.
    Last edited by pmo; 09-14-09 at 11:33.

  8. #8
    Join Date
    Jun 2003
    Location
    West Palm Beach, FL
    Posts
    2,713

    Cool Separation of duties er...disks?

    a) Backups should be on different disk appliances than the database.
    b) If the database disk array fails, you have the primary backups on server 'A', which should be up to date.
    c) If the backup disk array fails, the database is already up to date and should not be affected.
    d) If both the database and backup disk arrays fail, that is when you would need to use the backups on server 'B' to recover.

    To reduce the recovery gap, you need to add a backup of the archive logs, scheduled (perhaps) every 15 minutes, in order to move these archive logs to the backup disks.

    Then you can use the same script to copy the backups to Server 'B'.
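    Such a schedule could be a crontab entry driving a small RMAN command file (a sketch; the paths, script name, and hostname are assumptions; classic AIX cron does not support `*/15`, hence the explicit minute list):

```shell
# crontab on server A: back up archive logs every 15 minutes,
# then mirror the backup pieces to server B (paths are hypothetical).
0,15,30,45 * * * * rman target / nocatalog cmdfile=/home/oracle/arch_backup.rcv && scp /backup/arch/*.bkp oracle@serverB:/backup/arch/
```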

  9. #9
    Join Date
    Aug 2003
    Location
    Where the Surf Meets the Turf @Del Mar, CA
    Posts
    7,776
    Provided Answers: 1
    >So, I'm wondering how to secure the archived log files between two daily backups.
    Modify current environment as though Data Guard is being deployed.
    Specify log_archive_dest_2 to be on ServerB to have Oracle automagically ship a copy of the archived redo log file to ServerB at the same time it is archived on ServerA.
    You can lead some folks to knowledge, but you can not make them think.
    The average person thinks he's above average!
    For most folks, they don't know, what they don't know.
    Good judgement comes from experience. Experience comes from bad judgement.

  10. #10
    Join Date
    Nov 2005
    Location
    Gex - France
    Posts
    15
    @LKBrwn:
    Backing up archive log files only sounds nice to me. I will investigate.

    @Anacedent:
    Using Data Guard means that Oracle must be installed on both systems, doesn't it?
    But Oracle is not installed on system B, where I must copy my backups.
