  1. #1
    Join Date
    Sep 2006
    Posts
    17

    Unanswered: Trouble backing up all databases to a network folder using a Maintenance Plan

    I have set up a Database Maintenance Plan that does a nightly backup of all of my databases (about 12 of them) to a network folder. The plan works for about 95% of the job, but most nights at least one database fails with the following error...
    BackupDiskFile::RequestDurableMedia: failure on backup device '\\myfileserver\Backup\SQL\Database\DatabaseName\DatabaseName_db_200610081749.BAK'. Operating system error 64(The specified network name is no longer available.).
    I know that this is not a permission or storage problem, because it works for most of the job. A database that fails one night may work fine the next night, only to have a DIFFERENT database fail instead, and some nights all databases work 100%.

    Is there a way to fix this problem? And if not, is there a way to be notified of which specific database in the maintenance plan is failing? The message on the job itself is very non-descriptive, and I have to manually search the logs to find out which databases were successful and which were not. It is very time consuming. Any help with this would be greatly appreciated.

  2. #2
    Join Date
    May 2004
    Location
    Seattle
    Posts
    1,313
    It looks like a network error, not a SQL error. I have seen that error when \\myfileserver drops off the network.

    Do you have enough disk space on your SQL Server to back up to the local disk, then move the backups to the network share via xcopy? That might be faster anyway. It wouldn't get around the problem of the network going down, but at least the backups would get created...
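
    The two-step version could look something like this. Just a sketch -- the local path, the database name, and the use of xp_cmdshell for the copy are all placeholders/assumptions, not details from your setup:

        -- Step 1: back up to the local disk, so the backup itself never depends on the network
        BACKUP DATABASE MyDatabase
        TO DISK = 'D:\SQLBackups\MyDatabase_db.BAK'
        WITH INIT

        -- Step 2: copy the finished file out to the share (could also be a CmdExec job step)
        EXEC master.dbo.xp_cmdshell
            'xcopy D:\SQLBackups\MyDatabase_db.BAK \\myfileserver\Backup\SQL\Database\MyDatabase\ /Y'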

  3. #3
    Join Date
    Sep 2006
    Posts
    17
    I agree, this is definitely a network problem, and an intermittent one. As I mentioned, this is not a permission or storage problem though; the network drive has plenty of space. Unfortunately, the reason I went with the network backup in the first place is that I don't have enough free space for all of the backups to fit on my SQL Server.

    I suppose that backing up locally and then copying to the network could work if I had a script that deleted the file off the local drive right after the copy. I simply cannot store a week's worth of backups on the SQL Server, only one day or maybe two. The trouble is that I can't rely on SQL's nice handling of the older files, i.e. removing them after 7 days. And if I do this manually and mess something up, I could quickly run out of room on my database server, which would be much worse than the odd missed backup.

    If an xcopy/purge script needs to be created, I guess I'll have to work on that, but I would love to hear another idea. Even a more accurate success/fail report would probably be sufficient. Like I say, each individual database works the majority of the time; I just need to know when a particular database fails rather than only seeing that the job as a whole failed.
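
    In the meantime, for the reporting side, I'm wondering if I could just query msdb's backup history each morning to see which databases actually got a full backup overnight. Something along these lines (an untested sketch, so treat it as an idea rather than a working report):

        -- Most recent completed full backup per database, straight from msdb
        SELECT d.name AS database_name,
               MAX(b.backup_finish_date) AS last_full_backup
        FROM master.dbo.sysdatabases d
        LEFT JOIN msdb.dbo.backupset b
               ON b.database_name = d.name
              AND b.type = 'D'           -- 'D' = full database backup
        GROUP BY d.name
        ORDER BY last_full_backup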

  4. #4
    Join Date
    Jan 2004
    Location
    In a large office with bad lighting
    Posts
    1,040
    Do you have too much space occupied because of the maintenance plan characteristics? If you have a retention period set to 1, then SQL Server will ensure the current backup completes before deleting the previous one.

    If you have enough room for a single backup of each database, then create a job that writes each one to a backup device WITH INIT. Once those steps complete, you can xcopy the files off (added benefit: they keep the same file name each night). You could even add RAISERROR steps to the job to write an alert to the log if the backup of a specific database fails, and then review the log in the morning!
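
    Roughly, the per-database pieces could look like this. Only a sketch -- the device name, local path, and database name are made-up placeholders, and the RAISERROR would sit in its own job step set to run only when the backup step fails:

        -- One-time setup: a named backup device pointing at a fixed local file
        EXEC sp_addumpdevice 'disk', 'MyDatabase_dev', 'D:\SQLBackups\MyDatabase_db.BAK'

        -- Job step: overwrite last night's backup on that device
        BACKUP DATABASE MyDatabase TO MyDatabase_dev WITH INIT

        -- Separate failure step (run only if the backup step fails): flag it in the error log
        RAISERROR ('Nightly backup of MyDatabase failed', 16, 1) WITH LOG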

    -- This is all just a Figment of my Imagination --

  5. #5
    Join Date
    Jun 2003
    Location
    Ohio
    Posts
    12,592
    Provided Answers: 1
    You should not be backing up directly to a network device. As you have discovered, this adds a new point of failure to the process.
    Instead, back up to a local drive and then copy the files to the network. That way you still have recoverability even if your network fails.
    If it's not practically useful, then it's practically useless.

    blindman
    www.chess.com: "sqlblindman"
    www.LobsterShot.blogspot.com

  6. #6
    Join Date
    Sep 2006
    Posts
    17
    Thanks for the suggestions guys. You're absolutely right.

    I'm in the process of building a more reliable testing server. When I've got that set up, I will try this out. I may have more specific questions at that time regarding the syntax of this script.
