  1. #1
    Join Date
    Aug 2003
    Location
    West
    Posts
    101

Unanswered: Very large db backup.

I'm looking for ideas on backup and FAST restore of a 120 GB database. The data itself is fairly static: 99% of it changes only on a quarterly basis, and then through a DTS bulk load. The rest is a user activity table that is added to constantly - that part I'm not so worried about. I can run a complete backup quarterly, and I want to know the FASTEST way to recover the system. Should I detach the db and make compressed copies of the database files after each quarterly load?

    The database has over 50 tables; each table typically has over 15 indexes and between 2.5 and 14.5 million rows. Using DTS to copy all objects to a networked server takes 22 hours. Unfortunately that "spare" server can't just sit there and hold the data - I need the space for something else. I'd rather not use tape.
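    For reference, the detach/compress idea sketched above boils down to something like this (database and file names here are placeholders, not from the actual system):

    ```sql
    -- Detach the database so the files can be copied/compressed at the OS level.
    -- No connections can be open while this runs.
    EXEC sp_detach_db 'BigDb';

    -- ... copy and compress BigDb.mdf / BigDb_log.ldf to backup storage here ...

    -- Reattach the original files to bring the database back online.
    EXEC sp_attach_db 'BigDb',
         'E:\data\BigDb.mdf',
         'E:\data\BigDb_log.ldf';
    ```

    Restore is then just decompressing the saved copies over the data files and running sp_attach_db again, which is limited mostly by disk copy speed.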

  2. #2
    Join Date
    Oct 2001
    Location
    England
    Posts
    426
Since you have a lot of static data, you should maybe look at filegroups.
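    To make the suggestion concrete: the idea is to move the static tables onto their own filegroup, back that filegroup up once per quarterly load, and back up only the small, busy part frequently. A rough sketch, with invented names (BigDb, StaticFG, paths) standing in for the real ones:

    ```sql
    -- One-time setup: a separate filegroup and file for the static 99%.
    ALTER DATABASE BigDb ADD FILEGROUP StaticFG;
    ALTER DATABASE BigDb ADD FILE
        (NAME = StaticData1,
         FILENAME = 'E:\data\BigDb_static1.ndf',
         SIZE = 100000MB)
        TO FILEGROUP StaticFG;

    -- After each quarterly DTS load, back up the static filegroup once:
    BACKUP DATABASE BigDb FILEGROUP = 'StaticFG'
        TO DISK = 'F:\backup\BigDb_static.bak';

    -- Day to day, only the primary filegroup (user activity) and the log
    -- need backing up - these are small and fast:
    BACKUP DATABASE BigDb FILEGROUP = 'PRIMARY'
        TO DISK = 'F:\backup\BigDb_primary.bak';
    BACKUP LOG BigDb TO DISK = 'F:\backup\BigDb_log.trn';
    ```

    Note that filegroup restores require the database to be in full (or bulk-logged) recovery with log backups available, so the log-backup step isn't optional with this approach.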

  3. #3
    Join Date
    Aug 2003
    Location
    West
    Posts
    101
    Yes, that's a good idea....

  4. #4
    Join Date
    Dec 2002
    Posts
    63
    There are a couple of good solutions to your problem; however, all of them tend to cost a little $$$. First, you should check out www.netapp.com They have a snapshot technology that can "snap" a drive and then restore that snapshot in under 5 minutes, regardless of how much data is there. Their solution, however, runs into the hundreds of thousands of dollars.

    Another solution is log shipping. You keep your primary live database on one server and a copy of that database in read-only mode on another server, then log ship between the two, keeping the "warm" backup behind the primary. (We run our dumps every 15 minutes and keep our backup db 1 hour behind.) This way, if your main database goes bad or the hardware fails, you just apply the remaining logs on the backup and then bring it online. This method works well for most types of disasters. While you incur the cost of a second server, if you are serious about a speedy disaster recovery, the price tag is worth it. We have a customized version of log shipping - email me if you are interested.
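    The loop described above can be done by hand with plain BACKUP/RESTORE; a minimal sketch, with made-up server and file names (the 1-hour lag just means the standby applies files an hour older than the newest one):

    ```sql
    -- On the primary, every 15 minutes: dump the log to a share the
    -- standby can see.
    BACKUP LOG BigDb TO DISK = '\\standby\ship\BigDb_1500.trn';

    -- On the standby: apply shipped logs, staying an hour behind.
    -- WITH STANDBY keeps the copy readable (read-only) between restores.
    RESTORE LOG BigDb FROM DISK = 'D:\ship\BigDb_1400.trn'
        WITH STANDBY = 'D:\ship\BigDb_undo.dat';

    -- Disaster on the primary: apply whatever logs remain, then recover
    -- to bring the standby fully online as the new live database.
    RESTORE LOG BigDb FROM DISK = 'D:\ship\BigDb_1500.trn'
        WITH RECOVERY;
    ```

    The deliberate lag is what protects you from a bad transaction (e.g. an accidental mass delete): you get an hour to notice before it reaches the standby.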

  5. #5
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    You can also check out SQLLiteSpeed.
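    For context, SQLLiteSpeed does compressed backups through extended stored procedures rather than the native BACKUP command, which cuts both backup size and restore time. From memory the calls look roughly like this - exact procedure and parameter names may differ by version, so treat this as a sketch:

    ```sql
    -- Compressed backup via LiteSpeed's extended stored procedure
    -- (names/paths are illustrative):
    EXEC master.dbo.xp_backup_database
         @database = 'BigDb',
         @filename = 'F:\backup\BigDb_ls.bak';

    -- Matching compressed restore:
    EXEC master.dbo.xp_restore_database
         @database = 'BigDb',
         @filename = 'F:\backup\BigDb_ls.bak';
    ```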
