  1. #1
    Join Date
    Nov 2004
    Location
    Canada
    Posts
    58

    Unanswered: Deployment Thoughts

    Thoughts and recommendations please. Maybe I am missing something huge, and someone can nudge me in the right direction.

    From time to time I need to enhance, clean up, or revise existing data within SQL Server for a project or enhancement request, or I need to build new features onto an existing app that could change backend data.

    For this I need to back up a couple of tables and their data in case I need to restore or analyze them later. I don't want to back up the entire database for only a couple of tables.

    I know that I can create a replication snapshot, and I can write a SQL script to back up specific tables and their data. I also know I can take a full SQL backup of my database. What I cannot do is restore single tables from a database backup. When will that be available, or is it already and I have just been doing this the hard way? When a database is huge, I don't want to take a backup and have to restore the entire database just for a couple of tables, then hunt around for a server with enough space to do the restore.


    Currently, for each project I create a package or script that copies the tables I need into new tables with a timestamp in the name, e.g. backup_mytable_20090518. That way I have the data on hand in case I need to put something back in place quickly if my updates/enhancements change something they shouldn't. I do this mostly as a precaution; luckily my deployments normally go pretty smoothly and I have never really had to do a restore, but I am glad I have the data on hand, because end users often believe the data was different than it actually was, and I have to pull up the backed-up copy so they can see for themselves that it always looked that way and the new code did not change it. Hopefully you know what I mean.
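    A minimal T-SQL sketch of that copy step (the table name and date suffix are just examples, not a real script):

    -- copy the table and its data into a timestamped backup table
    SELECT *
    INTO dbo.backup_mytable_20090518
    FROM dbo.mytable;

    -- once the deployment has proven stable, drop the copy
    -- DROP TABLE dbo.backup_mytable_20090518;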

    It is somewhat frustrating, so I ask the DBA gods out there: is there a fast method of doing this? I work on many projects and applications that are not always related, and it would be slick if I had a tool that looked at the procedure and table scripts in SourceSafe that were modified in the last x days and automatically took a backup of the affected tables before I run them on the deployment server.

    For example, say I have two tables that I am modifying, one new procedure, and one existing procedure. Based on sp_depends, the tool identifies all of the tables that could be affected. As part of my deploy it looks at these identified tables and automatically creates a list of backup table names in the form "Backup" + table + datetime. From there I would check off the tables that I do want backed up and give a drop time, i.e. x (say 2) days until the backup tables are dropped. Then it backs them up, and then I execute my deploy scripts for the new modifications. Voila. Two days later, when there are no issues, it automatically drops the backed-up tables. (The x days could also line up with month end so the backups make it into the month-end backup, etc.) P.S. This tool would also check space availability and clean up when necessary.
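    Something like this is what I picture for the sp_depends piece (just a rough sketch; the procedure name, "Backup_" prefix, and date-suffix format are made up for the example):

    -- assume dbo.my_changed_procedure is one of the procs being deployed
    DECLARE @suffix CHAR(8);
    SET @suffix = CONVERT(CHAR(8), GETDATE(), 112);   -- e.g. 20090518

    -- capture sp_depends output for a stored procedure (5-column result set)
    CREATE TABLE #deps (
        name     NVARCHAR(257),
        type     NVARCHAR(60),
        updated  VARCHAR(7) NULL,
        selected VARCHAR(7) NULL,
        [column] SYSNAME NULL
    );

    INSERT INTO #deps EXEC sp_depends 'dbo.my_changed_procedure';

    -- generate a backup statement for every user table the proc touches
    SELECT DISTINCT
        'SELECT * INTO dbo.[Backup_' + PARSENAME(name, 1) + '_' + @suffix
        + '] FROM ' + name + ';' AS backup_statement
    FROM #deps
    WHERE type = 'user table';

    DROP TABLE #deps;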


    Our test/stage and development servers never match our production environments because of multiple projects, limitations on server space, and the fact that no one wants to take on the sync project, so some data issues are nearly impossible to find until they hit production.

    Should I create my own application for this or does someone know of something out there?

    Replication is not available on some servers by the way.
    Last edited by burkular; 05-18-09 at 13:00.

  2. #2
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    The ability to restore individual tables was included with SQL Server 6.5, then removed in either SQL 7 or SQL 2000.

    I do restores of individual tables in both SQL 2005 and SQL 2008 using LightSpeed (a product that I love). LightSpeed isn't terribly cheap, but it is wickedly effective and I consider it a "must have" tool for large or complex database environments.

    With some planning, you can create filegroups that segregate the table or tables of interest, and back up that filegroup to effectively back up those tables separately. This isn't the same thing as being able to restore a given table or tables from a full database backup, but it is certainly better than nothing and it doesn't cost anything beyond the cost of your copy of SQL Server.
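    A rough sketch of what I mean (the database name, file path, and table definition are placeholders, not recommendations for your layout):

    -- add a filegroup and a data file to hold the tables of interest
    ALTER DATABASE MyDatabase ADD FILEGROUP FG_KeyTables;
    ALTER DATABASE MyDatabase
        ADD FILE (NAME = 'KeyTablesData',
                  FILENAME = 'D:\Data\MyDatabase_KeyTables.ndf',
                  SIZE = 100MB)
        TO FILEGROUP FG_KeyTables;

    -- create (or rebuild) the table on that filegroup
    CREATE TABLE dbo.mytable (
        id      INT           NOT NULL PRIMARY KEY,
        payload NVARCHAR(100) NULL
    ) ON FG_KeyTables;

    -- back up just that filegroup instead of the whole database
    BACKUP DATABASE MyDatabase
        FILEGROUP = 'FG_KeyTables'
        TO DISK = 'D:\Backups\MyDatabase_FG_KeyTables.bak';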

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  3. #3
    Join Date
    Nov 2004
    Location
    Canada
    Posts
    58
    I have used LightSpeed in our test environments, but we never promoted it to our production servers because we ran across some issues with never-ending jobs, where we had to physically shut down SQL Agent to end the runaway jobs because no kill process would stop them. We never could figure out why certain maintenance plans in our test environment reacted that way. Regardless, we could not take the risk in our production environments. Thanks for the reply nonetheless. I have also worked with individual filegroups for certain tables; our larger tables definitely sit within their own filegroups, which makes life easier in some respects.
