
Thread: finetuning

  1. #1
    Join Date
    Jan 2009
    Posts
    4

    Unanswered: finetuning

    I am currently migrating data from one database table to another database, and I need to validate each row during the migration (both databases are on the same server).


    One table has around 5,715,371 records, and I have to check each and every record.



    I have tried the following three methods:

    1) A cursor: pick up all the columns in a cursor and process row by row.

    2) Set rowcount to 1 and process the rows one at a time.

    3) Bulk-copy around 5,000 records into a temp table each time and process the batch (fastest so far).

    Can anyone suggest how to fine-tune this?
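
    For reference, method 3 currently looks roughly like this (Sybase-style T-SQL; src_table, dest_table, dest_db, and the key column id are stand-ins for my real names, and this assumes a reasonably dense integer key):

        -- empty shell with the source structure
        select * into #batch from src_table where 1 = 0
        go

        declare @lo int, @max int
        select @lo = min(id), @max = max(id) from src_table
        while @lo <= @max
        begin
            truncate table #batch
            insert into #batch
            select * from src_table where id between @lo and @lo + 4999
            -- validate / fix the rows here, then move the batch across
            insert into dest_db..dest_table select * from #batch
            select @lo = @lo + 5000
        end
        go

        drop table #batch
        go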

  2. #2
    Join Date
    Mar 2007
    Posts
    72
    Hi,
    What do you mean by "validate each row"? Do you have to select each row based on some criteria? If so, why don't you create a view with the criteria, then bcp out and bcp into the new table? Cursors can be slow at the best of times, and for a table with 5 million plus rows, forget it.
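
    Something along these lines, as a rough sketch (the view name, columns, criteria, and bcp login details are all made up):

        -- hypothetical view that applies the validation criteria up front
        create view v_valid_rows
        as
        select col1, col2, col3
        from src_table
        where col2 is not null        -- stand-in for the real criteria
        go

        -- then from the OS prompt:
        -- bcp srcdb..v_valid_rows out rows.dat -c -Usa -Sserver
        -- bcp destdb..new_table in rows.dat -c -Usa -Sserver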

    suda

  3. #3
    Join Date
    Jan 2009
    Posts
    4

    explanation

    I need to check a value; for example, if the value is 3000 then I need to set it to 10. I am trying to avoid doing this one row at a time. Which is the best method?
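
    In cursor form it is roughly like this at the moment (Sybase ASE syntax; all table and column names are simplified stand-ins):

        declare row_cur cursor
        for select id, some_col from src_table
        go

        declare @key int, @val int
        open row_cur
        fetch row_cur into @key, @val
        while @@sqlstatus = 0
        begin
            if @val = 3000
                select @val = 10          -- the remapping rule
            insert into dest_table values (@key, @val)
            fetch row_cur into @key, @val
        end
        close row_cur
        deallocate cursor row_cur
        go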

  4. #4
    Join Date
    Jun 2007
    Location
    London
    Posts
    2,527
    Quote Originally Posted by salilkol
    I need to check a value; for example, if the value is 3000 then I need to set it to 10. I am trying to avoid doing this one row at a time. Which is the best method?
    Why are you changing the value 3000 to 10? Is it just that the FK ids in the new database are different? Or do they not exist for some items and need to be created? Why were you trying to do things one record at a time with the cursor rather than doing all the records at once with a single update/insert?

    You have to produce some concrete examples that make sense; otherwise we are just playing a guessing game trying to work out what you are trying to do, and any solution you are given will be questionable.
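
    For instance, if the old-to-new mapping can be expressed as a table, one set-based statement replaces the whole cursor loop. A sketch, with every name made up (old_new_map is a hypothetical mapping table):

        -- remap the id via a join in a single insert, no cursor needed
        insert into dest_db..dest_table (id, col1, col2)
        select m.new_id, s.col1, s.col2
        from src_table s, old_new_map m
        where m.old_id = s.id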

  5. #5
    Join Date
    Jan 2009
    Posts
    4

    explanation

    OK, in short, the scenario is as follows: I am getting a flat file with a huge number of records, and I have to map each record into my database. The problem I am facing is that my field is a smallint (it is part of the primary key), and the incoming values are too big for it.
    Also, this file has a foreign-key relation to another file, so I have to track every change I make to the first table in order to map the other table the same way.
    Sample:
    File 1 coming in:
    A|345678|1| XXXX| XXXX|
    A|345678|2| XXXX| XXXX|
    File 2 (foreign-key relation):
    A|345678| YY|
    A|345678| NN|

    I have to convert this into:
    A|1|1| XXXX| XXXX|
    A|2|2| XXXX| XXXX|

    File 2:

    A|1| YY|
    A|2| NN|
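
    What I think I need is to build the old-to-new key map once, with an identity column, and then apply the same map to both files so the foreign-key relation survives. A sketch (Sybase-style T-SQL; every table and column name is a stand-in, and the real map may need more key columns than shown):

        create table key_map (
            new_id numeric(9,0) identity,   -- the new, smallint-sized key
            old_id int not null
        )
        go

        insert into key_map (old_id)
        select distinct big_key from stage_file1
        go

        -- remap file 1
        insert into final_file1 (prefix, new_key, seq, c1, c2)
        select f.prefix, m.new_id, f.seq, f.c1, f.c2
        from stage_file1 f, key_map m
        where m.old_id = f.big_key

        -- remap file 2 with the identical map
        insert into final_file2 (prefix, new_key, flag)
        select f.prefix, m.new_id, f.flag
        from stage_file2 f, key_map m
        where m.old_id = f.big_key
        go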

  6. #6
    Join Date
    Jun 2007
    Location
    London
    Posts
    2,527
    I think I'd just:
    • Create copies of the tables on the new server.
    • BCP all the data into both tables.
    • Create the necessary indexes on both of these tables.
    • Set up a loop to insert data from the new table into the existing table, using a join to get the new id. Only insert a set range of the old id at a time, i.e. 0-9999, then 10000-19999, etc.
    • Print out which id range I've just completed and how many are left to go.
    • Wait, say, 5 seconds to let any other waiting processes complete.
    • Repeat.

    I'd use the loop mechanism to avoid locking out your main table and to avoid filling up the logs. You'd want to tune the range used on each pass and the wait between batches.
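
    In rough Sybase-style T-SQL, the loop would look something like this (every table and column name here is illustrative, not from your schema):

        declare @lo int, @hi int, @max int, @msg varchar(80)
        select @lo = 0
        select @max = max(old_id) from new_table

        while @lo <= @max
        begin
            select @hi = @lo + 9999

            -- one range of old ids per pass, remapped via the join
            insert into existing_table (id, col1)
            select m.new_id, n.col1
            from new_table n, key_map m
            where m.old_id = n.old_id
              and n.old_id between @lo and @hi

            -- progress report
            select @msg = 'completed ids ' + convert(varchar, @lo)
                        + ' to ' + convert(varchar, @hi)
            print @msg

            -- breathing room for other processes
            waitfor delay '00:00:05'

            select @lo = @hi + 1
        end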

  7. #7
    Join Date
    Jan 2009
    Posts
    4

    thanks

    Thanks for the suggestion. I will try to implement the logic you have suggested.
