
Thread: Massive Inserts

  1. #1
    Join Date
    Dec 2002
    Posts
    45

    Unanswered: Massive Inserts

    Currently I have huge amounts of data going into a table.
    I'm sending an XML doc and using OPENXML with a cursor to seed the rows.

    The question I have is whether to let duplicate-keyed rows bounce,
    then check @@ERROR and do an update on the non-keyed field,
    or
    to do a select on the keyed field and then do an insert or update based on the
    select's results.

    Speed is my goal.
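
    For clarity, here are the two options sketched in T-SQL. The table dbo.Target and the columns KeyCol and DataCol are placeholder names, not my real schema:

        DECLARE @KeyCol INT, @DataCol VARCHAR(100)
        -- ... @KeyCol and @DataCol come from the current XML row ...

        -- Option 1: let the duplicate key bounce, then check @@ERROR
        -- (2627 is SQL Server's primary key violation error number)
        INSERT INTO dbo.Target (KeyCol, DataCol)
        VALUES (@KeyCol, @DataCol)
        IF @@ERROR = 2627
            UPDATE dbo.Target
            SET DataCol = @DataCol
            WHERE KeyCol = @KeyCol

        -- Option 2: select on the keyed field first, then insert or update
        IF EXISTS (SELECT 1 FROM dbo.Target WHERE KeyCol = @KeyCol)
            UPDATE dbo.Target
            SET DataCol = @DataCol
            WHERE KeyCol = @KeyCol
        ELSE
            INSERT INTO dbo.Target (KeyCol, DataCol)
            VALUES (@KeyCol, @DataCol)

    I figure option 1 saves a read when most rows are new, while option 2 avoids raising errors when most rows already exist.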

  2. #2
    Join Date
    May 2003
    Location
    UK
    Posts
    220
    Howdy


    I'd be careful about using a cursor - especially if you are doing LOTS of inserts - cursors put a LARGE overhead on the server.

    That said, I usually find it's easier to dump everything into a temp table (with an ID column on it) and sort everything there, then do one huge insert.

    This keeps the database from being hammered hard while it's doing its thing. Also, the temp tables can be discarded faster than running a cursor.
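
    Something like this, as a sketch only - the table and column names here are invented:

        -- sketch only: #Staging, dbo.Target, KeyCol, DataCol are invented names
        CREATE TABLE #Staging
        (
            ID      INT IDENTITY(1,1),   -- the ID column mentioned above
            KeyCol  INT,
            DataCol VARCHAR(100)
        )

        -- ... load #Staging however you like ...

        -- then one big set-based insert; the ORDER BY is the
        -- "sort everything there" step
        INSERT INTO dbo.Target (KeyCol, DataCol)
        SELECT KeyCol, DataCol
        FROM #Staging
        ORDER BY KeyCol

        DROP TABLE #Staging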

    Cheers,

  3. #3
    Join Date
    Dec 2002
    Posts
    45

    ok..

    Well, with that said, I'm not sure how to get the data into the database any more efficiently than by using a cursor (keep in mind this is not a cursor reading a data table; it's reading an XML document with OPENXML). I pass in a large string (XML) which has about 800 rows that need to be inserted. The cursor loops through the XML doc and does 800 inserts into a partitioned view.

    With that said, are you recommending I do the 800 inserts into a temp table with the same partitioning and then do a select into from the temp table to the target tables? I was planning on using selects with uncommitted reads to negate any locking issues. Let me know if you feel I should still go to a temp table first.
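
    In case it helps, here's roughly what the cursor looks like, with my real element and column names swapped for placeholders (assume attribute-centric XML like <root><row key="..." data="..."/></root>):

        DECLARE @xml VARCHAR(8000)        -- the large XML string passed in
        DECLARE @KeyCol INT, @DataCol VARCHAR(100)
        DECLARE @hDoc INT

        EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

        DECLARE xmlCursor CURSOR FAST_FORWARD FOR
            SELECT KeyCol, DataCol
            FROM OPENXML(@hDoc, '/root/row', 1)       -- 1 = attribute-centric
                 WITH (KeyCol  INT          '@key',
                       DataCol VARCHAR(100) '@data')

        OPEN xmlCursor
        FETCH NEXT FROM xmlCursor INTO @KeyCol, @DataCol
        WHILE @@FETCH_STATUS = 0
        BEGIN
            INSERT INTO dbo.PartitionedView (KeyCol, DataCol)
            VALUES (@KeyCol, @DataCol)
            FETCH NEXT FROM xmlCursor INTO @KeyCol, @DataCol
        END
        CLOSE xmlCursor
        DEALLOCATE xmlCursor

        EXEC sp_xml_removedocument @hDoc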

  4. #4
    Join Date
    May 2003
    Location
    UK
    Posts
    220
    If you are inserting into a partitioned view, that's probably speedy enough, and while cursors usually aren't the best solution, if you are looping through 800 rows of data (and I'm assuming you are just breaking one long XML string into approximately 800 rows) then it's probably as good as any.

    However - my preference for temp tables is based on the fact that they are usually faster than cursors, and it means you can run multiple simultaneous instances of what you are doing with temp tables, rather than multiple simultaneous cursors (yuck).
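
    As a sketch of what I mean (reusing the placeholder names from your post - and note the OPENXML result can be selected set-based, so the temp table can be loaded without the cursor loop):

        DECLARE @xml VARCHAR(8000), @hDoc INT
        EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

        -- shred the XML straight into a temp table, no cursor
        SELECT KeyCol, DataCol
        INTO #Staging
        FROM OPENXML(@hDoc, '/root/row', 1)
             WITH (KeyCol  INT          '@key',
                   DataCol VARCHAR(100) '@data')

        EXEC sp_xml_removedocument @hDoc

        -- then one set-based insert into the partitioned view
        INSERT INTO dbo.PartitionedView (KeyCol, DataCol)
        SELECT KeyCol, DataCol
        FROM #Staging

        DROP TABLE #Staging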


    Cheers,
