Currently I have huge amounts of data going into a table.
I'm sending an XML document and using OPENXML with a cursor to seed the rows.
The question I have is whether to let duplicate-keyed rows bounce, check @@ERROR, and then do an update on the non-keyed fields, or to do a select on the keyed field first and then do an insert or update based on the result.
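The two duplicate-handling options could be sketched roughly like this (table and column names `dbo.Target`, `KeyCol`, `DataCol` are just placeholders, not from the original post):

```sql
-- Option A: attempt the insert, let a duplicate key bounce, then update instead.
-- Note @@ERROR must be checked immediately after the statement.
INSERT INTO dbo.Target (KeyCol, DataCol) VALUES (@Key, @Data)
IF @@ERROR <> 0
    UPDATE dbo.Target SET DataCol = @Data WHERE KeyCol = @Key

-- Option B: check for the key first, then choose insert or update.
IF EXISTS (SELECT 1 FROM dbo.Target WHERE KeyCol = @Key)
    UPDATE dbo.Target SET DataCol = @Data WHERE KeyCol = @Key
ELSE
    INSERT INTO dbo.Target (KeyCol, DataCol) VALUES (@Key, @Data)
```

Option A saves a read when most rows are new; option B avoids relying on a constraint violation as control flow.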
Well, with that said, I'm not sure how to get the data into the database any more efficiently than using a cursor (keep in mind this is not a cursor reading a datatable; it's reading an XML document with OPENXML). I pass in a large string (XML) which has about 800 rows that need to be inserted. The cursor loops through the XML doc and does 800 inserts into a partitioned view. With that said, are you recommending I do the 800 inserts into a temp table with the same partitioning and then do a SELECT INTO from the temp table to the target tables? I was planning on using selects with uncommitted reads to negate any locking issues. Let me know if you feel I should still go to a temp table first.
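For reference, OPENXML can also feed a set-based insert directly, which would replace the 800 cursor iterations with a single statement (a sketch only; the element path and column names are assumptions, not from the original post):

```sql
DECLARE @hDoc int
EXEC sp_xml_preparedocument @hDoc OUTPUT, @XmlString

-- One set-based insert of all ~800 rows; no cursor needed.
-- The flag value 2 means element-centric mapping.
INSERT INTO dbo.PartitionedView (KeyCol, DataCol)
SELECT KeyCol, DataCol
FROM OPENXML(@hDoc, '/root/row', 2)
     WITH (KeyCol int, DataCol varchar(100))

EXEC sp_xml_removedocument @hDoc
```

Remember to call sp_xml_removedocument, since the parsed document otherwise stays in memory for the life of the connection.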
If you are inserting into a partitioned view, that's probably speedy enough, and while cursors usually aren't the best solution, if you are looping through 800 rows of data (and I'm assuming you are just breaking one long XML string into approximately 800 rows), then it's probably as good as any.
However, my preference for temp tables is based on the fact that they are usually faster than cursors, and it means you can run multiple simultaneous instances of what you are doing using temp tables, rather than multiple simultaneous cursors (yuck).
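The temp-table staging idea could look something like this (a sketch under assumed names; it also folds in the insert-or-update question by doing both set-based):

```sql
-- Shred the XML into a temp table once...
SELECT KeyCol, DataCol
INTO   #Staging
FROM   OPENXML(@hDoc, '/root/row', 2)
       WITH (KeyCol int, DataCol varchar(100))

-- ...update rows whose key already exists in the target...
UPDATE t
SET    t.DataCol = s.DataCol
FROM   dbo.Target t
JOIN   #Staging s ON t.KeyCol = s.KeyCol

-- ...then insert the remainder.
INSERT INTO dbo.Target (KeyCol, DataCol)
SELECT s.KeyCol, s.DataCol
FROM   #Staging s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)

DROP TABLE #Staging
```

Because each connection gets its own #Staging, several callers can run this concurrently without stepping on one another, which is the multiple-instances point above.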