  1. #1
    Join Date
    Dec 2007

    More Efficient Inserting of Bulk Data

    I've got a program that takes data from a bunch of different sources (literally 10 sources across 3 completely different databases), runs some long, ugly calculations, and spits out a simple summary of all the data. Right now I'm dumping each summary row into a temp table in SQL Server so that when I finish I can easily upsert the data (update existing rows that have changed, insert rows that don't exist yet) with simple joins, roughly like the sketch below.
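    For reference, the upsert at the end looks roughly like this (I've simplified the table and column names, so treat them as placeholders):

    Code:
    -- Final step, simplified - real table/column names differ:
    UPDATE t
    SET    t.Total = s.Total
    FROM   dbo.Summary     AS t
    JOIN   #SummaryStaging AS s ON s.SummaryKey = t.SummaryKey
    WHERE  t.Total <> s.Total;

    INSERT dbo.Summary (SummaryKey, Total)
    SELECT s.SummaryKey, s.Total
    FROM   #SummaryStaging AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Summary AS t
                       WHERE t.SummaryKey = s.SummaryKey);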

    The problem is that there's a whole lot of data being dumped, and there's a clear CPU hit on the machine hosting SQL Server while this program runs. I suspect it's because I'm doing tens of thousands of individual INSERTs into the temp table: I do all the calculations first, then loop through the results and INSERT them one at a time. Is there a more efficient way to dump this much data? For example, could I write the data to a file instead and then load the whole file at once? I don't mind taking a performance hit in the program itself if that's what it takes to reduce the load on SQL Server.

    Thanks for any help that you can offer.

  2. #2
    Join Date
    Feb 2004
    One Flump in One Place
    Using a stream to write the data out to a file and then BCPing it in might be better.
    You can also use the SqlBulkCopy class (I think that's the name) - it takes an IDataReader as input, so you can stream rows into it.
    Another option is to pass a table-valued parameter to a sproc (if this is 2008). Rough sketches of a couple of these below.

    I'm afraid you'd have to suck it and see - I couldn't say up front which would be best for your workload.
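    For the bcp route, once the file is written the load is a one-liner from the command prompt (server, database and table names here are made up):

    Code:
    bcp MyDb.dbo.SummaryStaging in C:\loads\summary.dat -c -t, -S MyServer -T

    And a rough, untested sketch of the table-valued parameter option (2008 only) - the type, sproc and column names are placeholders for whatever your schema actually uses:

    Code:
    -- One-off setup: a table type describing a summary row.
    CREATE TYPE dbo.SummaryRowType AS TABLE
    (
        SourceId   int           NOT NULL,
        SummaryDay date          NOT NULL,
        Total      decimal(18,2) NOT NULL,
        PRIMARY KEY (SourceId, SummaryDay)
    );
    GO

    -- The app passes all rows in one call; the sproc does the upsert set-based.
    CREATE PROCEDURE dbo.UpsertSummaries
        @rows dbo.SummaryRowType READONLY  -- TVP parameters must be READONLY
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Update existing rows whose values have changed.
        UPDATE s
        SET    s.Total = r.Total
        FROM   dbo.Summary AS s
        JOIN   @rows       AS r
          ON   r.SourceId = s.SourceId AND r.SummaryDay = s.SummaryDay
        WHERE  s.Total <> r.Total;

        -- Insert rows that don't exist yet.
        INSERT dbo.Summary (SourceId, SummaryDay, Total)
        SELECT r.SourceId, r.SummaryDay, r.Total
        FROM   @rows AS r
        WHERE  NOT EXISTS (SELECT 1 FROM dbo.Summary AS s
                           WHERE s.SourceId = r.SourceId
                             AND s.SummaryDay = r.SummaryDay);
    END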
    pootle flump

  3. #3
    Join Date
    Feb 2005
    Sorry, I'm a newb to MS technologies, but couldn't you output to a file and then BULK INSERT it into the table? Something along these lines:
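    Code:
    -- File path and table name are made up; note the file must be
    -- readable by the SQL Server service account, not just the client.
    BULK INSERT dbo.SummaryStaging
    FROM 'C:\loads\summary.csv'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK          -- helps SQL Server minimize logging on the load
    );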
