I've got a VB.NET program that takes data from a bunch of different sources (literally 10 different sources in 3 completely different databases), runs some long, ugly calculations and spits out a simple summary of all the data. Right now I'm dumping each summary row into a temp table in SQL Server so that when I finish I can easily upsert the data (update existing rows that have changed and insert rows that don't exist yet) with simple joins.
The problem is that there is a whole lot of data being dumped, and there's a clear CPU hit on the machine hosting SQL Server while this program runs. I suspect it's because I'm doing tens of thousands of individual inserts into the temp table. Basically I'm doing all the calculations first, then looping through the results and INSERTing them one at a time. Is there a more efficient way to dump this much data? For example, is there a way to write the data to a file and then insert everything in the file at once? I don't mind taking a performance hit in the program itself if that's what it takes to reduce the load on SQL Server.
Using a stream to write the rows out to a file and then BCPing the data in might be better.
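A rough sketch of that approach, assuming a tab-delimited file and placeholder database/table/server names:

```vb
' Sketch: stream the summary rows out to a tab-delimited file, then load
' the whole file in one shot with the bcp utility. The file path, table
' name and server name below are placeholders, not your actual names.
Imports System.Collections.Generic
Imports System.IO

Module BcpFileExample
    Sub DumpToFile(ByVal rows As IEnumerable(Of String()), ByVal path As String)
        Using writer As New StreamWriter(path)
            For Each fields As String() In rows
                writer.WriteLine(String.Join(vbTab, fields))
            Next
        End Using
    End Sub
End Module

' Then, from a command prompt (bcp ships with SQL Server):
'   bcp MyDb.dbo.SummaryStaging in C:\temp\summary.txt -c -T -S MYSERVER
```

The `-c` switch tells bcp to treat the file as character data (tab-delimited by default), and `-T` uses a trusted connection. The server does one bulk load instead of tens of thousands of individual inserts.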
You can also use the SqlBulkCopy class (in System.Data.SqlClient), which takes a DataTable or an IDataReader as its input — that lets you push everything across in one bulk operation without an intermediate file.
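A minimal SqlBulkCopy sketch — the connection string, staging table name and batch size are assumptions you'd adapt:

```vb
' Sketch: bulk-copy a DataTable of summary rows into the staging table
' in one operation. "dbo.SummaryStaging" and the batch size are
' placeholders for illustration.
Imports System.Data
Imports System.Data.SqlClient

Module BulkCopyExample
    Sub WriteSummary(ByVal summary As DataTable, ByVal connStr As String)
        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using bulk As New SqlBulkCopy(conn)
                bulk.DestinationTableName = "dbo.SummaryStaging"
                bulk.BatchSize = 5000        ' commit rows in batches rather than all at once
                bulk.BulkCopyTimeout = 0     ' don't time out on a large load
                bulk.WriteToServer(summary)  ' one bulk operation instead of N inserts
            End Using
        End Using
    End Sub
End Module
```

If your calculation loop already produces rows one at a time, you can instead hand WriteToServer an IDataReader and stream rows through without materialising the whole DataTable.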
Another option is to use a table-valued parameter for a stored procedure (if this is SQL Server 2008 or later) — you pass all the rows to the proc in a single call and do the upsert server-side.
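A sketch of the TVP route — the type, procedure, and column names here are all placeholders, and the MERGE matches the upsert described in the question:

```vb
' Sketch (SQL Server 2008+): send every summary row to a stored procedure
' in one call via a table-valued parameter, and let the proc do the upsert.
'
' -- One-time T-SQL setup (placeholder names and columns):
' CREATE TYPE dbo.SummaryRowType AS TABLE (KeyCol INT, Total DECIMAL(18,2));
' GO
' CREATE PROCEDURE dbo.UpsertSummary @rows dbo.SummaryRowType READONLY AS
'   MERGE dbo.Summary AS t
'   USING @rows AS s ON t.KeyCol = s.KeyCol
'   WHEN MATCHED AND t.Total <> s.Total THEN UPDATE SET Total = s.Total
'   WHEN NOT MATCHED THEN INSERT (KeyCol, Total) VALUES (s.KeyCol, s.Total);

Imports System.Data
Imports System.Data.SqlClient

Module TvpExample
    Sub UpsertSummary(ByVal summary As DataTable, ByVal connStr As String)
        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using cmd As New SqlCommand("dbo.UpsertSummary", conn)
                cmd.CommandType = CommandType.StoredProcedure
                Dim p As SqlParameter = cmd.Parameters.AddWithValue("@rows", summary)
                p.SqlDbType = SqlDbType.Structured    ' marks the parameter as a TVP
                p.TypeName = "dbo.SummaryRowType"
                cmd.ExecuteNonQuery()                 ' single round trip carrying every row
            End Using
        End Using
    End Sub
End Module
```

This also gets rid of the temp table entirely, since the staged rows live in the TVP for the duration of the call.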
I'm afraid you'd have to try these out and see - I couldn't say up front which would be best for your workload.