Hello - this is my first post (just found the site today). I'm looking for advice regarding the following design issue(s):
I have a VB batch job that runs a couple of times throughout the day; each time it runs, it inserts approximately 100k rows into a table (call it Table A). I need to be able to view the inserted data by batch run and in aggregate. Querying Table A directly has now become too expensive for the aggregate reports.
I'd like to create an aggregate table (call it Table B) so that as rows are inserted into Table A, Table B is updated to reflect running totals.
Currently, the batch job inserts records into Table A using the ADO UpdateBatch method. If I stick with this, I could create a trigger to update Table B as records are inserted into Table A.
Question 1: Is it possible to create a trigger that will update the record if it exists or insert it if it doesn't? What kind of overhead does that represent?
Question 2: If, instead of using the UpdateBatch method, I call a stored procedure to insert each record into Table A and also insert/update Table B, will I suffer a performance hit?
Sorry for the long post, just wanted to make sure I didn't leave anything out. TIA for any help you can offer!
1) You can create a trigger on Table A that updates/inserts into TableB.
CREATE TRIGGER trg_TableA_Insert ON TableA AFTER INSERT AS
BEGIN
    -- Add the new increments to rows that already exist in TableB
    UPDATE b
    SET b.value = b.value + i.Increment
    FROM TableB b
    JOIN inserted i ON b.Key = i.Key

    -- Create TableB rows for keys seen for the first time
    INSERT TableB (col1, col2, col3, ...)
    SELECT i.col1, i.col2, i.col3, ...
    FROM inserted i
    WHERE NOT EXISTS (SELECT * FROM TableB b
                      WHERE b.Key = i.Key)
END
This is just an idea of how you could do it. You could simplify the INSERT part if only one record is ever in the INSERTED table. You may also need to think about what should happen to the values in TableB if TableA is updated.
2) Stored procedures are generally the way to go and usually improve performance. However, when choosing between one bulk operation and many single-row stored procedure calls, I prefer bulk. If you could combine the two, that would be ideal.
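One way to combine the two (a sketch only, not tested against your schema): keep the bulk insert into TableA exactly as you do today, tag each batch with an identifier, and then roll the whole batch into TableB with one set-based pass instead of firing per-row logic. The column names here (ItemKey, Increment, Total, BatchID, and the @BatchID parameter) are assumptions for illustration, not from your tables.

```sql
-- Assumes TableA(ItemKey, Increment, BatchID) and TableB(ItemKey, Total).
-- Step 1: bulk-insert the batch into TableA as usual (UpdateBatch or BULK INSERT).
-- Step 2: fold just that batch into the running totals in two set-based statements.

-- Add this batch's sums to keys that already have a TableB row
UPDATE b
SET b.Total = b.Total + a.BatchSum
FROM TableB b
JOIN (SELECT ItemKey, SUM(Increment) AS BatchSum
      FROM TableA
      WHERE BatchID = @BatchID
      GROUP BY ItemKey) a ON a.ItemKey = b.ItemKey

-- Create TableB rows for keys this batch introduced
INSERT TableB (ItemKey, Total)
SELECT a.ItemKey, SUM(a.Increment)
FROM TableA a
WHERE a.BatchID = @BatchID
  AND NOT EXISTS (SELECT * FROM TableB b WHERE b.ItemKey = a.ItemKey)
GROUP BY a.ItemKey
```

This touches TableB once per batch (roughly two statements per 100k rows) rather than once per inserted row, which is where the per-row trigger or per-row stored procedure would cost you.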