I currently have a DTS package that takes a text file source and transforms the data into a table. The transformation does a lookup on a product_code column (char(8)) to transform it to the correct product_id (int) for our system.
I've recently set up a view that matches the file layout and has an INSTEAD OF trigger that inserts into the target table. It does this by joining the inserted table to a translation table to get the correct product_id for the insert. Instead of a DTS package I created a proc that BULK INSERTs the file into the view.
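To make the setup concrete, here's a minimal sketch of the pattern. All table, column, and file names (Sales, ProductTranslation, the file path, etc.) are assumptions for illustration; only product_code and product_id come from the post. Note that BULK INSERT skips triggers by default, so the FIRE_TRIGGERS option is needed for the INSTEAD OF trigger to run.

```sql
-- View whose column list matches the text-file layout
-- (exposes product_code char(8), not the internal product_id).
CREATE VIEW dbo.v_SalesLoad
AS
SELECT pt.product_code, s.qty, s.sale_date
FROM dbo.Sales s
JOIN dbo.ProductTranslation pt ON pt.product_id = s.product_id;
GO

-- INSTEAD OF trigger translates product_code -> product_id on the way in.
CREATE TRIGGER dbo.tr_SalesLoad_Insert ON dbo.v_SalesLoad
INSTEAD OF INSERT
AS
BEGIN
    INSERT dbo.Sales (product_id, qty, sale_date)
    SELECT pt.product_id, i.qty, i.sale_date
    FROM inserted i
    JOIN dbo.ProductTranslation pt ON pt.product_code = i.product_code;
END
GO

-- Proc that bulk loads the file straight into the view.
CREATE PROCEDURE dbo.usp_LoadSales
AS
    BULK INSERT dbo.v_SalesLoad
    FROM 'C:\loads\sales.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRE_TRIGGERS);
GO
```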
The view approach is running over twice as fast as the DTS package. Is this approach common, or has anyone else tried it? Any feedback would be appreciated.
I don't think it is uncommon to see that kind of performance gain. DTS is very good at doing complex things, but it can't compete with BULK INSERT or BCP for doing simple things. The trick is to figure out which tool is best for a given job.
I forgot to mention the file contains 9.9 million records with 7 columns. The DTS package runs in 20 minutes vs. 8 for the view. BCP'ing the file directly into a table with no translation takes 3 minutes.
Try loading a Notes log file into SQL Server while doing LDAP lookups for the VPN-derived data. DTS can do it nicely using the API; BCP can't get there from here without something that at least produces a file first, and nothing I've seen does that very well.
The jobs that BCP can do, it does VERY well. I don't think that anything can be faster than BCP. The jobs that BCP can't do... it can't do.