I have about 70 identical worker systems, each running a local instance of SQL Server 2000. Periodically, each one needs to ship data back to a central repository. The data needs to be cleaned and processed on the central side and should be deleted from the worker after transfer. Volume is roughly 500 records per hour (24 hours a day).
What is the best way to achieve this?
I would like as little logic and processing on the workers as possible. That way, if something needs to be updated, I don't have to go to each machine, and if something goes wrong, it's apparent from one central spot.
Would pull-based transactional replication be a good approach? I would need to set up publications on all the workers, and if any aspect of a publication needed to change, I would have to update each worker manually.
Maybe a DTS package? Would I need a separate copy of the DTS package on the central repository system for every worker system it pulls data from? Or is there some way I can have one DTS package, pass the address of the worker in as a parameter, and automatically schedule the package to run against all 70 addresses?
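To make the single-package idea concrete, here is the kind of thing I have in mind: one shared package on the central server that reads the worker's address from a DTS global variable, with a driver loop that invokes DTSRun once per worker and passes the address via the `/A` flag (typeid 8 = string). The server, package, and variable names below (`CENTRAL01`, `PullWorkerData`, `WorkerAddress`) are just placeholders; this is a sketch, not something I've built.

```python
# Sketch: one shared DTS package, parameterized per worker via DTSRun /A.
# All server/package/variable names are hypothetical placeholders.

WORKERS = ["WORKER%02d" % n for n in range(1, 71)]  # the ~70 worker servers

def dtsrun_command(worker):
    """Build the DTSRun invocation for one worker. The package reads the
    WorkerAddress global variable to decide which server to pull from."""
    return [
        "DTSRun",
        "/S", "CENTRAL01",       # central server the package lives on (placeholder)
        "/N", "PullWorkerData",  # the single shared package (placeholder)
        "/E",                    # use a trusted (Windows auth) connection
        "/A", "WorkerAddress:8=%s" % worker,  # typeid 8 = string global variable
    ]

if __name__ == "__main__":
    # A scheduled job could loop like this, or 70 job steps could each
    # carry one of these command lines.
    for worker in WORKERS:
        print(" ".join(dtsrun_command(worker)))
```

If that `/A` approach works, a change to the cleanup logic would only ever touch the one package on the central server, which is exactly the "no logic on the workers" property I'm after.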