I have a text file sent to me from another location on a daily basis, supposedly at 2:00 AM. I created a DTS package that loads the data into a local table at 5:00 AM. The incoming data replaces the old data: the DTS package deletes all rows from the table and then inserts the new data.
Another job runs later in the day, before the next file arrives (7:00 PM). This job deletes the old text file so that the next file sent to me is not appended to it.
The problem is that for the last two days, the sending server did not deliver the file before my local jobs ran (it arrived as late as 9:00 AM and 7:00 AM). My 7:00 PM job had already deleted the old text file, so when the job that calls the DTS package ran at 5:00 AM, it dropped all the data and then tried to load new data that was not there: my 7:00 PM job had deleted the old file, and their 2:00 AM job had not yet delivered the new one.
My question is: what is the best way to script a job that checks whether the text file exists before dropping the existing table?
I know the whole process could be handled better if both the supplier of the data and I, the end user, could be more flexible. The problem is that the sender is going to send the data in whatever manner (time and method) requires the least work for them; I just have to deal with it.
Put this in an ActiveX Script task at the start of your DTS package:

Function Main()
    Set MyFile = CreateObject("Scripting.FileSystemObject")
    If MyFile.FileExists(local_server_path) Then
        Main = DTSTaskExecResult_Success
    Else
        Main = DTSTaskExecResult_Failure
    End If
End Function

'local_server_path' is your file path on the server.
Use the 'On Success' workflow constraint to start the rest of your DTS package, so the table is only emptied when the file actually exists.
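A possible refinement, since a leftover or half-transferred file could still slip through an existence check: FileSystemObject's GetFile also exposes Size and DateLastModified, so the task can fail if the file is empty or was not written today. This is just a sketch on the same assumptions as above (local_server_path holds the full path to the incoming file):

Function Main()
    Set fso = CreateObject("Scripting.FileSystemObject")
    If Not fso.FileExists(local_server_path) Then
        Main = DTSTaskExecResult_Failure
        Exit Function
    End If
    Set f = fso.GetFile(local_server_path)
    ' Fail if the file is empty or was last written on an earlier day
    If f.Size = 0 Or DateDiff("d", f.DateLastModified, Now) > 0 Then
        Main = DTSTaskExecResult_Failure
    Else
        Main = DTSTaskExecResult_Success
    End If
End Function

With this in place, a stale file left over from yesterday fails the task the same way a missing file does, and the drop/insert steps never run against bad input.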