Hopefully one of you gurus can offer a solution to this problem:
First off I do not have the ability to use SQLMail - so that is not an option.
I have a DTS package that performs the following steps, in order, to populate an Excel sheet with some data and email it off to a coworker:
1) FTP task to copy the Excel sheet template from one folder to another on the same machine
2) Data pump between the SQL Server connection (local) and the copied Excel sheet
3) SQL script task that truncates a table on the SQL Server
4) ActiveX script task containing the following code, which uses CDO to attach the Excel sheet that was just populated to an outgoing email:
Function Main()
    Dim objMail
    Set objMail = CreateObject("CDO.Message")
    objMail.From = "email@example.com"
    objMail.To = "firstname.lastname@example.org"
    objMail.HTMLBody = "Body"
    objMail.AddAttachment "C:\path\to\report.xls"  ' placeholder path to the populated workbook
    objMail.Send
    Set objMail = Nothing
    Main = DTSTaskExecResult_Success
End Function
The result of DTS execution is a failure on the ActiveX step:
Error Description: The process cannot access the file because it is being used by another process.
I assume this means Excel still has its grubby paws on the file, but I've tried alleviating this two ways: (1) using a WAITFOR DELAY between the Excel connection and the ActiveX script task, and (2) using the Execute Package task to run a separate package that contains only the ActiveX mail-sender script (thinking the parent package would terminate, along with the Excel process, before running the next package - maybe I'm wrong here?).
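One way to confirm whether the lock really is the problem - a minimal sketch, where the workbook path and the retry count are placeholders I've made up - is to have the ActiveX task try opening the file for append (which fails while the file is locked) and only proceed to the attachment once the open succeeds:

```vbscript
' Sketch: poll until the workbook can be opened for append, which fails
' while Excel/Jet still holds the file. Path and retry count are placeholders.
Const ForAppending = 8
Dim oFSO, oStream, i, bFree
Set oFSO = CreateObject("Scripting.FileSystemObject")
bFree = False
For i = 1 To 30
    On Error Resume Next
    Set oStream = oFSO.OpenTextFile("C:\path\to\report.xls", ForAppending)
    If Err.Number = 0 Then
        oStream.Close
        bFree = True   ' lock released; safe to attach now
        Exit For
    End If
    Err.Clear
    On Error GoTo 0
    ' Note: WScript.Sleep is not available inside a DTS ActiveX task,
    ' so a real delay would have to come from elsewhere (e.g. a WAITFOR step).
Next
If Not bFree Then Main = DTSTaskExecResult_Failure
```

If the loop never sees the file become free, that would point at the connection staying open for the life of the package rather than a timing race.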
Thanks for the reply. I know that it is the file attachment method that is causing the error - I'm just not sure how to resolve it.
I have the default MSDN stored proc for sending mail (I called it CDONTS_SendMailAttach so I would remember that it uses CDONTS). I tried doing the above with that instead and it failed with no real error message. I assume if I could get the error from somewhere it would be the same "file is already in use" kind of thing.
I figured doing it this way was better as the errors are a little more verbose and accessible.
I can guess another reason: is your script trying to attach the file while your data pump routine has not yet completed the export? In DTS, things run in parallel; to put them in sequence you need to add workflow links between them. Try it if you don't already have one in place.
Thank you for your reply; however, I do have workflow links between each task, specifically On Success ones. It almost seems like they're being ignored. I have another package that does something similar (though less complex) and it works without error.
Another thing I should mention: if I run each task by itself (using Execute Step), the package runs fine - it's only when the package runs each step itself that the error occurs. I'm almost positive it has something to do with the Excel interop that occurs when you initiate a connection to an Excel workbook.
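If the in-process Excel connection is what's holding the lock, one out-of-process variant of the run-package idea - a sketch, with the server and package names as made-up placeholders - is to launch the mail-only package from a shell with the DTSRun utility, so the parent package can finish and release its connections before the mail package touches the file:

```vbscript
' Hypothetical: kick off a mail-only package out of process via DTSRun.
' /S = server, /E = trusted connection, /N = package name (all placeholders).
Dim oShell
Set oShell = CreateObject("WScript.Shell")
oShell.Run "DTSRun /S MyServer /E /N MailOnlyPackage", 0, False  ' False = don't wait
Set oShell = Nothing
```

Unlike the Execute Package task, this runs the child package in its own process, so it doesn't start until after the parent's step completes and isn't sharing the parent's Excel connection.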