Unanswered: Stored Procedure problem in ActiveX Script
First off, apologies if you've read this on any other forums; I just can't seem to get an answer to the error I'm encountering..
I have a stored procedure being called from an ActiveX script in a DTS job that runs daily. First the DTS job checks the date value of a remote file; if it is more recent than a file stored locally, it downloads the file [a semi-colon delimited text file]. Then, from within a While/Wend loop, I iterate through each row in the text file, format the array values of each line, assign them to parameters, execute the SP and set the command object to Nothing.
The problem is that after 3151 rows have inserted correctly into the table, the ActiveX script fails with the following error:
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
Is this a "maximum number of connections" issue? Can it be resolved? I can see that the oCMD [and hence the connection] is trashed and then initialised for each of the 5007 records, but this doesn't seem like an inordinate amount of work, really?
I have included an abridged version of my code below. Everything is Dim'd elsewhere in the ActiveX script, implicit coding notwithstanding..
-- CODE START --
While Not oTxtStream.AtEndOfStream
    strLine = Trim(oTxtStream.ReadLine)
    i = 0
    myArr = Split(strLine, ";")
    strParam = fncCharCheck(myArr(i))
    Set oCMD = CreateObject("ADODB.Command")
    With oCMD
        Set .ActiveConnection = oConn
        .CommandType = adCmdStoredProc
        .CommandText = "sdnInsert"
        .Parameters.Append .CreateParameter("@Param", adVarChar, adParamInput, 50, strParam)
        ' ...further parameters appended here in the full script...
        .Execute
    End With
    Set oCMD = Nothing
Wend
-- END CODE --
What I'm after really is a reason for the connection to error on me.. The 5007-line file [with up to 650 chars per line] is too big to be dealt with by the SP directly, right? So I have to use VB to loop through each line? Are 5007 successive connections a lot? Surely this kind of thing is done elsewhere; I just wonder how..
I can't see why the procedure can't use BULK INSERT to get the file into a staging table, and if that succeeds it can do any "proofing" needed, then transform/load the data into the "live" table.
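A minimal sketch of that staging approach (the table names, column layout and file path here are all hypothetical, not taken from this thread):

```sql
-- Hypothetical staging table matching the semi-colon delimited file layout
CREATE TABLE StagingSDN (
    Surname VARCHAR(50) NULL
    -- ...remaining columns for the other fields in each line...
);

-- Load the whole file in one statement instead of 5007 round trips
BULK INSERT StagingSDN
FROM 'C:\Data\sdn.txt'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');

-- After any "proofing", move the data to the live table set-based
INSERT INTO LiveSDN (Surname /* , ... */)
SELECT Surname /* , ... */
FROM StagingSDN;
-- optionally: WHERE <validation conditions>
```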
I'm certainly not averse to using DTS when it is needed, and I love it for "one shots", but in this case I think the stored procedure can handle everything needed, and can do it more efficiently than DTS too.
Thanks PatP, that's great. I've gone with the SQL BULK INSERT method as you suggested, using the saved TXT file as the data source with FIELDTERMINATOR = ';' and ROWTERMINATOR = '\n', and KEEPNULLS for null values.
I am having a problem, however, with the occurrence of rogue single quotes in some surname values [e.g. O'Leary]. Is there a way to REPLACE all occurrences of such characters during the BULK INSERT?
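For reference, BULK INSERT has no transform or REPLACE option of its own; it loads the quote character as plain data. The usual pattern is to load the raw values and clean them in the staging table afterwards. A sketch, again with hypothetical table and column names:

```sql
-- Hypothetical cleanup step run after the BULK INSERT.
-- If the single quotes break a later dynamic-SQL step, double them up;
-- REPLACE's second and third arguments below are ' and '' respectively.
UPDATE StagingSDN
SET Surname = REPLACE(Surname, '''', '''''');
```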