Unanswered: DTS fails at customer site with "Too many columns", works locally
I am having the most baffling problem with DTS.... :confused:
I have a set of ActiveX transforms that run against my customer's flat transaction data files, with a single database table as the destination. Since they switched to a new method of generating the flat file using SAS, the DTS package mysteriously fails at a couple of select records. The error is always the same, and turning on error logging in DTS yielded this:
Step 'DTSStep_DTSDataPumpTask_1' failed
Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
Step Error Description:Too many columns found in the current row; non-whitespace characters were found after the last defined column's data.
Step Error code: 80043013
Step Error Help File: DTSFFile.hlp
Step Error Help Context ID:0
The exact same file parses all the way through on my laptop, with the same DTS package. Tests have revealed no strange characters or whitespace in the data file, neither at that record nor at any other. (Running a Test... on any of the ActiveX transforms always fails at row 515186 until that row is deleted, after which it fails on some subsequent row; this iteration went on at the customer site until about 5 rows had been deleted this month, and it finally worked.) My database and the customer database both use the same default character set.
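For reference, the kind of byte-level check I ran looks roughly like this (a sketch, not my actual test harness; the demo writes its own made-up sample file since I can't post the customer data, and the SUB byte in it is just an illustration of what such a check would catch):

```python
import os
import tempfile

def dump_row(path, target_row):
    """Print a hex view of one row (1-based) so invisible bytes stand out."""
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            if lineno == target_row:
                print(" ".join("%02x" % b for b in raw))
                return raw
    return None

# Demo on a made-up sample; in the real case I pointed this at the SAS
# flat file and row 515186. Row 2 here hides a trailing space and a
# SUB (0x1a) byte that you would never see in Notepad.
sample = b"AAA|001|X\r\nBBB|002|Y \x1a\r\nCCC|003|Z\r\n"
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(sample)
tmp.close()
raw = dump_row(tmp.name, 2)
os.unlink(tmp.name)
```

Every byte of the row shows up in the hex view, so anything that a text editor renders invisibly (control characters, odd line endings, non-breaking spaces) is obvious.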
If anyone has any ideas about what other environmental factors are coming into play here, please let me know - I'm at the end of my rope. I believe we are both patched up to SQL 2000 SP3. They have an XP client connecting to a 2003 server; I have an XP client/server. Neither machine has the NLS_LANG environment variable set.
What happens to that row when you try to import the file into Access? If you create an extra column at the top of the flat file definition, it should insert whatever is in that column for the five offending rows, right? Once you get it into a table, query it with a NOT NULL condition on that extra column. It might give you a clue as to what the offending characters are.
If you're stuck with the file, then you might just have to use the insert failure phase to make the pump task skip to the next record when it finds an offending row. Read up on the multi-phase data pump to find out exactly how you'd do this.
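If you can't get it into Access, the same check can be scripted. A rough sketch of the idea (the pipe delimiter and three-column layout here are guesses, swap in whatever your real file format uses):

```python
import os
import tempfile

def find_overfull_rows(path, expected_cols, delimiter=b"|"):
    """Yield (line_number, field_count, raw_line) for any row with more
    fields than the defined layout, i.e. the condition DTS reports as
    'too many columns found in the current row'."""
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            fields = raw.rstrip(b"\r\n").split(delimiter)
            if len(fields) > expected_cols:
                yield lineno, len(fields), raw

# Demo on a made-up 3-column file; row 2 carries a stray extra field.
sample = b"a|1|x\r\nb|2|y|EXTRA\r\nc|3|z\r\n"
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(sample)
tmp.close()
bad = list(find_overfull_rows(tmp.name, 3))
os.unlink(tmp.name)
```

Running that over the whole file lists every row the flat file provider would choke on, along with the raw bytes, so you can see the extra data directly.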
...the file imports fine here with the exact same DTS package, so I don't want to modify it to address a problem that isn't really the problem. In other words, there is nothing to indicate anything is actually wrong with the data itself: no stray whitespace, no bad or problem-causing characters, no datatype mismatch, nothing; it looks just like the last row. Here are the rows before and after as well as the one that failed:
I am not using column 5, but all the others. From last month to this month the number of offending rows increased from 1 to 7, so I don't want to start automatically throwing away data that for all intents and purposes looks good, in case the problem keeps multiplying.
Since it works here but fails there, it has to be something environmental, maybe character sets, or...?
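One thing I plan to rule out next is whether my copy of the file and theirs really are byte-identical, since mixed line endings or high-ASCII bytes would be exactly the kind of environmental difference that shows up on one box and not the other. A quick profiling sketch (the curly-quote bytes in the demo are just an illustration of what a code-page mismatch can produce):

```python
import os
import tempfile

def profile_file(path):
    """Summarize line-ending styles and non-ASCII bytes in a file, as a
    quick way to spot whether two copies of 'the same' file differ."""
    with open(path, "rb") as f:
        data = f.read()
    crlf = data.count(b"\r\n")
    lf_only = data.count(b"\n") - crlf
    cr_only = data.count(b"\r") - crlf
    high = {}
    for b in data:
        if b > 0x7F:
            high[b] = high.get(b, 0) + 1
    return {"crlf": crlf, "lf": lf_only, "cr": cr_only, "high_bytes": high}

# Demo: one CRLF row, one bare-LF row, and one bare-CR row containing
# Windows curly quotes (0x93/0x94).
sample = b"a,1\r\nb,2\nc,\x93bad\x94\r"
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(sample)
tmp.close()
stats = profile_file(tmp.name)
os.unlink(tmp.name)
```

Running this on both copies and diffing the two summaries should show immediately whether the files diverge at the byte level, even if they look identical in an editor.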