  1. #1
    Join Date
    Aug 2004
    Posts
    6

    Unanswered: DTS fails at customer site with "Too many columns", works locally

    I am having the most baffling problem with DTS... :confused:

    I have a set of ActiveX transforms that execute on my customer's flat transaction data files, with a single database table as the destination. Since they switched to a new method of generating the flat file using SAS, the DTS package mysteriously fails on a couple of select records. The error is always the same, and turning on error logging in DTS yielded this:

    Step 'DTSStep_DTSDataPumpTask_1' failed

    Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
    Step Error Description:Too many columns found in the current row; non-whitespace characters were found after the last defined column's data.
    Step Error code: 80043013
    Step Error Help File: DTSFFile.hlp
    Step Error Help Context ID:0

    Step Execution Started: 11/16/2004 6:37:51 PM
    Step Execution Completed: 11/16/2004 6:39:39 PM
    Total Step Execution Time: 107.415 seconds
    Progress count in Step: 515000

    The exact same file parses all the way through on my laptop, with the same DTS package. Tests have revealed no strange characters or whitespace in the data file, not at that record and not at any other. (Running a Test... on any of the ActiveX transforms always fails at row 515186 until that row is deleted, after which it fails on some subsequent row; this iteration went on at the customer site until about 5 rows had been deleted this month and it finally worked.) My database and the customer's database are both using the same default character set.

    The only Microsoft KB article referencing anything resembling my problem is
    http://support.microsoft.com/default...b;en-us;292588
    but it does not apply because I am specifying comma-delimited columns, not fixed width.

    If anyone has any ideas about what other environmental variables are coming into play here, please let me know - I'm at the end of my rope. I believe we are both patched up to SQL Server 2000 SP3. They have an XP client connecting to a Windows Server 2003 server; I have an XP client/server. Neither machine has the NLS_LANG environment variable set.
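    (In case it helps anyone reproduce my checks: one way to scan the file for hidden bytes outside of DTS is to bulk load each raw line into a one-column staging table and search it. The table name, path, and pipe terminator below are all made up; the pipe trick only works because the data contains no | characters.)

    -- Load each raw line, unsplit, into a single varchar column
    CREATE TABLE dbo.RawLines (Line varchar(8000) NULL)

    BULK INSERT dbo.RawLines
    FROM 'C:\data\transactions.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

    -- Flag lines holding low values (CHAR(0)) or stray tabs.
    -- With Windows line endings every line keeps a trailing CR here,
    -- so a CHAR(13) test would need that last character trimmed first.
    SELECT Line, DATALENGTH(Line) AS Bytes
    FROM dbo.RawLines
    WHERE CHARINDEX(CHAR(0), Line) > 0
       OR CHARINDEX(CHAR(9), Line) > 0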

  2. #2
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    This may not be helpful...but have you considered just using a stored procedure instead?
    Brett
    8-)


  3. #3
    Join Date
    Nov 2004
    Location
    London, UK
    Posts
    3
    What happens to that row when you try to import the file into Access? If you create an extra column at the end of the flat file's column definition, it should capture whatever's in that column for the five offending rows, right? Once you get it into a table, query it with a NOT NULL on that column. It might give you a clue as to what the offending characters are.
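    Something like this once it's in a table (Import1 and Overflow standing in for whatever names the import gives you):

    SELECT *
    FROM Import1
    WHERE Overflow IS NOT NULL
      AND LEN(Overflow) > 0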

    If you're stuck with the file then you might just have to use the insert failure phase to make the pump task skip to the next record when it finds an offending row. Read up on the multiphase data pump to find out exactly how you'd do this.

    Sorry I can't help you more.

  4. #4
    Join Date
    Aug 2002
    Location
    Scotland
    Posts
    1,578
    Modify the DTS package to use an Execute Process Task and then use bcp.

    -or-

    Use an Execute SQL Task and the BULK INSERT Transact-SQL command.
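    A skeleton of the BULK INSERT route - the staging table and path are placeholders:

    BULK INSERT dbo.TransactionStage
    FROM 'C:\data\transactions.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        MAXERRORS = 0   -- cancel on the first bad row instead of silently skipping it
    )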
    --Satya SKJ
    Microsoft SQL Server MVP

  5. #5
    Join Date
    Aug 2004
    Posts
    6

    good ideas but....

    ....the file imports fine here with the exact same DTS package, so I don't want to modify it to address a problem that isn't really the problem. In other words, there is nothing to indicate there is anything actually wrong with the data itself - no whitespace, no bad or problem-causing characters, no datatype mismatch, nothing; it looks just like the last row. Here are the rows before and after, as well as the one that failed:

    737,10/15/2004,09:11:39,114,15536,1
    737,10/15/2004,09:11:49,114,18408,1
    737,10/15/2004,09:11:54,714,18024,1

    I am not using column 5, but all the others. From last month to this month the number of offending rows increased from 1 to 7, so I don't want to automatically start throwing away data that for all intents and purposes looks good, in case the problem keeps multiplying.

    Since it works here but fails there, it has to be something environmental - maybe character sets, or something else?
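    In the meantime I can at least compare the server-side collations directly - run on both servers and diff the output (this only rules out one variable; it says nothing about client locale):

    SELECT SERVERPROPERTY('Collation') AS ServerCollation
    EXEC sp_helpsort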

  6. #6
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    Generating files from SAS...Like from a mainframe?

    I betcha you got some low values [ CHAR(0), i.e. X'00' ] going on...

    I know you don't want to alter your process, but I ALWAYS create a staging environment and load the data into it, then audit the data to look for problems... then I move the data in after I verify it...

    And it's all done with a stored procedure
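    Roughly this shape - every object name below is invented, but it's the pattern:

    -- Staging table: all varchar so nothing is rejected on load
    CREATE TABLE dbo.TransactionStage (
        UnitId   varchar(20) NULL,
        TranDate varchar(20) NULL,
        TranTime varchar(20) NULL,
        Col4     varchar(20) NULL,
        Col5     varchar(20) NULL,   -- unused downstream, staged anyway
        Col6     varchar(20) NULL
    )
    GO

    CREATE PROCEDURE dbo.usp_LoadTransactions
    AS
    BEGIN
        -- Audit pass: hunt for low values before touching the real table
        IF EXISTS (SELECT *
                   FROM dbo.TransactionStage
                   WHERE CHARINDEX(CHAR(0),
                           ISNULL(UnitId, '') + ISNULL(TranDate, '') + ISNULL(TranTime, '')
                         + ISNULL(Col4, '') + ISNULL(Col5, '') + ISNULL(Col6, '')) > 0)
        BEGIN
            RAISERROR('Low values found in staged data - clean up before loading', 16, 1)
            RETURN
        END

        -- Verified rows move into the real table (column 5 dropped here)
        INSERT INTO dbo.Transactions (UnitId, TranDate, TranTime, Col4, Col6)
        SELECT UnitId, TranDate, TranTime, Col4, Col6
        FROM dbo.TransactionStage
    END
    GO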
    Brett
    8-)


  7. #7
    Join Date
    Aug 2004
    Posts
    6

    low values

    Thanks for the tip. I am looking into how these "low values" occur and how these EBCDIC to ASCII conversions can get hung up. I'm sure the answer lies somewhere in there.

    Well, the front-end application will run a custom DTS package, but not a custom stored procedure. At least the staging need is moot, because it rolls the whole thing back if one record fails...
