  1. #1
    Join Date
    Dec 2003
    Posts
    10

    Unanswered: loading data into columns 75 - N in tables?

    I'm having a very irritating time trying to migrate data from a COBOL system to SQL Server.

    One of the A/R Master files has approx. 200 columns.

    I can export this file any number of ways that will (sometimes) load partially into my database, but whenever the load succeeds, columns 75 through N simply contain NULL, even though there is data in the file. When the load fails in DTS, the error is always a missing column delimiter; with BULK INSERT the error is always something like 'data too long at column 75'.

    Putting all this together, I have deduced that something isn't working if I try
    to load a staging table with more than 74 columns of data. This seems like a way-too-low threshold for a robust database, especially since SQL Server is supposed to be able to handle up to 1,024 columns per table.

    Has anyone ever encountered this problem?

    Thanks in advance for any help
    randyvol
    <>< "what we do in life echoes in eternity"

  2. #2
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    What format is the data in? Fixed width?

    Are they QSAM Files?

    Did you just ftp down the data or did you create a process...

    I just download the data...usually from DB2

    And create Format cards and bcp the data in
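
    A format card is just a bcp format file: one line per field giving the field's type, length, terminator and the target column. Something like this for a pipe-delimited file with LF row ends (a sketch only; the table, file and column names are made up):

        8.0
        3
        1   SQLCHAR   0   10   "|"    1   AcctNo     ""
        2   SQLCHAR   0   30   "|"    2   CustName   ""
        3   SQLCHAR   0   12   "\n"   3   Balance    ""

    Then point bcp at it:

        bcp MyDB.dbo.ARMaster_Staging in armaster.dat -f armaster.fmt -S MYSERVER -T

    The terminator on the last field doubles as the row terminator, so "\n" there means LF only and "\r\n" means CR/LF.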

    btw how long is the row?
    Brett
    8-)

    It's a Great Day for America everybody!

    I'm Good Once as I ever was

    The physical order of data in a database has no meaning.

  3. #3
    Join Date
    Dec 2003
    Posts
    10

    File description...

    It is a Fixed Width file.
    It is a flat file.
    It is not long, about 1200 bytes.
    Rows end with an 'LF'.
    Columns are delimited with a '|' character.

    It is not rocket science and should be pretty straightforward. For about half of the files it is, but the others all exceed 74 columns, and that's when I get into this situation. There are no hidden characters; I use a bash shell script on the Unix system to strip them out before bringing the file over to the Windows system via FTP.

    My assumption is that if the FTP process were adding something to the file, this would occur 100% of the time and not ~50% of the time.

    I have developed a work-around in the meantime, which is to create 3 smaller staging tables and then create the master table using a SELECT INTO
    after the staging tables have been populated. But this takes more time than it should.
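
    For what it is worth, the recombine step looks roughly like this (a sketch only; the staging table and column names are invented, and it assumes each staging table keeps the account number so the three slices can be joined back together):

        SELECT  s1.AcctNo,
                s1.Col001, s1.Col002,   -- ...columns through 74...
                s2.Col075, s2.Col076,   -- ...the middle slice...
                s3.Col150, s3.Col151    -- ...the last slice...
        INTO    dbo.ARMaster
        FROM    dbo.ARMaster_Stage1 AS s1
        JOIN    dbo.ARMaster_Stage2 AS s2 ON s2.AcctNo = s1.AcctNo
        JOIN    dbo.ARMaster_Stage3 AS s3 ON s3.AcctNo = s1.AcctNo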

    I'm really perplexed by this one; I've never encountered it before, and we've been bringing over some really large files from this Unix system for about a year.

    I just ftp the files down and then have a DTS package load the file. I've tried BULK INSERT, but apparently the 'LF' that DTS recognizes as end of row is not the same as the '\n' that BULK INSERT recognizes as end of row: if the file is short enough, the DTS package works fine, but BULK INSERT fails complaining about row termination... heavy sigh.
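
    For reference, a BULK INSERT that feeds a literal line-feed character as the row terminator (built with dynamic SQL, since typing '\n' in the statement apparently is not matching the file) would look something like this; the table and path names are made up for illustration:

        -- Build the statement dynamically so the row terminator is a real
        -- line feed, CHAR(10), rather than the literal string '\n'.
        DECLARE @sql varchar(1000)
        SET @sql = 'BULK INSERT dbo.ARMaster_Staging '
                 + 'FROM ''C:\loads\armaster.dat'' '
                 + 'WITH (FIELDTERMINATOR = ''|'', '
                 + 'ROWTERMINATOR = ''' + CHAR(10) + ''', TABLOCK)'
        EXEC (@sql)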

    I've never worked with bcp, but at this point I will try anything. Now, to show my ignorance: what are format cards? ;-)

    Anything you can suggest is appreciated !!!

    randyvol
    <>< "what we do in life echoes in eternity"
