  1. #1
    Join Date
    Oct 2009
    Posts
    11

    Unanswered: .fmt and Bulk Insert importing Bad Data

    I am using a .fmt file with a bulk insert, but it is causing the data to be incorrect. Here is the .fmt file:

    8.0
    12
    1 SQLCHAR 0 255 "\t" 1 Allied ""
    2 SQLCHAR 0 255 "\t" 2 Mfr ""
    3 SQLCHAR 0 255 "\t" 3 Mfr ""
    4 SQLCHAR 0 255 "\t" 4 Product ""
    5 SQLINT 0 4 "\t" 5 Price ""
    6 SQLCHAR 0 255 "\t" 6 UOM ""
    7 SQLINT 0 4 "\t" 7 Lead ""
    8 SQLINT 0 4 "\t" 8 Sells ""
    9 SQLINT 0 4 "\t" 9 Available ""
    10 SQLINT 0 4 "\t" 10 Min ""
    11 SQLINT 0 4 "\t" 11 Catalog ""
    12 SQLCHAR 0 255 "\r\n" 12 URL ""

    This is coming from a tab-delimited text file. I read somewhere that everything should be SQLCHAR for text files; however, when I change them, I get an invalid

    When I run it with SQLINT, I get no errors; however, I get bad data. For example:

    A value that should come through as 1 ends up in the import as 153104672.

    Any ideas?

  2. #2
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    Well, if it's delimited you don't need a format file.

    try loading it with bcp...
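
    Something like this from a command prompt (the database and server names below are placeholders, and it assumes the file's fields line up one-for-one with the table's columns):

    bcp YourDatabase.dbo.AlliedInventoryMaster in "\\yourserver\share\dee-elect-part-file.txt" -c -T -S YourSqlServer -F 2

    -c loads every field as character data with tab and newline as the default terminators, -T uses a trusted connection, and -F 2 starts at row 2 if the file has a header line.
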
    Brett
    8-)


  3. #3
    Join Date
    May 2005
    Location
    South Africa
    Posts
    1,365
    Provided Answers: 1
    SQLCHAR and SQLINT describe your file's data type.
    If your file data is plain (readable) text then you definitely need SQLCHAR.
    With a tab-delimited char file you should be able to use bcp -c,
    i.e. no format file needed.
    Post your table DDL and some sample data if you need more help.
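
    To make that concrete, here is a sketch of the same format file with every field described as SQLCHAR (field names copied from the first post); SQL Server converts the text to the table's int/float columns while loading:

    8.0
    12
    1 SQLCHAR 0 255 "\t" 1 Allied ""
    2 SQLCHAR 0 255 "\t" 2 Mfr ""
    3 SQLCHAR 0 255 "\t" 3 Mfr ""
    4 SQLCHAR 0 255 "\t" 4 Product ""
    5 SQLCHAR 0 255 "\t" 5 Price ""
    6 SQLCHAR 0 255 "\t" 6 UOM ""
    7 SQLCHAR 0 255 "\t" 7 Lead ""
    8 SQLCHAR 0 255 "\t" 8 Sells ""
    9 SQLCHAR 0 255 "\t" 9 Available ""
    10 SQLCHAR 0 255 "\t" 10 Min ""
    11 SQLCHAR 0 255 "\t" 11 Catalog ""
    12 SQLCHAR 0 255 "\r\n" 12 URL ""

    SQLINT in a format file means "this position in the file holds a native 4-byte binary integer", so against a readable text file it ends up reading raw text bytes as a binary number, which is what produces garbage values like the one you are seeing.
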

  4. #4
    Join Date
    Oct 2009
    Posts
    11
    I need the .fmt file as I am not importing every column in the table. Here are my field types:

    1: varchar 255
    2: varchar 255
    3: varchar 255
    4: varchar 255
    5: float 8
    6: varchar 255
    7: int 4
    8: int 4
    9: int 4
    10: int 4
    11: varchar 255
    12: int 4

  5. #5
    Join Date
    Nov 2002
    Location
    Jersey
    Posts
    10,322
    But you are importing every field in the file...no?

    bcp it into a staging table, then move the data to the table

    BUT...the format card should match the table...and have zeroes where there is no data...If I remember correctly...
    Brett
    8-)


  6. #6
    Join Date
    Oct 2009
    Posts
    11
    The importing file will have roughly 35k records, so I'd rather not import them into a temp table, then push them to another table.

    I am importing every field in the file, but not to every field in the table. When I tried doing a straight import in my stored procedure, it crapped out on the field that didn't have any data imported into it.

  7. #7
    Join Date
    Nov 2004
    Posts
    1,427
    Provided Answers: 4
    The importing file will have roughly 35k records, so I'd rather not import them into a temp table, then push them to another table.
    Why not? Why make life hard on yourself?

    Import all columns in a staging table (or temp table) then select the columns you want and INSERT/UPDATE them into the appropriate table(s).

    You are losing time trying to find a better/smarter way of doing it. Working with a staging table is standard procedure, best practice. What is holding you back? Not those lousy 35K, I hope. Drop the staging table once you're finished.
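
    A rough sketch of that pattern, using BULK INSERT since that is what the existing proc uses (the staging table and column names below are made up; match them to the real file layout and target DDL):

    -- staging table: one varchar column per field in the text file
    CREATE TABLE dbo.AlliedInventoryStaging (
        AlliedStk varchar(255), MfrName varchar(255), MfrPart varchar(255),
        ProductDesc varchar(255), Price varchar(255), UOM varchar(255),
        LeadDays varchar(255), SellsIn varchar(255), AvailableStock varchar(255),
        MinQty varchar(255), CatalogPage varchar(255), URL varchar(255)
    )

    -- load the raw file; no format file needed because the staging table mirrors it
    BULK INSERT dbo.AlliedInventoryStaging
    FROM '\\cdrexch1\shared\ScannedImages\dee-elect-part-file.txt'
    WITH (FIELDTERMINATOR = '\t', FIRSTROW = 2)

    -- move only the columns you want into the real table, converting as you go;
    -- any extra column in the target just keeps its default/NULL
    INSERT INTO dbo.AlliedInventoryMaster (AlliedStk, MfrName, MfrPart, ProductDesc,
        Price, UOM, LeadDays, SellsIn, AvailableStock, MinQty, CatalogPage, URL)
    SELECT AlliedStk, MfrName, MfrPart, ProductDesc,
        CAST(Price AS float), UOM, CAST(LeadDays AS int), CAST(SellsIn AS int),
        CAST(AvailableStock AS int), CAST(MinQty AS int), CatalogPage, URL
    FROM dbo.AlliedInventoryStaging

    DROP TABLE dbo.AlliedInventoryStaging

    With only 35K rows, that second INSERT should take a moment.
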
    With kind regards . . . . . SQL Server 2000/2005/2012
    Wim


  8. #8
    Join Date
    Oct 2009
    Posts
    11
    The whole point of this process is to increase the speed of the import.

    The process used to be in Access with an Access table... I moved it over to SQL, and now the import takes about 10 minutes where it used to take 3.

    I can test the speed using that process, as the insert from one table to another in SQL probably won't take long.

  9. #9
    Join Date
    May 2005
    Location
    South Africa
    Posts
    1,365
    Provided Answers: 1
    You don't need a staging table.
    And you don't need to import all columns if you don't want to.
    If it is a repeated process, rather spend a bit more time to make it more efficient.
    Without your CREATE TABLE statement (DDL) and some sample data I cannot help you.
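
    For what it's worth, a non-XML format file can handle both cases: a table column you never map a file field to just gets NULL or its default (provided the column allows that), and a file field you don't want can be thrown away by giving it a server column number of 0. Roughly, for the last two fields (names only illustrative):

    11 SQLCHAR 0 255 "\t" 11 Catalog ""
    12 SQLCHAR 0 255 "\r\n" 0 URL ""

    Here field 12 is still read out of the file (its "\r\n" terminator still ends the row) but is not loaded into any table column.
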

  10. #10
    Join Date
    Oct 2009
    Posts
    11
    Here is the stored procedure:

    CREATE PROCEDURE AlliedInventoryDelete AS

    TRUNCATE TABLE AlliedInventoryMaster

    BULK INSERT AlliedInventoryMaster
    FROM '\\cdrexch1\shared\ScannedImages\dee-elect-part-file.txt'
    WITH
    (
    FORMATFILE = '\\cdrexch1\shared\ScannedImages\AlliedInventoryMaster.fmt'
    )
    GO

    And it will be run a couple of times a week.

    Here is a line of data... it is a tab-delimited file. I will be importing every field; however, there is one extra field in the table being imported to that doesn't exist in the .txt file. The first line is headers:

    Allied Stk # Mfr Name Mfr Part # Product Desc Price UOM and Qty Lead Days Sells in mults of Available Stock Min Qty Catalog Page URL
    201-0001 CARLTON BATES LCOMP 3RV1021-1CA10 187.390 EA/1 21 1 4 1 0 http://www.alliedelec.com/Search/Sea...hQuery=2010001

    Looks like crap here, but you can get the idea.

    For this line, the Min Qty field is 1; when imported using the .fmt file, it shows up as 153104672.
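
    For reference, 153104672 is 0x09203120 in hex, i.e. four raw text bytes (0x20 0x31 0x20 0x09, which decode as a space, a "1", a space and a tab) read as one little-endian 4-byte integer, which is exactly the symptom you'd expect when readable text is described to BULK INSERT as SQLINT. With the format file switched to SQLCHAR throughout (as sketched earlier in the thread), a version of the proc that also skips the header line could look like this (FIRSTROW and TABLOCK are suggestions to test, not part of the original proc):

    CREATE PROCEDURE AlliedInventoryDelete AS

    TRUNCATE TABLE AlliedInventoryMaster

    BULK INSERT AlliedInventoryMaster
    FROM '\\cdrexch1\shared\ScannedImages\dee-elect-part-file.txt'
    WITH
    (
        FORMATFILE = '\\cdrexch1\shared\ScannedImages\AlliedInventoryMaster.fmt',
        FIRSTROW = 2,  -- the first line of the file is headers
        TABLOCK        -- can make the bulk load faster
    )
    GO
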
