  1. #1
    Join Date
    Feb 2004
    Posts
    2

    Unanswered: How to import hundreds of CSV files into SQL Server?

    Hi,
    I have about 500 CSV files; most of them share the same structure. I need to import them all into SQL Server. I create a new table for each one, using the first line as the column names. I can do them one by one, but it is very time-consuming and boring. Is there a fast way of doing this?

    If someone gives me this information, I promise to share all of MY CSV files. They are the zone charts from UPS and I have edited all of them.

    Thanks.

  2. #2
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    Well ... I am working on the same thing myself ... will let you know as soon as I figure it out.
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser

  3. #3
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    Really? Someone here was complaining that it's nearly impossible to get your hands on updated info for this stuff!

    Anyway, you'll have to be logged in with the sysadmin fixed server role to do the following:

    -- build the list of files to import (placeholder directory left for you to fill in)
    declare @cmd varchar(8000)
    set @cmd = 'dir <your_directory> /b'

    create table #files (fname varchar(128) null)
    insert #files exec master.dbo.xp_cmdshell @cmd
    delete #files where fname is null   -- xp_cmdshell returns a trailing NULL row

    After this code is run you have a list of all the files that you want to import. If needed, you can incorporate a filter into your DIR command to filter out anything you don't want.

    Now you're ready to build your loop to BULK INSERT each file.
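    Here's a minimal sketch of that loop, assuming every file shares the same layout and a staging table already exists (the table name Staging and the path C:\csv\ are hypothetical):

    -- walk the file list gathered above and BULK INSERT each file
    declare @fname varchar(128), @sql varchar(8000)

    declare files_cur cursor for
        select fname from #files

    open files_cur
    fetch next from files_cur into @fname
    while @@fetch_status = 0
    begin
        -- FIRSTROW = 2 skips the header line in each file
        select @sql = 'bulk insert Staging from ''C:\csv\' + @fname + ''' '
                    + 'with (fieldterminator = '','', rowterminator = ''\n'', firstrow = 2)'
        exec (@sql)
        fetch next from files_cur into @fname
    end
    close files_cur
    deallocate files_cur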

  4. #4
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    Now you're ready to build your loop to BULK INSERT each file.
    Well ... BULK INSERT requires the table to be present, and we both do not have that luxury ... I have posted on this before and did not get a satisfactory answer ... so I need to think of something else

    http://www.dbforums.com/showthread.php?threadid=980175
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser

  5. #5
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    I'll send you a VB app to do this...
    -bpd

  6. #6
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    I was working on a DTS package using a Dynamic Properties task to pull file names from a table and update the path of a Text Input file, but getting it to iterate is the kicker....
    -bpd

  7. #7
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    The main problem for me is not getting it to iterate, but getting the column sizes and the column names.

    Excel uploads are better because you can make a linked server to them.
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser

  8. #8
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    With a Text Input file in a DTS package, you can set it up to pull the first row as column names. Sizes are, I believe, automatic, but probably default to NVARCHAR(255).
    -bpd

  9. #9
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    So, how would you like to determine the field size?
    -bpd

  10. #10
    Join Date
    Jul 2003
    Location
    San Antonio, TX
    Posts
    3,662
    Well, I BULK INSERT into the same table, into either a TEXT or an IMAGE field. Actually, I have more than one table, and I am dealing with archiving EDI transactions coming from/going to the state. I use the IMAGE datatype to store all their PDFs and DOCs with their daily amendments, and a TEXT field for the actual transaction files. I also implemented archiving on the tables with the TEXT field, so that the database doesn't have to be terabyte-sized for two years' worth of transactions to be available.
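    For illustration, a rough sketch of such an archive table (all names here are hypothetical; the post only specifies a TEXT field for transaction files and an IMAGE field for the PDFs and DOCs):

    create table EdiArchive (
        id       int identity(1,1) not null primary key,
        fname    varchar(128)      not null,                  -- original file name
        received datetime          not null default getdate(),
        doc      image             null,                      -- PDFs and DOCs
        body     text              null                       -- the transaction file itself
    )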

  11. #11
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    Dynamically based on the data ....

    In the case of an Excel file ... I am using an OPENDATASOURCE query and then issuing a SELECT * INTO ... but I am not able to do that with CSV and flat files
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser
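    For reference, a minimal sketch of the Excel approach described above (file, sheet, and destination names are hypothetical; assumes the Jet OLE DB provider is available):

    -- creates the destination table on the fly from the sheet's columns
    select *
    into   ZoneChartFromExcel
    from   opendatasource('Microsoft.Jet.OLEDB.4.0',
               'Data Source=C:\files\zone1.xls;Extended Properties=Excel 8.0'
           )...[Sheet1$]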

  12. #12
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    But if it's a CSV, then the length of a field can change from line to line. I see that working only if it is a fixed-length-field file.
    -bpd

  13. #13
    Join Date
    Jul 2003
    Location
    The Dark Planet
    Posts
    1,401
    That's exactly the problem ... and DTS is not an option for me
    Get yourself a copy of The Holy Book

    The physical order of data in a database has no meaning. - Brett Kaiser

  14. #14
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    Working on it....
    -bpd

  15. #15
    Join Date
    Aug 2003
    Location
    Andover, MA
    Posts
    256
    Is it okay if the fields default to some larger size (VARCHAR(255)), and you then shrink them manually?
    -bpd
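
    Following that VARCHAR(255) idea, a rough sketch of building each table from the file's header line and then bulk-loading the rest (path and table name are hypothetical; assumes xp_cmdshell access as in post #3):

    declare @hdr varchar(8000), @sql varchar(8000)

    -- read the file and keep the first line (the header)
    create table #lines (id int identity(1,1), line varchar(8000) null)
    insert #lines (line) exec master.dbo.xp_cmdshell 'type C:\csv\zone1.csv'
    select @hdr = line from #lines where id = 1

    -- turn "col1,col2,col3" into a CREATE TABLE with every column as varchar(255)
    select @sql = 'create table ZoneChart (['
                + replace(@hdr, ',', '] varchar(255) null, [')
                + '] varchar(255) null)'
    exec (@sql)

    -- load everything below the header line
    bulk insert ZoneChart from 'C:\csv\zone1.csv'
    with (fieldterminator = ',', rowterminator = '\n', firstrow = 2)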
