  1. #1
    Join Date
    Jan 2004
    Posts
    12

    Unanswered: VFP - need to import text file with more than 256 fields

    I wrote a program that uses the low-level file functions to read the text file one line at a time. I then parse the line to get the fields and write them to 2 tables to get around the limitation of 256 fields.

    The program is very slow and I assume it's because I'm only reading in one line at a time. Any suggestions on a more efficient way of handling this?
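    For reference, the approach described above looks roughly like the sketch below. The file name, the tab delimiter, and the 8K line limit are assumptions for illustration:

    ```foxpro
    * Minimal sketch of the line-at-a-time approach described above.
    * File name, delimiter, and line-length limit are assumptions.
    lnHandle = FOPEN("bigfile.txt")
    IF lnHandle < 0
       ? "Could not open file"
       RETURN
    ENDIF
    DO WHILE !FEOF(lnHandle)
       * FGETS() stops at 254 bytes by default, so pass an explicit limit
       lcLine = FGETS(lnHandle, 8192)
       lnCount = ALINES(laFields, lcLine, CHR(9))   && split on tab (VFP 7 and later)
       * ... write laFields[1]..laFields[256] to table one,
       *     laFields[257] onward to table two ...
    ENDDO
    = FCLOSE(lnHandle)
    ```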

  2. #2
    Join Date
    Nov 2002
    Posts
    128
    Provided Answers: 1
    Use Excel Automation via VFP to read in the file.

    Yes, Excel has a record limit of 65,536 rows (records) by 256 columns (fields), but if your files can work within those limitations, it might be much better (read as faster) than parsing text in a low-level read.

    When you are finished with the import of the text file, select the individual columns and rows for export into your individual DBFs.
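    A rough sketch of the automation, assuming Excel is installed; the file name and the cells you pull back are placeholders:

    ```foxpro
    * Hedged sketch of Excel Automation from VFP; names are assumptions.
    loExcel = CREATEOBJECT("Excel.Application")
    loExcel.Workbooks.Open("c:\data\import.txt")   && Excel parses the text on open
    loSheet = loExcel.ActiveWorkbook.ActiveSheet
    * Read values back a cell at a time (row, column) and INSERT into your DBFs
    lcFirst = loSheet.Cells(1, 1).Value
    * ... loop over the rows and columns you need here ...
    loExcel.ActiveWorkbook.Close(.F.)
    loExcel.Quit()
    RELEASE loExcel
    ```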

    Good Luck,
    JRB-Bldr

  3. #3
    Join Date
    Jan 2004
    Posts
    12
    If Excel has a limit of 256 columns, I fail to see how this would be better than appending the file directly into FoxPro.

    Anyway, the files I'm importing have around a million records.

  4. #4
    Join Date
    Dec 2003
    Posts
    104
    I think you've gone about it in about the most efficient way.
    There are a couple things I am going to suggest that may increase performance, but no guarantees.
    If you are using FREAD(), you may consider FGETS() instead to read an entire line.
    Do you have the tables opened exclusively? If not, try that.
    Don't use any indexes when importing. Build them after the import has finished.
    If you are using a network, move the files to your hard drive if you can.
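    The table-setup suggestions above amount to something like this; the table, tag, and field names are assumptions:

    ```foxpro
    * Sketch of the suggestions above; names are assumptions.
    USE import1 IN 0 EXCLUSIVE      && exclusive access cuts locking overhead
    SELECT import1
    DELETE TAG ALL                  && drop indexes before loading
    * ... run the import here ...
    INDEX ON keyfield TAG keytag    && rebuild indexes once the data is in
    ```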
    DSummZZZ

    Even more Fox stuff at
    www.davesummers.net/foxprolinks.htm

  5. #5
    Join Date
    Jan 2004
    Posts
    12
    I am doing all those things.

    I'm thinking of using fread() and fseek() to read the file in large chunks. I'm hoping to find a simpler solution though.

    The 'Hacker's Guide to VFP' suggests putting the entire file into a memo field and using the memo functions to split out the records. I have no idea how effectively VFP will function with a 1GB memo, but I figure that any solution that makes use of VFP's file buffering will probably be a lot more effective than what I'm currently doing.
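    For anyone trying the memo-field route, the setup is something like the sketch below; the table and file names are assumptions:

    ```foxpro
    * Sketch of the memo-field approach; table and file names are assumptions.
    CREATE TABLE holder (MemoText M)
    APPEND BLANK
    APPEND MEMO MemoText FROM "c:\data\bigfile.txt" OVERWRITE
    * The whole file now sits in holder.MemoText; split records out with
    * the memo functions (MLINE(), _MLINE, etc.) instead of low-level reads.
    ```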

  6. #6
    Join Date
    Jan 2004
    Posts
    12
    My testing shows that bringing the file into a memo field results in a huge increase in performance.

    I need to convert all the linefeeds in the memo to carriage returns. I tried 'REPLACE MemoText with STRTRAN(MemoText,CHR(10),CHR(13))'

    I get a 'string too large' error. Any suggestions on how to do this?

  7. #7
    Join Date
    Dec 2003
    Posts
    104
    According to the help file, 'System Capacities':
    "Maximum # of characters per character string or memory variable 16,777,184 "

    Take a look at using _MLINE, MLINE(), MEMLINES(), ATLINE() and so on, for memo fields.
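    Walking the memo line by line with those functions sidesteps the 16MB string limit, since you only ever hold one record in memory. A sketch, assuming the memo table is called holder:

    ```foxpro
    * Sketch of walking the memo with _MLINE; names are assumptions.
    SET MEMOWIDTH TO 8192            && MLINE() wraps at MEMOWIDTH, so set it wide
    _MLINE = 0                       && character offset into the memo
    FOR lnI = 1 TO MEMLINES(holder.MemoText)
       * With the offset argument, MLINE() advances _MLINE past the line it returns
       lcRecord = MLINE(holder.MemoText, 1, _MLINE)
       * ... parse lcRecord and write it out to the two tables ...
    ENDFOR
    ```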
    DSummZZZ

