  1. #1
    Join Date
    Jan 2012
    Posts
    37

    Unanswered: Importing from a folder

    Hi guys, I hope you can help.

    This question was covered previously, but not for a complete noob like me.

    I want to import thousands of records into BD, but when I send the entire file the import takes hours and comes to a complete halt in the middle.

    My plan is to split the big CSV into smaller parts so the import goes quicker.

    Is there a way to tell BD to watch a folder and process the 200-odd CSV files?

    Currently it takes almost 8 hours for my CSV and then dies. It seems BD exports the file somewhere as a temp file and only imports into the database when done; I can only assume this because the database file size does not grow during the import.

    Thanks in advance

    Pierre

  2. #2
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    Can you "divide and conquer" your CSV import file?

    Break your CSV into small pieces... You'll have to decide what small means to you, something that makes sense. Copy each piece into a file with a constant name such as C:\BDImport.csv and write a BD script to import that one file. Rinse and repeat as often as necessary to process your import file in pieces small enough for BD to comfortably digest.
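
    Not BD-specific, but if you want to script the splitting itself, here is a minimal Python sketch (the source path and chunk size are placeholders of my own; adjust to taste). It writes numbered pieces and repeats the header row at the top of each one:

    Code:
    import csv

    SOURCE = r"C:\big_export.csv"   # placeholder: path to the full export
    CHUNK_ROWS = 10_000             # decide what "small" means for your setup

    with open(SOURCE, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)       # repeat the header in every piece
        out = writer = None
        part = 0
        for i, row in enumerate(reader):
            if i % CHUNK_ROWS == 0:  # start a new numbered piece
                if out:
                    out.close()
                part += 1
                out = open(rf"C:\BDImport_{part:03d}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out:
            out.close()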

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  3. #3
    Join Date
    Jan 2012
    Posts
    37
    Quote Originally Posted by Pat Phelan View Post
    Can you "divide and conquer" your CSV import file?

    Break your CSV into small pieces... You'll have to decide what small means to you, something that makes sense. Copy each piece into a file with a constant name such as C:\BDImport.csv and write a BD script to import that one file. Rinse and repeat as often as necessary to process your import file in pieces small enough for BD to comfortably digest.

    -PatP
    It works - superfast

    For those who want to know how to split the CSV, get this download - it works like a charm:

    FXFishermans Splitter

  4. #4
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    Provided Answers: 54
    I'd just cheat and use a PowerShell, AWK, or Perl script... I think this is one case where everyone has tools that they prefer, and as long as your tools work for you then that's all that matters!
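
    For the "rinse and repeat" part, a rough Python driver could work too (same caveats: the paths are placeholders, and since I don't know of a documented BD command line, this just pauses while you run the BD import script on each piece):

    Code:
    import glob
    import shutil

    TARGET = r"C:\BDImport.csv"   # the constant name the BD import script reads

    # walk the numbered pieces produced by the splitter above, in order
    for piece in sorted(glob.glob(r"C:\BDImport_*.csv")):
        shutil.copyfile(piece, TARGET)
        input(f"Copied {piece} -> {TARGET}. Run the BD import, then press Enter...")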

    -PatP
    In theory, theory and practice are identical. In practice, theory and practice are unrelated.

  5. #5
    Join Date
    Sep 2011
    Location
    Australia
    Posts
    264
    Provided Answers: 2
    Have you tried opening it in Excel (or a clone) and saving as an XLS file, then importing into BD? I have used this with great success on up to 50,000 TXT records.
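
    If you want to automate that Excel round trip, here is a rough Python sketch using the third-party xlwt package, which writes the legacy .xls (Excel 97-2003) format; note that format is hard-capped at 65,536 rows per sheet. Paths are placeholders and I have not tested the output against BD, so treat it as a starting point only:

    Code:
    import csv
    import xlwt  # third-party: pip install xlwt (writes legacy .xls files)

    SOURCE = r"C:\BDImport.csv"   # placeholder: one CSV piece
    DEST = r"C:\BDImport.xls"

    wb = xlwt.Workbook()
    ws = wb.add_sheet("Sheet1")

    with open(SOURCE, newline="", encoding="utf-8") as src:
        for r, row in enumerate(csv.reader(src)):
            if r >= 65536:        # hard row limit of the .xls format
                break
            for c, value in enumerate(row):
                ws.write(r, c, value)

    wb.save(DEST)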

  6. #6
    Join Date
    Jan 2012
    Posts
    37

    Import Issues

    Quote Originally Posted by tamcind View Post
    Have you tried opening it in Excel (or a clone) and saving as an XLS file, then importing into BD? I have used this with great success on up to 50,000 TXT records.
    I have 400,000+ records, and BD v9 can only import Excel 2003 files, which limits me to 6,500 records at a time. How did you manage to import an XLS file?

  7. #7
    Join Date
    Jan 2012
    Posts
    37

    Xls

    Quote Originally Posted by tamcind View Post
    Have you tried opening it in Excel (or a clone) and saving as an XLS file, then importing into BD? I have used this with great success on up to 50,000 TXT records.
    I would love to import the XLS file, as I migrated this data from Access and would rather not have to reformat the data just to get it to export correctly as a CSV for BD.

    Please share your skills with regard to this.

    BD is 100,000 times more adaptable than Access, FileMaker, or even Alpha Five - but importing really sucks.

  8. #8
    Join Date
    Jan 2012
    Posts
    37

    Update

    It's taking 40 to 50 minutes per CSV file.

    There are 41 CSV files and it's killing me. I wish the developer ("Michail"?) would wake up and make BD the product it could and should have been: an "ACCESS KILLER".

    As a comparison, FileMaker imports the same 40,000 records in about a minute.

  9. #9
    Join Date
    Sep 2011
    Location
    Australia
    Posts
    264
    Provided Answers: 2
    Hello
    I have not done an import for a few weeks, but just to confirm I did a quick test with 50,000 lines (rows) created with Excel 2000 (the Excel limit is 65,536 rows - not sure where the 6,500 limit comes from). To my disbelief, importing was very slow; I then tried an Access MDB version of the same file and it was still slow. This was not the case last month, when I did a 200,000-record test (in 4 x 50,000-record bites) and remember watching the progress bar steadily moving to the right - not the grass-growing speed tonight (I have installed V10 but am not using it for this test).
    I closed down the desktop and tried it on my little netbook running 9.42 - a quick test with a 10,000-row XLS file was all done in 30 seconds.
    Something odd here - I will do more tests to find the cause, and will try to report back tomorrow.
    I have always found CSV/TXT import unusable on anything over 1,000 lines.

  10. #10
    Join Date
    Jan 2012
    Posts
    37

    Still at it

    Quote Originally Posted by tamcind View Post
    Hello
    I have not done an import for a few weeks, but just to confirm I did a quick test with 50,000 lines (rows) created with Excel 2000 (the Excel limit is 65,536 rows - not sure where the 6,500 limit comes from). To my disbelief, importing was very slow; I then tried an Access MDB version of the same file and it was still slow. This was not the case last month, when I did a 200,000-record test (in 4 x 50,000-record bites) and remember watching the progress bar steadily moving to the right - not the grass-growing speed tonight (I have installed V10 but am not using it for this test).
    I closed down the desktop and tried it on my little netbook running 9.42 - a quick test with a 10,000-row XLS file was all done in 30 seconds.
    Something odd here - I will do more tests to find the cause, and will try to report back tomorrow.
    I have always found CSV/TXT import unusable on anything over 1,000 lines.
    This is so sad

    Since my original post this DB is still importing - I am at 120,000-odd records.

    What you mention is quite interesting - I have heard that some apps do not like multi-core PCs, and that you can drop the processor count for the application.

    I tried this, but in vain - still slow as molasses.

    I am importing 10,000 records at a time and there are 45 of them - yeeehaawwww.

  11. #11
    Join Date
    Apr 2013
    Posts
    226
    Any possibility v10 has replaced a shared file?

  12. #12
    Join Date
    Sep 2011
    Location
    Australia
    Posts
    264
    Provided Answers: 2
    Just tested on the netbook while going to work with 50,000 records (fname, lname, county, email) and it was done in about 3 minutes. Will try with more fields tonight and test on the desktop again.
    The Excel used on the netbook is 2007, but I assume the format is the same (saved as the same 2000 file type).
    Regards
    David

  13. #13
    Join Date
    Apr 2013
    Posts
    226
    There are different builds of 9.42. I've seen 5502 and 5509.

  14. #14
    Join Date
    Jan 2012
    Posts
    37
    I am using a Core i5 PC with all the goodies; it will be a shame if BD is actually faster on a smaller PC.

    Attached is an example of the CSV; it's not very complex.

    I left it on the entire night, and out of 480,000 records it only imported about 55,000.
    Attached Files

  15. #15
    Join Date
    Sep 2011
    Location
    Australia
    Posts
    264
    Provided Answers: 2
    Latest result on the netbook (1.6 GHz single core / 2 GB RAM - pretty basic) is 5 minutes to import 65,000 records with 12 fields into a database with 100,000 existing records (no optimization done yet).
    The interesting thing is that the record counter in the top right does not update its value until the import is finished, and the progress bar is fairly consistent in its percentage steps. In other words, it seems to read an XLS line and immediately write it to the database without any refreshing (that's good). Whereas in slow mode on the desktop, the counter rolls over every few seconds - is it reading, writing AND REFRESHING the whole database after each write? That could explain the very slow operation, which only gets worse as record numbers increase.
    Anyway, I will now play with the desktop over the next 24 hours and try whatever I can think of to match the netbook - logically it should be twice as fast, and it was similar last month. I am sure the developer could answer this, but I am still waiting on a reply to the V10 problems (3 days and counting).
    Regards
    David
