I seem to have a related issue, but please redirect me if I'm not in the right thread…
I am wondering why MySQL would stop accepting records at around half a million when loading a one-million-record file.
I ran a VB executable that loads the database through MyODBC 3.51, set up as a system DSN. It inserts nearly 50,000 records per second on a Serial ATA RAID 0/1 array. Here is the problem: the VB exe keeps running and claims it completed the full one-million-record load, but a COUNT on the table (which is MyISAM) shows only a little over half a million records were actually inserted. This happens repeatedly.
I'm running MySQL 4.1 alpha on XP Pro with 2 GB of memory.

Speaking of memory, it may be noteworthy that memory usage steadily increases while records are being inserted into the table. Around the point where the memory usage levels off (roughly when half a million records have been processed), the VB exe nonetheless keeps counting up to one million (EOF).
Am I missing something about MySQL, or is it XP? What can I do to avoid this kind of table lockout, if that's what it is? Has MyODBC stopped working, or is XP simply not adequate?
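One way to narrow this down is to verify the table's COUNT after every batch rather than trusting the client's loop counter: if an insert fails silently (or the driver drops it), the mismatch shows up at the exact batch where it happens instead of at the end of the run. A minimal sketch of that pattern, in Python with the stdlib sqlite3 module standing in for the real MyODBC/MySQL connection (the names and batch sizes here are illustrative assumptions, not from the original setup):

```python
import sqlite3

# Sketch: bulk-load in batches and cross-check COUNT(*) after each
# commit, so a silent insert failure is caught at the batch where it
# occurs. sqlite3 is only a stand-in for the ODBC connection here;
# with a real MySQL connection the pattern is identical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")

BATCH = 10_000
TOTAL = 100_000  # scaled down from one million for the sketch
inserted = 0

for start in range(0, TOTAL, BATCH):
    rows = [(i, f"row-{i}") for i in range(start, start + BATCH)]
    conn.executemany("INSERT INTO records (id, payload) VALUES (?, ?)", rows)
    conn.commit()
    inserted += len(rows)
    # Cross-check: does the table really hold what we think we inserted?
    (count,) = conn.execute("SELECT COUNT(*) FROM records").fetchone()
    if count != inserted:
        raise RuntimeError(f"after batch at {start}: expected {inserted} rows, table has {count}")

print(f"loaded {inserted} rows, table confirms {count}")
```

If the real load dies silently around the half-million mark, a check like this would at least pin down whether the inserts stop being acknowledged by the server or whether the client keeps "succeeding" while rows never land in the table.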