I have been following fireant911's progress (and frustrations) with interest, and I'm sure many members have learned something new about BB (good and bad).
As we have no support from BB, I was wondering what other limitations we might face. One area I just experimented with is the number of records in a folder. I tried importing 100,000 records (Excel .xls format) into a test database, but the import kept freezing at 97% (it reached that point in only a few seconds). Dropping back to 50,000 appeared to work fine, so I did that three more times to build a 200,000-record folder. A text-find query on this folder responded quite well, roughly 2-3 seconds for a result. Are there any members using BB with large data sets (that is, 100,000+ records) with success? So far my own use is mainly in the 1,000 to 10,000 range.
PS. I used "Spawner.exe", a free program, to generate the test data, with Excel to format it.
I have some databases running with 30,000+ records, and they grow by about 8,000 a year. Their speed is really quick. To keep an eye on the number of records in a folder, I have written a script that runs on a Timer when the database is launched. It quickly counts the records in each folder, and if it ever finds more than 70,000 it displays a message telling the operator to contact me to arrange service, as the database needs maintenance.
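For anyone curious, the logic of that check is simple enough to sketch. Since BB's own script language isn't shown here, this is a hypothetical Python analogue; the folder names, counts, and the `check_folder_sizes` helper are all made up for illustration, with only the 70,000 threshold taken from the post above.

```python
RECORD_LIMIT = 70_000  # threshold from the post above; adjust to taste

def check_folder_sizes(folder_counts, limit=RECORD_LIMIT):
    """Return the names of folders whose record count exceeds `limit`."""
    return [name for name, count in folder_counts.items() if count > limit]

# Example folder counts (made-up numbers for illustration)
folders = {"Invoices": 31_250, "Archive2022": 72_400, "Contacts": 8_900}

for name in check_folder_sizes(folders):
    print(f"Folder '{name}' has over {RECORD_LIMIT} records - "
          "please contact the administrator to arrange maintenance.")
```

In BB you would run the equivalent of this on a Timer at launch, as described above, rather than as a standalone program.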
The problem I find with speed comes down to how complex your form is, and it seems Brilliant Database doesn't make use of more than one CPU core.
To get around that I use much the same idea as above. On launch, a script runs and checks how old the records in each folder are.
Once records are more than two years old, they are moved out of the main folder into archive folders that are auto-created when needed.
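The age-based archiving step can be sketched the same way. Again this is a hedged Python illustration, not BB script: the record format (a date paired with a payload) and the `split_for_archive` helper are assumptions made for the example, with only the two-year cutoff taken from the post.

```python
from datetime import date, timedelta

ARCHIVE_AFTER = timedelta(days=2 * 365)  # roughly the two-year cutoff above

def split_for_archive(records, today=None):
    """Split records into (keep, archive) lists by age.

    Each record is a (created_on: date, payload) tuple; anything older
    than ~two years goes to the archive list.
    """
    today = today or date.today()
    keep, archive = [], []
    for created_on, payload in records:
        if today - created_on > ARCHIVE_AFTER:
            archive.append((created_on, payload))
        else:
            keep.append((created_on, payload))
    return keep, archive
```

The archive list would then be written into a per-period archive folder, creating the folder first if it doesn't exist yet, which keeps the main folder small so form-heavy queries stay fast.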
So technically, there are ways around many of the speed and size problems you might face.