Overcoming System Slowdown as Database Increases in Size
I just added another 600 records to my still-developing database. Prior to this addition, results for all functions appeared instantaneously as soon as I clicked the button that started the scripting. Now, with the new records incorporated, I am seeing a dramatic slowdown in processing speed. My computer is a fast gaming laptop, so it certainly has the required processing power. Depending on which function I choose (the program currently has 12), the time required for the answers to appear varies from instantaneous to approximately 30 seconds. The variation is certainly tied to the complexity of the scripting used: some of the functions are rather complex, whereas others are straightforward query-type analyses. I followed good programming guidelines and thoughtfully considered how to use the scripting in the most efficient manner. Several people here tried to assist me in my recent thread about exiting a 'For Each Record...' loop when certain conditions were met - this is an example of the efficiency I was after (sadly, though, I never could successfully exit early when multiple nested 'For Each Record...' loops were involved).
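For what it's worth, the nested-loop early exit I was after can be illustrated outside of BD's own script syntax. This is only a sketch in Python with made-up record data, showing the general trick of wrapping the nested loops in a function so a single `return` exits all of them at once:

```python
# Early exit from nested record loops: wrap the loops in a function
# and `return` the moment the condition is met. A flag variable
# checked by each loop works too, but the function form needs no
# extra bookkeeping. The folders/records below are illustration data.

def find_match(folders, target):
    for folder in folders:        # outer "For Each Record..." loop
        for record in folder:     # inner nested loop
            if record == target:
                return record     # exits BOTH loops at once
    return None                   # nothing matched

folders = [[1, 2], [3, 4, 5], [6]]
print(find_match(folders, 4))  # -> 4
print(find_match(folders, 9))  # -> None
```

Whether BD's scripting supports an equivalent of returning from a subroutine mid-loop is the open question from that earlier thread.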
There was an earlier thread where people posted screenshots of their programs. Without seeing those programs in action, it is impossible to determine how quickly the code processed; however, some certainly appeared to be rather detailed programs. I imagine time is a major factor, since processing speed is a much-desired attribute in the eyes of the customer, so I want to make my processes quicker if possible. One area that may be causing the slowdown is that I have COMPLETELY removed ALL queries from my program. I replaced most of the queries with a few extra lines of scripting, as I quickly grew to dislike queries in this program. So, after my rather long introduction, here is my question: does the use of queries, rather than scripting, greatly affect the processing time of the program, or does it even matter?
Any script or query that is designed to check all records in a folder will get slower as the number of records grows. If my database needs to check records in that manner, I design it so that it doesn't have to check all the records.
One way I do this is to write an auto-archive function.
Say, for example, the folder where my main records are stored is called “Production”. Every year the program creates a new folder called “Production YEAR”, where the YEAR part is always today’s year minus 2. So in this case the program would create a folder called “Production 2011”. It would then move all the records created in 2011 out of the Production folder into the “Production 2011” folder.
So the main Production folder would only ever hold two years' worth of records, and each year a further year of records would be archived into the relevant folder.
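The yearly archive step above can be sketched in ordinary code. This is a Python illustration, not actual BD script: records are modelled as dicts with a "created" date, and the folder names and field names are assumptions for the sketch.

```python
from datetime import date

# Auto-archive sketch: each year, records created two years ago are
# moved out of "Production" into a "Production YEAR" folder, so the
# main folder only ever holds the two most recent years of records.

def archive_old_records(production, archives, today=None):
    today = today or date.today()
    year = today.year - 2                      # e.g. run in 2013 -> archive 2011
    name = f"Production {year}"
    archived = [r for r in production if r["created"].year == year]
    remaining = [r for r in production if r["created"].year != year]
    archives.setdefault(name, []).extend(archived)
    return remaining, archives

production = [
    {"id": 1, "created": date(2011, 3, 5)},
    {"id": 2, "created": date(2012, 7, 1)},
    {"id": 3, "created": date(2013, 1, 9)},
]
production, archives = archive_old_records(production, {}, today=date(2013, 6, 1))
print([r["id"] for r in production])                    # -> [2, 3]
print([r["id"] for r in archives["Production 2011"]])   # -> [1]
```

In a real BD database the move would of course operate on actual folders and records rather than Python lists; the point is only the yearly split.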
This is also a very good way to keep your database fast, since switching from folder to folder gets slower as the number of records increases. Spreading the records out as I do solves this problem, and the program even launches much more quickly when you have lots of records.
There are other options too, like simply letting the end user select how far back they want to go, and so on...
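That last option amounts to filtering on a cutoff date before looping over records. A minimal Python sketch of the idea, with an assumed "created" field on each record (not BD's actual record layout):

```python
from datetime import date, timedelta

# Instead of scanning every record, only scan those newer than a
# user-chosen cutoff ("how far back do you want to go?").

def recent_records(records, days_back, today=None):
    today = today or date.today()
    cutoff = today - timedelta(days=days_back)
    return [r for r in records if r["created"] >= cutoff]

records = [
    {"id": 1, "created": date(2013, 1, 1)},
    {"id": 2, "created": date(2013, 5, 20)},
]
# User asks for the last 90 days, "today" pinned for the example:
print([r["id"] for r in recent_records(records, 90, today=date(2013, 6, 1))])  # -> [2]
```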
That is an interesting approach. Unfortunately, in my case, I cannot segregate the data into folders by time intervals due to the nature of the analyses; however, I can do some other dividing of the raw data that should break up the volume of data as you were recommending - this is a method I had not considered. Since I am still new to Brilliant Database, my knowledge of the software is rather limited. I certainly appreciate your sharing the wisdom you have attained with it. I would have already abandoned BD if you had not taken the time to help me as you have. Thanks for all of your help!
The biggest problem with BD is its sluggishness, particularly when the number of records in a many-to-many field is large. You can see a considerable delay while navigating from one folder to another when the folder has a many-to-many field containing many records. The same thing happens when a form has many image fields.