What is faster for information retrieval?
Every morning I need to run thirteen queries against a database table over ODBC. I'm guessing the table holds around 10-15 million rows and roughly 50 columns. Each query aggregates a year of data by month, broken out by various characteristics, and takes about five to ten seconds to run. I plan to run these queries from VBA and store their results in an array that will be printed into an Excel workbook.
My question is this: would it be faster to first pull the subset of data my report needs into a local table (approximately 150k rows, about 10 columns) and then run the thirteen queries against that smaller data set?
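The staged-table idea can be sketched roughly as follows, using Python's built-in sqlite3 in place of the real ODBC source (the table and column names here are invented for illustration, not taken from the poster's database):

```python
import sqlite3

# In-memory database standing in for the real back end.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical source table standing in for the large ODBC table.
cur.execute("CREATE TABLE big_table (sale_month INTEGER, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO big_table VALUES (?, ?, ?)",
    [(m, r, 100.0 * m) for m in range(1, 13) for r in ("East", "West") for _ in range(50)],
)

# Step 1: stage only the rows and columns the report needs into a small local table.
cur.execute("CREATE TABLE report_stage AS SELECT sale_month, region, amount FROM big_table")

# Step 2: run each of the report's aggregate queries against the small staged table
# instead of hitting the multi-million-row source every time.
rows = cur.execute(
    "SELECT sale_month, region, SUM(amount) "
    "FROM report_stage GROUP BY sale_month, region "
    "ORDER BY sale_month, region"
).fetchall()
print(rows[0])  # → (1, 'East', 5000.0)
```

The payoff comes from paying the expensive scan of the big table once, then running all thirteen aggregations against the 150k-row copy.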
If anyone has any advice, comments, or critiques, they would be much appreciated.
Another option, if the back end supports it, is to use stored procedures or pass-through queries to force the processing to happen on the back end. That way only the result set goes across the wire.
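The difference can be sketched with Python's sqlite3 standing in for the back end (names are hypothetical): instead of fetching every detail row and aggregating client-side, send one aggregate query and fetch only the summary rows.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE sales (sale_month INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [(m, 10.0) for m in range(1, 13) for _ in range(1000)],
)

# Client-side aggregation: every detail row crosses the wire.
detail = cur.execute("SELECT sale_month, amount FROM sales").fetchall()
print(len(detail))   # 12,000 rows transferred

# Server-side aggregation: only one summary row per month crosses the wire.
summary = cur.execute(
    "SELECT sale_month, SUM(amount) FROM sales GROUP BY sale_month"
).fetchall()
print(len(summary))  # 12 rows transferred
```

With a real pass-through query the SQL is sent verbatim to the server, so the 10-15 million rows never leave the back end; only the handful of monthly totals do.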
This is what I do. We run a massive AS/400 here, and all the data I work with comes over ODBC from the server. I set up a back-end database with an AutoExec macro (which runs make-table queries for the data I need) scheduled for approximately 5 AM. When the users come in, the data is fresh and the reports run instantly.
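The make-table step above is essentially dropping and rebuilding a local snapshot on a schedule. A minimal sketch of that refresh, again with sqlite3 and invented names standing in for the real source:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical remote source table.
cur.execute("CREATE TABLE source_data (id INTEGER, value REAL)")
cur.executemany("INSERT INTO source_data VALUES (?, ?)", [(i, i * 1.5) for i in range(100)])

def refresh_snapshot(cur):
    """Rebuild the local snapshot table; in practice this runs on a schedule (e.g. ~5 AM)."""
    cur.execute("DROP TABLE IF EXISTS snapshot")
    cur.execute("CREATE TABLE snapshot AS SELECT id, value FROM source_data")

refresh_snapshot(cur)
count = cur.execute("SELECT COUNT(*) FROM snapshot").fetchone()[0]
print(count)  # 100 rows cached locally
```

Because the drop-and-recreate is idempotent, rerunning it each morning always leaves one fresh copy for the day's reports.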