I have some slow-performing stored procs in my database that I need to improve. Before I actually start changing the procs, I need to benchmark their execution times, so I want to run each proc a few times and take the average execution time.
But here lies the problem: after a procedure runs the first time, a lot of the data it needs is cached. How do I clear out the data cache (I'm not too worried about the procedure cache) so that all of my run times are worst-case scenarios with as much physical I/O as possible?
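If you're on SQL Anywhere, a minimal sketch of a cold-cache timing run might look like this (`sa_flush_cache()` is SQL Anywhere's system procedure for emptying the data cache; `my_slow_proc` is a placeholder for the procedure being benchmarked — adjust for your DBMS):

```sql
-- Empty the data cache so the next run has to do physical I/O
-- (a cold-cache, worst-case run).
CALL sa_flush_cache();

-- Time one cold run of the procedure under test.
-- Repeat the flush + call pair for each timed iteration,
-- then average the measured times.
CALL my_slow_proc();
```

Other database products have their own equivalents (for example, SQL Server's `DBCC DROPCLEANBUFFERS`), so it's worth checking what your server offers before resorting to filler queries.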
I guess I could run a query such as `select * from a_massive_table` to fill the cache with junk, but a query like that takes time to run, and I want to do a lot of benchmarking.
Any suggestions would be much appreciated.
Just a note: if you run a `select *` on a massive table, the optimizer will use the MRU replacement policy, and the cache will only change its contents past the wash marker. A better approach would be to dummy-update lots of tuples or, as you wrote, use `sa_flush_cache()`.
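A rough sketch of the dummy-update idea (table and column names here are placeholders, and whether a self-assignment actually dirties pages varies by DBMS — verify on yours):

```sql
-- Touch many rows in a large table so their pages are pulled through
-- the cache and dirtied, displacing the pages your procedure needs.
-- SET col = col leaves the data logically unchanged.
UPDATE big_filler_table
   SET some_column = some_column;
```

Unlike a large MRU-scanned select, updated pages go through the normal buffer replacement path, so this does evict your procedure's cached data.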