I have a pretty involved query that is based on a lot of calculations made to a temporary holding table (using DAO recordsets and other calculations in VBA), completed just before the temp table is queried, with a bunch of simple If statements returning field values. The user has two options: view the report that is based off this query, or view the query directly (for export to Excel). The problem I have is at my client locations: when you view the report, everything works great; however, if the client chooses to view the query directly (datasheet view), the fields created from these If statements return "#Error" instead of the proper response. If I immediately go into design view and immediately back to datasheet view, the query returns the correct response.
I don't have this problem on my machine; it only happens at the clients' locations (multiple locations - still verifying whether it happens at all of them). I verified that all the library references are there, and even went to one of the locations, removed them, and added them back to the database.
I do know that my machine is much faster than the clients' PCs, but I don't really believe that CPU speed is the issue - but who knows.
Well, by closing and reopening the query, you are "Requerying" it.
Is the query open while these behind-the-scenes operations take place? I read "temp table", which implies the data either vanishes or changes during the recalc - which could "upset the query". One thing I've noticed about queries, at least when drawing from ODBC sources, is that if one field fails, it can trigger a chain reaction that causes the whole query to fail (in my case it was an invalid date coming into a date field - as soon as that page was reached, the whole query crashed).
Personally, I wouldn't let the client directly access the query. You can set up a form linked to the query and set the form to datasheet view. This does lots of little things for you, like improving performance, making problems easier to diagnose, and opening up the functionality of a form (like building a filter without going into design view).
Forms export very nicely to Excel - the fields go in the same order as the "tab order", and unlike exporting reports, the data types in Excel will match the data types in Access.
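To illustrate, here is a minimal sketch of exporting a form to Excel with DoCmd.OutputTo; the form name, button name, and file path are made up for illustration, and you'd substitute your own:

```vba
' Sketch: export a form bound to the query out to an Excel file.
' "frmExport" and the output path are hypothetical names.
Private Sub cmdExport_Click()
    ' acOutputForm + acFormatXLS exports the form's data to Excel,
    ' preserving the Access data types in the spreadsheet.
    DoCmd.OutputTo acOutputForm, "frmExport", acFormatXLS, _
        "C:\Temp\Export.xls", False
End Sub
```

Since the form is pre-filtered and bound to the query, the user never touches the query object directly.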
The query is opened after all the procedures are run. They need to be able to export the data to Excel, which is why they have access to it as a snapshot of the original data in the table, and the data is pre-filtered by their criteria selection. Really fast and really slow machines don't have any problem with the whole process; it is machines between 1.5 and 2.5 GHz that have problems. On one machine, where normally an entire column of data errors out, half of the data displays properly and half errors out.
I am curious about this improved performance you talk about - how does that work?
"On one machine, where normally an entire column of data errors out, half of the data displays properly and half errors out."
This is very similar to the symptom I had with the ODBC – each “page” of information was displayed until it hit a bad value, and then the ENTIRE column went, even if I went back (often, the whole query went).
I suggest taking the equations out of the query, putting them in a module as functions, and placing error traps in the functions. In the query, call the function.
This will improve performance for a reason I’ll state below, but more to the point, I suspect that a “bad value” is coming up from time to time, and in the query, that bad value is “crashing” the whole query.
By doing this, the “bad value” will be caught when the function fails in the code, which will spring up a “debug window” and allow you to see exactly which function failed. Adding a counter will let you determine which record caused the failure. And it doesn’t have to actually be an invalid value – it could be due to the number of milliseconds it takes for the data to get from the server to the client not matching up with when the query expects its next piece of information for a calculation (again I refer to my many long hours of dealing with ODBC headaches).
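A minimal sketch of what such a module-level function might look like; the function, argument, and field names here are hypothetical, standing in for whatever If logic the query expressions actually contain:

```vba
' Sketch: a query expression moved into a standard module with an
' error trap. "CalcMargin", "Price", and "Cost" are made-up names.
Public Function CalcMargin(ByVal Price As Variant, _
                           ByVal Cost As Variant) As Variant
    On Error GoTo Trap
    ' The same If logic that used to live inline in the query:
    If IsNull(Price) Or IsNull(Cost) Or Price = 0 Then
        CalcMargin = Null
    Else
        CalcMargin = (Price - Cost) / Price
    End If
    Exit Function
Trap:
    ' While debugging, comment this trap out so execution stops on the
    ' offending line; a module-level counter can identify the record.
    Debug.Print "CalcMargin failed: " & Err.Description
    CalcMargin = Null
End Function
```

Returning Null from the trap keeps a single bad record from taking the whole column down with it.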
But I end up at Form versus Query; it’s probably more preference than requirement, but the form provides you with a lot more control over the situation, both as a developer and in debug mode. Plus, any code in the form only has to compile once each time the form is run. You have far more control over how it runs (and therefore how it can break) in a form. The form is the object designed for displaying information on the screen. And the form also exports very nicely to Excel (I prefer to use a second form set up specifically for Excel).
The performance aspect of using module-based functions in your query has to do with “compiling the code”. When an equation is in a query, Access has to compile the expression and then perform the calculation on every record. This doesn’t become noticeable until your number of records becomes very large, though. When using a module-level function, the expression is only compiled once, and then the query only has to evaluate it for each record. Most important, though, is that it moves where the problem manifests itself when there’s an issue. It’s far easier to debug a problem in code, especially when it stops and highlights the specific line of code it didn’t like.
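To make the swap concrete, here is a sketch of replacing an inline query expression with a call to a public module function via a DAO QueryDef; the query, table, field, and function names are all made up, and you could just as easily make the same edit in the query's SQL view:

```vba
' Sketch: point an existing query at a module-level function instead
' of an inline expression. All object names here are hypothetical;
' "CalcMargin" is assumed to be a Public Function in a standard module.
Sub MoveExpressionToFunction()
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qryExport")
    ' Before: an inline IIf expression compiled per record, e.g.
    '   SELECT ..., IIf([Price]=0, Null, ([Price]-[Cost])/[Price]) AS Margin
    ' After: one call to a function that is compiled once:
    qdf.SQL = "SELECT ID, Price, Cost, " & _
              "CalcMargin([Price],[Cost]) AS Margin " & _
              "FROM tblTempHolding;"
End Sub
```

The query still evaluates the function for each record, but the expression itself is no longer recompiled, and any failure surfaces in the function where it can be trapped and inspected.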
I’m not an expert by any means; but I hope my experiences provide some possibilities for you to explore!
SBaxter, if it means anything, tcace has offered a number of good solutions. Unless you're into programming and are comfortable with error handling, I think his/her suggestion to create a form is the easiest and most effective way to solve your problem -- especially since you can instruct the form to display the data in datasheet view - essentially showing it the same way that it appears in the query.
I think your data displayed will be more stable once it pulls into a form since there won't be any requerying or other behavior unless you do it deliberately.