  1. #1
    Join Date
    Aug 2008
    Posts
    7

    Unanswered: Reports - field source

    I'm trying to improve the performance of a report. In production the database is split with a front end for about 20 users. In development, the database is not split. In dev the report displays immediately. In prod, it takes about a minute to display the report. I would like to improve this time.

The report acts like a spreadsheet where most cells use a DSum to determine the content. The other cells are just totals of previously displayed data. The DSums all use the same domain, which is a query involving joined tables.

    An example:

    =Nz(DSum("[Current_Estimated_Cost]","[Qry_CC_Current_Costs]","[Activity] = 'Maintenance & Support' AND [Priority_Procurement] = FALSE AND [Discretionary] = FALSE AND [Risk_Managed] = FALSE"),0)

By rewriting the DSum I was hoping to improve performance.

According to the MS Access help, I should be able to use a SQL statement as the control source, but based on web research I know this is not true.

    I tried:

    SELECT nz(sum([Current_Estimated_Cost]),0) as Sum_Cost FROM Qry_CC_Current_Costs

    I also tried to use the Report On Activate event and the Detail On Format event to use a DAO recordset to populate the report field.

    These attempts resulted in the field being blank or the report not running.

The events can assign a static value like 999, but not a value drawn from a query.

    I'm considering writing the entire report as a module, but not sure if it is worth the effort.

    Attached is an example of the report.

    Any suggestions would be appreciated.

    Thanks

    Mike

  2. #2
    Join Date
    Dec 2002
    Location
    Préverenges, Switzerland
    Posts
    3,740
    i would be tempted to take a single big bite at the backend rather than successive nibbles (the nature of a JET backend being that each attempted nibble stuffs all the data down to the client and the client then selects the gems of interest)

    a recordset is one solution, but then you are denied set-based operations and end up with reams of code.

have you considered building a temp table on the local client for your report?
currentdb.Execute "DELETE * FROM myLocalTemp"
currentdb.Execute "INSERT INTO myLocalTemp SELECT this, that, other FROM someJoinQueryYouAlreadyHave WHERE whatever"

    from there, you report local data - likely even your DSUMs will do the job (though "experts" will shun you for using them).
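izy's two Execute calls could be wrapped in a small refresh routine to run before the report opens. A sketch only: myLocalTemp and Qry_CC_Current_Costs are the names used in this thread, while the Cost_Centre field and the routine name are assumptions.

```vba
' Refresh the local temp table before opening the report.
' Clears last run's rows, then clones the remote data down in one pass.
Private Sub RefreshLocalTemp(ByVal CostCentre As String)
    Dim db As DAO.Database
    Set db = CurrentDb

    db.Execute "DELETE * FROM myLocalTemp", dbFailOnError
    db.Execute "INSERT INTO myLocalTemp " & _
               "SELECT * FROM Qry_CC_Current_Costs " & _
               "WHERE [Cost_Centre] = '" & CostCentre & "'", dbFailOnError
End Sub
```

dbFailOnError makes the Execute raise a trappable error instead of failing silently, which matters when the temp table feeds a whole report.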

    izy
    currently using SS 2008R2

  3. #3
    Join Date
    Dec 2002
    Location
    Préverenges, Switzerland
    Posts
    3,740
    hmmmm
    thinking some more about this, i guess it is mandatory to bring the data down to the client and report the local data.
consider your current report taking a minute - 20 users - it is not beyond the realms of possibility that another user edits something that your report uses halfway into your report generation.
    you end up with:
    q1: 100
    q2: 100
    q3: 100
    q4: 100
    Year: 420
    ...because someone updated q1 halfway thru the report generation.

    izy
    currently using SS 2008R2

  4. #4
    Join Date
    Aug 2008
    Posts
    7
The temp table may be worth considering. It would need a field for each report cell.

    I'm not worried about data updates as the person who does the update is also the one generating the report. The data is grouped by cost centres with users only allowed to get to their own cost centre data.

  5. #5
    Join Date
    Dec 2002
    Location
    Préverenges, Switzerland
    Posts
    3,740
Mike - that "It would need a field for each report cell" seems to be the opposite of what i was trying to suggest.

    just take the raw data down to the client and then throw as many queries/DSums/whatever as you like at the local copy data: it will run as fast as your dev system PLUS the time it takes to clone the data down to the client ONCE.

    throwing multiple Dxxx() at remote data is a performance disaster. looking at your report, it seems that you are using 77 Dxxx() on your remote data - you are fortunate to have a LAN fast enough to complete the report in one minute!

you say "The DSums all use the same domain, which is a query involving joined tables": just take all° the records from that query and stick them in a local temp table and then mess around with them like you do in dev.

    ° "all" needs trial and error: depending on the size of the dataset, index size/uniqueness, join-type, you might be faster or slower with a WHERE to (e.g.) get only the data for the cost-center of interest down to the client.

    in any case, index the JOINed fields in each source table and index anything in the WHERE clause. depending on the specifics of your tables/data/JOINs, you might end up with a scan anyway so this is more trial and error stuff.
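the indexes izy recommends can be added through the Access UI or from code with JET DDL. A sketch under assumed names - Tbl_Costs, Activity and Cost_Centre are illustrative, not objects from the thread:

```vba
' One-off setup: index the fields used in JOINs and WHERE clauses.
' Table and field names here are placeholders for your own schema.
CurrentDb.Execute "CREATE INDEX idxActivity ON Tbl_Costs (Activity)", dbFailOnError
CurrentDb.Execute "CREATE INDEX idxCostCentre ON Tbl_Costs (Cost_Centre)", dbFailOnError
```

as izy notes, whether the optimizer actually uses these rather than scanning is trial-and-error for a given dataset.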

    avoid VBA functions, calculated fields, and concatenation in the clone-to-client query ...do all that stuff at the client on the local data.

    izy
    currently using SS 2008R2

  6. #6
    Join Date
    Aug 2008
    Posts
    7
    Izy, my last reply was a bit premature.

    I am copying just what I need (cost centre specific) into a temp table and then doing the dsums on that data. The temp table is local.

My report, which used to take a minute over the network, now takes 20 seconds. With a little more tweaking, it should improve further.

    Thanks for the help.

  7. #7
    Join Date
    Nov 2004
    Location
    out on a limb
    Posts
    13,692
    Provided Answers: 59
    avoid the domain functions if you can
    you can do a heck of a lot with unbound controls
    get your data together in a single query
then place code in the relevant section of the report's Format event (detail, footer(s), header(s) or whatever)

using domain functions can be a serious resource hog... I took over a report that was taking minutes to run; by ripping out the domain functions and replacing them with code, the runtime came down to seconds
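one way to read this suggestion, sketched with hypothetical names (rsTotals, txtActivity, txtSumCost and the aggregate SQL are illustrations, not objects from the thread): open one recordset over a single GROUP BY query when the report opens, then push values into unbound controls from the section Format events.

```vba
' Sketch of the unbound-control approach: one query gathers all the
' aggregates up front, and each Detail row looks up its own total
' instead of firing a separate DSum per control.
Private rsTotals As DAO.Recordset

Private Sub Report_Open(Cancel As Integer)
    Set rsTotals = CurrentDb.OpenRecordset( _
        "SELECT Activity, Sum(Current_Estimated_Cost) AS SumCost " & _
        "FROM Qry_CC_Current_Costs GROUP BY Activity", dbOpenSnapshot)
End Sub

Private Sub Detail_Format(Cancel As Integer, FormatCount As Integer)
    rsTotals.FindFirst "Activity = '" & Me.txtActivity & "'"
    If Not rsTotals.NoMatch Then Me.txtSumCost = rsTotals!SumCost
End Sub
```

opening the recordset once in Report_Open (rather than per section) is what avoids the timing problem Mike hit with recordsets in the Activate/Format events.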
    I'd rather be riding on the Tiger 800 or the Norton

  8. #8
    Join Date
    Aug 2008
    Posts
    7
    After maximum tweaking, and still using many dsums, I have the report running over the network, in 4 seconds. This is thanks to the local temp table.

I'm still curious, healdem: how do you calculate a sum value and put it into a report field without using DSum? Do I use hidden fields? DAO recordsets would not run in time for the event triggers.

  9. #9
    Join Date
    Nov 2007
    Location
    Adelaide, South Australia
    Posts
    4,049
    I agree with Healdem... dodge using aggregate functions like DSum in a report.

=Sum([FieldName]) in a report header/footer will work fine.
    Owner and Manager of
    CypherBYTE, Microsoft Access Development Specialists.
    Microsoft Access MCP.
    And all around nice guy!


    "Heck it's something understood by accountants ... so it can't be 'that' difficult..." -- Healdem
    "...teach a man to code and he'll be frustrated for life! " -- georgev
