  1. #1
    Join Date
    Jul 2004
    Posts
    306

    Unanswered: Help tuning an incredibly complex query

    DB2 9.1 FP4a on AIX 5.3

    Hey guys,

    I'm trying to tune a stupidly complex query (with an explain plan to match). The cost was initially estimated at over 750k timerons, and I've managed to get it down to just over 400k with some indexes.
    The only table scan left on a base table is (I think) unavoidable: the column has only 4 distinct values, so not surprisingly the optimizer doesn't use an index on it.
    I've run
    Code:
    RUNSTATS ON TABLE <tname> WITH DISTRIBUTION ON KEY COLUMNS AND DETAILED INDEXES ALL
    for all tables in the query, and I'm just looking for any pointers on where to go next. I'm guessing it would help to reduce some of the sorts, temp table scans, or IXSCAN+FETCH operations, but I'm not having much luck.

    Is the query itself just plain bad?

    I know it'll eat up a lot of time just reading the db2exfmt output, so I appreciate any time you guys take out to give it a looksie.

    Cheers
    Attached Files

  2. #2
    Join Date
    Dec 2007
    Location
    Richmond, VA
    Posts
    1,328
    Provided Answers: 5
    You will not get many responses with that jumbled mess of a text file. Try using code tags and put your SQL in your post rather than in an attachment.

    Dave Nance

  3. #3
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by dav1mo
    You will not get many responses with that jumbled mess of a text file. Try using code tags and put your SQL in your post rather than in an attachment.

    Dave Nance
    Hey Dave,

    It's WAY too big to post in a code tag (many times the character limit). What viewer are you using? It formats just fine for me in EditPad...

  4. #4
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    Did you try optimization level 7?
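    For reference, the optimization class can be raised for a single session without touching the database-wide default (DFT_QUERYOPT). A minimal sketch from a CLP session:
    Code:
    SET CURRENT QUERY OPTIMIZATION = 7;
    -- then re-explain the statement at this level and compare
    -- the db2exfmt output against the original plan
    Setting it database-wide (UPDATE DB CFG ... USING DFT_QUERYOPT 7) would instead affect every statement, which may not be what you want here.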
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390

  5. #5
    Join Date
    Jan 2009
    Location
    Zoetermeer, Holland
    Posts
    746
    Did you feed this query to db2advis?
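    For anyone trying the same, a typical db2advis invocation (the database name and file names here are placeholders) looks something like:
    Code:
    db2advis -d MYDB -i query.sql -t 10 -o advise.out
    where -i names a file containing the statement, -t caps the advise time in minutes, and -o writes any recommended DDL to a file.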

  6. #6
    Join Date
    Jan 2009
    Location
    Zoetermeer, Holland
    Posts
    746
    Why don't you "cut the crap" and clean up the clipboard before you expose this to us?
    Code:
    WHERE    inventory.location = loc.location
    AND 1 = 1
    AND Lower(item.cusleadhouse) LIKE '%mirc%'
    AND 1 = 1
    AND 1 = 1

  7. #7
    Join Date
    May 2009
    Posts
    508
    Provided Answers: 1
    meehange, one thing you might look at is removing all the extra columns that are not needed for the GROUP BY summarization (one obvious example is ' ' as onhand). Once the summarization is processed, then join to any other tables to get the 'display' information. This may include the LOC nested table columns. The reduction in row length will help reduce the sort time, as less data is being manipulated.

    P.S. I would refrain from writing join subqueries in the SELECT list. This just makes the code hard to follow; joins should be in the FROM clause. If you look at the optimized query, you will see that DB2 rewrote the query to do exactly that.
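    To make the suggested shape concrete, here is a sketch with hypothetical column names (summarize first over the narrow columns, then join out for the display data):
    Code:
    WITH summ AS (
        SELECT inventory.item, SUM(inventory.qty) AS onhand
        FROM   inventory
        GROUP  BY inventory.item
    )
    SELECT summ.item, summ.onhand, item.cusleadhouse
    FROM   summ
    JOIN   item ON item.item = summ.item
    The GROUP BY now sorts only the key and the summed column; the wider 'display' columns are picked up after the summarization.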

  8. #8
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by dr_te_z
    Why don't you "cut the crap" and clean up the clipboard before you expose this to us?
    Code:
    WHERE    inventory.location = loc.location
    AND 1 = 1
    AND Lower(item.cusleadhouse) LIKE '%mirc%'
    AND 1 = 1
    AND 1 = 1
    Yeah, thanks for that. This SQL is automatically generated by an out-of-the-box software app, so I have very little control over it. You didn't seriously think a human being would create a statement that complex and then put that rubbish in there just for effect?!

  9. #9
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by dr_te_z
    Did you feed this query to db2advis?

    Yup, it has no further suggestions.

    @Marcus_A - I'll give that a shot... I'm not sure I can get the app to set it for this individual statement, and I'm reluctant to set it database-wide, but I'll test to see the results in any case. Thanks.

  10. #10
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by Stealth_DBA
    meehange, one thing you might look at is removing all the extra columns that are not needed for the GROUP BY summarization (one obvious example is ' ' as onhand). Once the summarization is processed, then join to any other tables to get the 'display' information. This may include the LOC nested table columns. The reduction in row length will help reduce the sort time, as less data is being manipulated.

    P.S. I would refrain from writing join subqueries in the SELECT list. This just makes the code hard to follow; joins should be in the FROM clause. If you look at the optimized query, you will see that DB2 rewrote the query to do exactly that.
    Tell me about it. I've got an app that attempts to write SQL so generic that it'll run on almost any platform... as a result it writes horrible code that performs poorly, is nasty to read, and has headache-inducing 'placeholders' that make you want to retch when you see them. Joy.

    As I said above, I've got very little control over the SQL. In many cases the best I can really hope for is escalating to the product labs and getting them to rewrite the app.

  11. #11
    Join Date
    Jan 2009
    Location
    Zoetermeer, Holland
    Posts
    746
    Quote Originally Posted by meehange
    Yeah, thanks for that. This SQL is automatically generated by an out-of-the-box software app, so I have very little control over it. You didn't seriously think a human being would create a statement that complex and then put that rubbish in there just for effect?!
    I found that in your (generated) code. Luckily, the code as optimized by DB2 itself was free of that crap, so the optimizing compiler is also capable of deleting nonsense SQL. Nice to see that working.

  12. #12
    Join Date
    Jan 2009
    Location
    Zoetermeer, Holland
    Posts
    746
    Quote Originally Posted by meehange
    very little control over the SQL,
    As a DBA you've always got some control over the SQL by supplying only user views to the applications, never plain tables. That way you control the physical layout, and your SQL subroutines are coded in your user view. Your code can also be defined in triggers (the new "instead of" triggers are very powerful, you know).
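    A minimal sketch of that pattern (the schema, view, and column names here are all hypothetical):
    Code:
    CREATE VIEW app.inventory AS
        SELECT location, item, qty
        FROM   base.inventory;

    CREATE TRIGGER app.inventory_ins
        INSTEAD OF INSERT ON app.inventory
        REFERENCING NEW AS n
        FOR EACH ROW
        INSERT INTO base.inventory (location, item, qty)
        VALUES (n.location, n.item, n.qty);
    The application only ever sees app.inventory; the base table can be reorganized behind it, and the INSTEAD OF trigger keeps inserts working through the view.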

  13. #13
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by dr_te_z
    As a DBA you've always got some control over the SQL by supplying only user views to the applications, never plain tables. That way you control the physical layout, and your SQL subroutines are coded in your user view. Your code can also be defined in triggers (the new "instead of" triggers are very powerful, you know).
    I've not looked at those yet. Cheers, I'll check them out.

  14. #14
    Join Date
    Jul 2004
    Posts
    306
    Quote Originally Posted by dr_te_z
    I found that in your (generated) code. Luckily, the code as optimized by DB2 itself was free of that crap, so the optimizing compiler is also capable of deleting nonsense SQL. Nice to see that working.
    Unfortunately my hands are a bit tied even in this regard. The app maintains its own RI inside the data tables using XML, and the schema needs to stay in sync with this XML for the app to work properly. Also, schema changes can be (and are) made via the app, so any change could invalidate a view or drop an added column when the app does this, or when there's an app patch or upgrade, etc.


    We do plan to move decision-support queries like this awful one to a data warehouse in the future, which will give us a lot more control, so at least there's a long-term solution.
