  1. #1
    Join Date
    Oct 2003

    Unanswered: Two terrible doubts

    Hi everybody,

    I am writing because I have two questions:
    Does anybody know whether it is possible to create a materialized view that joins two tables when each table sits in a different database?
    For instance, I want to create a materialized view whose source tables are 'Employee' in database 'A' and 'Department' in database 'B'. Is that possible?
    Is there any way to monitor, over a specific period of time, all the tables on which full table scans were executed?
    I mean, at the end of the period I want to see the names of all the tables that were full-scanned, so I can work on each of those tables and avoid the scans.
    I hope somebody replies!

    Thank you all,

  2. #2
    Join Date
    Jan 2003
    Vienna, Austria

    Re: Two terrible doubts

    ad 1)
    I'm not sure, but I think you first have to set up some kind of replication of the remote table and then create the materialized view on that replicated copy.

    ad 2)
    It doesn't look as if this is possible.
    But presumably you want to do this for performance reasons?
    In that case you could use Statspack to identify the statements with the highest cost and optimize those.
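
    Roughly, the Statspack routine would look like this (assuming the PERFSTAT schema is already installed via spcreate.sql; run from SQL*Plus as PERFSTAT):

    -- take a snapshot at the start of the period you want to monitor
    exec statspack.snap;

    -- ... let the workload run for the period ...

    -- take another snapshot at the end of the period
    exec statspack.snap;

    -- then run ?/rdbms/admin/spreport.sql and pick the two snapshot ids;
    -- the "SQL ordered by Gets" / "SQL ordered by Reads" sections list the
    -- most expensive statements in that interval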

  3. #3
    Join Date
    Oct 2003
    ad 1)
    I was thinking of using dblinks and then referencing those dblinks in the FROM clause of the CREATE MATERIALIZED VIEW statement...
    What do you think about it?
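    Something along these lines, I mean (db_b, the connect details and the column names are just made-up placeholders):

    create database link db_b
      connect to scott identified by tiger
      using 'DBB';

    create materialized view emp_dept_mv as
      select e.empno, e.ename, d.deptno, d.dname
      from   employee e, department@db_b d
      where  e.deptno = d.deptno;
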
    ad 2)
    Statspack gives statistics on SQL statements in general...
    I just want to know which tables are being full-scanned during a given period of time...
    I know that Oracle 9i has a performance view called v$sql_plan.
    It stores the execution plans in memory, but I don't know whether it helps me discover the tables that undergo full table scans, understand?
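
    Something like this is what I had in mind, I guess (it would only cover statements whose plans are still cached in the shared pool):

    select object_owner, object_name, count(*) full_scans
    from   v$sql_plan
    where  operation = 'TABLE ACCESS'
    and    options = 'FULL'
    and    object_owner not in ('SYS', 'SYSTEM')
    group by object_owner, object_name
    order by full_scans desc;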

  4. #4
    Join Date
    Feb 2004

    Re: Two terrible doubts

    OKAY, here is what you want to do to identify full table scans.

    1st, here is some useful SQL:

    --- Access v$session_longops.
    --- Access v$session.
    --- Access v$sqlarea.
    --- Identify the SQL (from the library cache) which
    --- have performed table scans. Note that long SQL
    --- statements will overrun the buffer area.
    --- When that occurs, run the next statement.
    select f.sid, s.username, f.opname,
    TO_CHAR(f.start_time,'HH24:MI:SS DD month YYYY') "START",
    (f.sofar/f.totalwork)*100 "PERCENT_COMPLETE",
    f.sql_hash_value, substr(a.sql_text,1,1000)
    from v$session_longops f, v$session s, v$sqlarea a
    where f.opname like '%Table Scan%'
    and f.serial# = s.serial#
    and f.sid = s.sid
    and s.AUDSID <> 0
    and s.username not in ('SYS','SADMIN')
    and f.sql_hash_value = a.hash_value
    order by f.sid, a.sql_text;

    2nd, for any resulting SQL in the substr(a.sql_text,1,1000) column, just run EXPLAIN PLAN on it. You can do both of these in TOAD very easily and quickly. Copy/paste the SQL to the execution area of the screen where you can run EXPLAIN PLAN on it.
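
    If you are not in TOAD, roughly the same thing from SQL*Plus would be (on 9.2 you can use dbms_xplan; the SELECT below is just a placeholder for whatever statement came out of the query above):

    explain plan for
    select * from employee where deptno = 10;

    select * from table(dbms_xplan.display);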

    Caveat - if the SQL is > 1000 characters, it will not be complete. You will then need an additional step. Use the hash value identified for the record in the v$sqlarea view and look at it in the v$sqltext view. For example:
    select * from v$sqltext
    where hash_value = 4278400293
    order by piece asc;

    If you do this, you will have to do some editing to the pieces to fit it back together. Not fun, but you can do it.

    The above approach will provide you with all you need to identify tables subject to full table scans.

  5. #5
    Join Date
    Jul 2003
    you can create MVs using dblinks.
    I just don't think you can use the ON COMMIT refresh clause.

    You would need to schedule the refresh on a time interval instead.
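    Something like this, roughly (the link name, interval and columns are just placeholders):

    create materialized view emp_dept_mv
      refresh complete
      start with sysdate
      next sysdate + 1/24   -- every hour; per the above, ON COMMIT isn't an option here
    as
      select e.empno, e.ename, d.deptno, d.dname
      from   employee e, department@db_b d
      where  e.deptno = d.deptno;
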
    - The_Duck
    you can lead someone to something but they will never learn anything ...
