  1. #1
    Join Date
    May 2004
    Location
    Pune, India
    Posts
    15

    Unanswered: deciding criteria for initial extents and next extents

    hi,

    On what basis should we decide the initial extent and next extent for a particular table or index?

    What are the things I will have to consider before giving values to the tables/indexes to be created, apart from the (approximate) number of rows to be inserted in the table?

    abhivyakti

  2. #2
    Join Date
    Apr 2003
    Location
    Greenville, SC (USA)
    Posts
    1,155
    You need to have an understanding of the data: how many rows, how the data will be accessed, the average row size, etc.

    Here are a couple of "basic" scripts that you can run to get an approximate size for the tables and indexes. You MUST know the number of expected rows (include growth in these calculations!)

    ----------------- For Tables -------------
    set verify off
    undefine pctfree
    undefine avg_row_length
    undefine number_of_columns
    undefine number_of_cols_over_250_bytes
    undefine number_of_rows

    select &&number_of_rows/
    (((100-&&pctfree)*(Value-90))
    /(100*(&&avg_row_length + 3 + &&number_of_columns
    + &&number_of_cols_over_250_bytes))) Blocks_Required
    from V$PARAMETER
    where Name = 'db_block_size';

    ---------------------- For Indexes --------------------

    set verify off
    undefine pctfree
    undefine avg_length_of_indexed_cols
    undefine number_of_indexed_cols
    undefine num_of_ind_cols_over_127_bytes
    undefine number_of_rows

    select &&number_of_rows/(((100-&&pctfree)*(Value-161))
    /(100*(&&avg_length_of_indexed_cols+8+&&number_of_indexed_cols
    + &&num_of_ind_cols_over_127_bytes))) Blocks_Required
    from V$PARAMETER
    where Name = 'db_block_size';
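
    As a sanity check, the same arithmetic can be run outside the database. This is a minimal sketch in Python, with `block_size` standing in for the db_block_size value the scripts read from V$PARAMETER; the row counts and sizes in the example are made-up values, not figures from this thread:

    ```python
    import math

    def table_blocks(num_rows, pctfree, avg_row_length, num_cols,
                     cols_over_250, block_size):
        # Usable space per block: (100 - pctfree)% of (block_size - 90),
        # exactly as in the table script above.
        usable = (100 - pctfree) * (block_size - 90) / 100
        # Per-row cost as the script's constants imply: average row length,
        # plus 3 bytes of row overhead, plus 1 byte per column, plus 1 more
        # for each column over 250 bytes.
        row_cost = avg_row_length + 3 + num_cols + cols_over_250
        return math.ceil(num_rows / (usable / row_cost))

    def index_blocks(num_rows, pctfree, avg_ind_col_len, num_ind_cols,
                     ind_cols_over_127, block_size):
        # Same shape as the index script: 161 bytes of per-block overhead
        # and 8 bytes of per-entry overhead.
        usable = (100 - pctfree) * (block_size - 161) / 100
        entry_cost = avg_ind_col_len + 8 + num_ind_cols + ind_cols_over_127
        return math.ceil(num_rows / (usable / entry_cost))

    # Hypothetical example: 200,000 rows, pctfree 10, 80-byte average row,
    # 12 columns (none over 250 bytes), 8K blocks.
    print(table_blocks(200_000, 10, 80, 12, 0, 8192))
    # A two-column index on the same table, 20-byte average indexed length.
    print(index_blocks(200_000, 10, 20, 2, 0, 8192))
    ```

    The division gives blocks required; rounding up with `math.ceil` matches the fact that a partial block still consumes a whole block.
    
    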

    HTH
    Gregg

  3. #3
    Join Date
    May 2004
    Location
    Pune, India
    Posts
    15

    rows statistics

    hi Gregg,

    Thank you very much for this guideline...

    My problem is deciding the stats based on the existing ones. I have done some exercises on sample data of about 3,000 rows, and have to come up with figures for about 2 lakh (200,000) records (with the concerned index data to be calculated separately for those rows), and then a monthly growth of about 20k rows...

    I was not able to decide on what basis I should choose, though I have the stats for the 3,000-row sample, and will also be getting another sample's stats for 80k rows...

    But I think your script should help me get that.
    thanks again

    regards
    abhivyakti
    pune, india

  4. #4
    Join Date
    May 2004
    Location
    Pune, India
    Posts
    15
    hi Gregg,

    I first thought that the answer you provided was good enough, but now I am confused again...

    Even though I know that initially I am going to have a few lakh rows and then some thousands of rows monthly, I still do not understand part of your query above:

    select &&number_of_rows/(((100-&&pctfree)*(Value-161))
    /(100*(&&avg_length_of_indexed_cols+8+&&number_of_indexed_cols
    + &&num_of_ind_cols_over_127_bytes))) Blocks_Required
    from V$PARAMETER
    where Name = 'db_block_size';

    Why do we also have to count the number of columns here?

    Can we do something using the AVG_ROW_LEN here, based on the data available in USER_TABLES after running dbms_ddl.analyze_object('TABLE', ... with the COMPUTE option?

    I do appreciate your reply...

    thanks
    abhivyakti

  5. #5
    Join Date
    Apr 2003
    Location
    Greenville, SC (USA)
    Posts
    1,155
    The single query above is for indexes: you tell the query how many columns are in the index and how many of them are over 127 bytes. Those constants account for the "header" overhead of the index itself. The same holds true for the other query, which accounts for the overhead of each row in a table. AVG_ROW_LEN is the average row length of the data. The scripts are sizing the data and the overhead involved as well.

    HTH
    Gregg

  6. #6
    Join Date
    May 2004
    Location
    Pune, India
    Posts
    15
    hi Gregg

    Sorry to bother once again

    But see this query you had given for tables:

    select &&numrows /
    (((100-&&pctfree)*(Value-90))
    /(100*(&&avg_row_len + 3 + &&numcols + &&numcols_over_250_bytes))) Blocks_Required
    from V$PARAMETER
    where Name = 'db_block_size';

    Now,
    I understand that numrows and the number of columns in a table are important as far as occupying space is concerned, and I also understand the significance of pctfree. But what I don't understand is:

    1) Why do you say Value-90, that is, db_block_size - 90? What is the significance of reducing it by 90 bytes?

    2) Why are you adding 3 in the second bracket, where we say avg_row_len + 3 + numcols?

    Please explain.
    Thanks a lot for the earlier response as well.
    regards
    abhivyakti

  7. #7
    Join Date
    Apr 2003
    Location
    Greenville, SC (USA)
    Posts
    1,155
    I got those numbers a long time ago from Oracle ... There are 90 bytes of overhead in each Oracle block .... There are 3 bytes of overhead for each column in a table ....
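
    Plugged in, the 90-byte figure fixes the usable data space per block. A quick sketch of that piece of the formula (the pctfree of 10 is an assumed example value, not from the thread):

    ```python
    # Usable data space per block implied by the 90-byte block overhead
    # and the pctfree reserve, as in the table-sizing script above.
    def usable_bytes(block_size, pctfree):
        return (100 - pctfree) * (block_size - 90) / 100

    # Example with common block sizes and an assumed pctfree of 10.
    for bs in (2048, 4096, 8192):
        print(bs, usable_bytes(bs, 10))
    ```
    
    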

    HTH
    Gregg

  8. #8
    Join Date
    Sep 2003
    Location
    Virginia, USA
    Posts
    246
    This has always been a lot of work, and most people get close to a good number but always feel it's not good enough. That's why I switched to locally managed tablespaces with auto segment management. Makes my life much simpler, and Oracle seems to do a good enough job. Could I do better? Maybe, but I can also spend the time doing more important things.

    Just an observation,
    MarkRem
    Author, Oracle Database 10g: From Nuts to Soup
    http://www.remidata.com/book_nuts2soup.htm
