  1. #1
    Massive data processing - efficient method, please

    Need your expertise to avoid heavy rollback; I also don't want to impact online users.
    We have a table with more than 10 million rows. Every month we need to scan all of them to find the records that qualify under certain criteria. I will not be updating this table, only writing the qualified IDs into a different table for processing further down the line. Apart from the primary key (NUMBER(8)) I don't have any other field to separate the data into groups. The process could take 6-10 hours (rough guess). What is the best way to read and process all the records? One way (maybe not efficient) I can think of is to break the PK IDs into ranges, keep those ranges in another table I create, and then process each range one after another, so I am never holding all 10M records in one cursor, roughly as sketched below.
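    To make it concrete, something like this is what I have in mind (a rough sketch only; BIG_TABLE, QUALIFIED_IDS, PK_ID and the STATUS_CD filter are made-up placeholders for my real names and criteria):
    Code:

    DECLARE
      l_lo    NUMBER := 0;
      l_hi    NUMBER;
      l_max   NUMBER;
      l_chunk CONSTANT NUMBER := 100000;  -- width of each PK range
    BEGIN
      SELECT MAX(pk_id) INTO l_max FROM big_table;
      WHILE l_lo <= l_max LOOP
        l_hi := l_lo + l_chunk;
        -- set-based insert of the qualified IDs for this range only
        INSERT INTO qualified_ids (pk_id)
          SELECT pk_id
            FROM big_table
           WHERE pk_id >= l_lo
             AND pk_id <  l_hi
             AND status_cd = 'QUALIFIES';  -- the monthly criteria go here
        COMMIT;  -- committing per range keeps the rollback/undo small
        l_lo := l_hi;
      END LOOP;
    END;
    /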

    Is this the only way, or is there a more efficient way to handle it?

    Thanks a lot in advance, and I appreciate your help.

  2. #2
    Use partitioning to split the table up; you may need to experiment to find whether range or hash partitioning suits you best. If you can create an index matching the criteria you are going to use to select the records, that would help a lot as well. You may also want to look into bitmap indexes and table compression (a 9.2 feature), which might be useful - see the sketch below.

    There are so many options now that I would suggest reading the Concepts manual to get a good overall picture of what is available, and then trying the candidates out in your test environment to find the best one.
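    For example (a sketch only - the table and column names are invented, and you would carry over your full column list), a compressed hash-partitioned copy plus a bitmap index on the low-cardinality selection column might look like:
    Code:

    -- Hash-partitioned, compressed copy of the big table (9.2 syntax)
    CREATE TABLE big_table_part
    ( pk_id     NUMBER(8)     NOT NULL,
      status_cd VARCHAR2(10)
    )
    COMPRESS
    PARTITION BY HASH (pk_id) PARTITIONS 8;

    -- Bitmap index on the column used by the monthly criteria
    CREATE BITMAP INDEX big_table_part_bix
      ON big_table_part (status_cd);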

    Alan

  3. #3
    Along the lines of what Alan said, look into creating an interim table partitioned by range. First create it with just one partition, and I would suggest NOLOGGING.

    From there it is simple to move all the data across with the EXCHANGE PARTITION command (look that one up - it is very easy, since it swaps segments in the data dictionary rather than copying rows).

    Then split your partitions however you like.

    Here is an example of what I have just described:
    Code:

    -- Partitioned interim table with a single partition (everything
    -- less than 12-SEP-2003), compressed and NOLOGGING as suggested
    CREATE TABLE PART_METER_INTERVAL
    (
      ORG_ID         VARCHAR2(30) NOT NULL,
      DEVICE_ID      VARCHAR2(30) NOT NULL,
      END_DT         DATE         NOT NULL,
      RDNG_NBR       NUMBER(8,2),
      STAT_CD        VARCHAR2(30),
      USAGE_NBR      NUMBER(6,2),
      AGG_DT         DATE,
      AGG_HR_NBR     NUMBER(2),
      AGG_INTVL_NBR  NUMBER(2),
      ACT_RDNG_DT    DATE,
      ACT_RDNG_NBR   NUMBER(8,2)
    )
    COMPRESS NOLOGGING
    PARTITION BY RANGE (END_DT)
    (PARTITION D2003_09_12
       VALUES LESS THAN (TO_DATE('12-SEP-2003','DD-MON-YYYY'))
       TABLESPACE PLAT_TAB_02)
    /

    -- Swap the existing table's data into the single partition;
    -- this is a dictionary operation, so no rows are copied
    ALTER TABLE PART_METER_INTERVAL
      EXCHANGE PARTITION D2003_09_12 WITH TABLE METER_INTERVAL
    /

    -- Now split the single partition into daily slices
    ALTER TABLE PART_METER_INTERVAL SPLIT PARTITION D2003_09_12
       AT (TO_DATE('01-SEP-2003','DD-MON-YYYY'))
       INTO (PARTITION D2003_09_01 TABLESPACE indx,
             PARTITION D2003_09_12)
    /

    ALTER TABLE PART_METER_INTERVAL SPLIT PARTITION D2003_09_12
       AT (TO_DATE('02-SEP-2003','DD-MON-YYYY'))
       INTO (PARTITION D2003_09_02 TABLESPACE indx,
             PARTITION D2003_09_12) PARALLEL 5
    /

    ALTER TABLE PART_METER_INTERVAL SPLIT PARTITION D2003_09_12
       AT (TO_DATE('03-SEP-2003','DD-MON-YYYY'))
       INTO (PARTITION D2003_09_03 TABLESPACE indx,
             PARTITION D2003_09_12) PARALLEL 5
    /
    etc. etc.
    - The_Duck
    you can lead someone to something but they will never learn anything ...
