  1. #1
    Join Date
    Sep 2009
    Posts
    5

Unanswered: How to decrease the time of inserting large data

    Hi all guru's,

I have a bit of a problem here. I've written code that will insert data automatically into a staging table. The problem is that the data that needs to be inserted is quite large (around 25000) and it's all different. The time taken for this is very long (around 6 hours). Could any of you suggest a method that would decrease the time taken?

A big thank you to all of you.

    Thanks

  2. #2
    Join Date
    Jun 2004
    Location
    Liverpool, NY USA
    Posts
    2,509
Wow, 6 hours for a lousy 25000 records? You should be able to insert billions in that time. Show us the insert code and a few sample records.
    Bill
    You do not need a parachute to skydive. You only need a parachute to skydive twice.

  3. #3
    Join Date
    Jul 2003
    Posts
    2,296
Index and improve your code.
My guess is you are using a SQL query to load the table.
If that is the case, then it is the CODE that is slow and not the LOAD.
    - The_Duck
    you can lead someone to something but they will never learn anything ...

  4. #4
    Join Date
    Aug 2009
    Location
    Olympia, WA
    Posts
    337
Ahhh, I choked on my coffee when I read 25000.

But based on the information you've given, I'd suggest making sure the "Turbo" button on the front of the computer is pressed. If you want more than a half-humorous reference to 486-era x86 technology, then you'll need to give us some more details.

How long does it take to load into the staging table?
What method are you using to load the staging table?
What method are you using to transform into the target table?
Are you truncating the target; doing inserts, updates, and deletes; or only inserts?
Is your target table indexed?
Are there triggers on the target table?

  5. #5
    Join Date
    Jan 2004
    Location
    Croatia, Europe
    Posts
    4,094
    Provided Answers: 4
    Ummm ... where did OP actually say that "25000" means "records"?

  6. #6
    Join Date
    Aug 2009
    Location
    Olympia, WA
    Posts
    337
    Well the last time I tried inserting chickens into my database, things went horribly wrong. A week later I tried converting them to CLOBs (Clucking Large Objects) first. But the cellophane melted and they all escaped.

  7. #7
    Join Date
    Aug 2003
    Location
    Where the Surf Meets the Turf @Del Mar, CA
    Posts
    7,776
    Provided Answers: 1
    You can lead some folks to knowledge, but you can not make them think.
    The average person thinks he's above average!
    For most folks, they don't know, what they don't know.
    Good judgement comes from experience. Experience comes from bad judgement.

  8. #8
    Join Date
    Sep 2009
    Posts
    5

How to decrease the time of inserting large data

Thanks a lot for replying, guys. Here is the code. Please feel free to analyse it, thanks.


1- I insert data from another table into the staging table.
2- I then update that existing data, adding the extra columns from an outside source.

Like this...

    create table A (
    emplid varchar2(30),
    last_name varchar2(30),
    male_first_name varchar2(30),
    female_first_name varchar2(30),
    unisex_name varchar2(30))
    tablespace ENROLL_DATA01
    pctused 0
    pctfree 10
    initrans 1
    maxtrans 255
    storage (
    INITIAL 16K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    logging
    nocache
    noparallel
    monitoring;



    insert into A (emplid) select distinct(emplid) from ps_job@cbpsprdp;

create unique index A_ind on A(emplid)
logging
tablespace ENROLL_DATA01
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 24K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
noparallel;

    commit;

exec dbms_stats.gather_table_stats('BENAEADM','A',ESTIMATE_PERCENT=>dbms_stats.auto_sample_size,METHOD_OPT=>'for all columns size auto',CASCADE=>TRUE);



    @table_create.sql;



    exec table_create(1,'AABERG', 'AARON', 'Aaliyah','Abie');
    exec table_create(2,'AABY', 'Aaron', 'AARON','Adair' );
    exec table_create(3,'AADLAND', 'AARON', 'ABBEY','Addison' );
    exec table_create(4,'AAGAARD', 'Abdiel', 'Abbey','Adriel' );
    exec table_create(5,'AAKRE', 'ABDUL', 'Abbie','Africa' );
    exec table_create(6,'AALAND', 'Abdullah', 'ABBIE','Afton' );
    exec table_create(7,'AALBERS', 'ABE', 'Abbigail','Aidan' );
    exec table_create(8,'AALDERINK', 'Abel', 'Abby','Aiken' );
    exec table_create(9,'AALUND', 'ABEL', 'ABBY','Akia' );
    exec table_create(10,'AAMODT', 'Abraham', 'Abigail','Akiva' );
    exec table_create(11,'AAMOT', 'ABRAHAM', 'ABIGAIL','Aldis' );
    .
    .
    .
    exec table_create(25902,'FLEAK', 'Darion', 'Sarai','Storm' );


P/S: table_create.sql is a procedure like this:

create or replace
PROCEDURE table_create(num_row NUMBER, l_name VARCHAR2, m_fname VARCHAR2, f_fname VARCHAR2, u_fname VARCHAR2) IS
v_row A%ROWTYPE; -- assigned below but never actually used

BEGIN

v_row.last_name := l_name;
v_row.male_first_name := m_fname;
v_row.female_first_name := f_fname;
v_row.unisex_name := u_fname;

-- find the num_row-th emplid (in emplid order) and update that one row
update A set
last_name = l_name,
male_first_name = m_fname,
female_first_name = f_fname,
unisex_name = u_fname
where emplid = (select emplid from
(select row_number() over (order by emplid) as n, emplid
from A)
where n = num_row);

END table_create;
/


Do you have any suggestions for changes I could make to improve this?
Thanks a lot anyway.

  9. #9
    Join Date
    Jan 2004
    Location
    Croatia, Europe
    Posts
    4,094
    Provided Answers: 4
    exec table_create(1,'AABERG', 'AARON', 'Aaliyah','Abie');
Where do these name values ('AABERG', 'AARON', 'Aaliyah', 'Abie') come from? Are they in a file? If so, did you consider using it as an external table and doing the whole update in a single SQL statement (without PL/SQL)?

    How do you create "exec table_create" statements, anyway?
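
For illustration, the external table definition could look roughly like this. Note this is only a sketch: the file name names.csv, the directory object DATA_DIR and the comma delimiter are my assumptions, so adjust them to match whatever your actual file looks like:

create table names_ext (
num_row number,
last_name varchar2(30),
male_first_name varchar2(30),
female_first_name varchar2(30),
unisex_name varchar2(30)
)
organization external (
type oracle_loader
default directory DATA_DIR -- hypothetical directory object pointing at the file
access parameters (
records delimited by newline
fields terminated by ','
)
location ('names.csv') -- hypothetical file name
)
reject limit unlimited;

Oracle then reads the file in place, and those 25,902 rows become joinable like any other table, so the whole load collapses into one statement instead of 25,902 procedure calls.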

  10. #10
    Join Date
    Aug 2009
    Location
    Olympia, WA
    Posts
    337
    Just changing this to a single merge statement will greatly speed things up.
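
To make that concrete, here is a rough sketch, assuming a source such as the hypothetical names_ext external table from the previous post (any table holding the num_row-to-names mapping would do):

merge into A a
using (
-- names_ext is the hypothetical external table sketched above
select s.last_name, s.male_first_name, s.female_first_name, s.unisex_name, t.emplid
from names_ext s
join (select emplid, row_number() over (order by emplid) as n
from A) t
on t.n = s.num_row
) src
on (a.emplid = src.emplid)
when matched then update set
a.last_name = src.last_name,
a.male_first_name = src.male_first_name,
a.female_first_name = src.female_first_name,
a.unisex_name = src.unisex_name;

This keeps the same row_number-over-emplid mapping as the original procedure, but the table is scanned once and every row is updated in a single pass, instead of re-running that subquery once per call.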
