
Thread: table update

  1. #1
    Join Date
    Feb 2004
    Posts
    143

    Unanswered: table update

    I have 3 million rows in table_a. I need to update one specific field on all 3 million rows, committing every 10,000 rows. Here's the script I have, but it seems to have a problem --
    it updates 10,001 rows and then hangs. I waited about 3 hours and it just keeps executing without any further updates. Where could the problem be?

    declare
        i number := 0;
        BegTime NUMBER;
        EndTime NUMBER;
        ElapsedTime NUMBER;
        sv schema_a.table_a%rowtype;
        cursor c1 is select * from schema_a.table_a;
    begin
        BegTime := dbms_utility.get_time;
        open c1;
        loop
            fetch c1 into sv;
            exit when c1%NOTFOUND;
            Update schema_a.table_a s
               set s.SV_START_DATE = to_DATE('060101','YYMMDD')
             Where to_Char(SV_START_DATE,'YYMMDD') = '060201';
            i := i+1;
            IF MOD(i,10000) = 0 THEN
                COMMIT;
            END IF;
        end loop;
        EndTime := dbms_utility.get_time;
        ElapsedTime := (EndTime - BegTime)/100;
        dbms_output.put_line('Total Elapsed time for the procedure to complete' ||
            ElapsedTime || ' seconds');
        commit;
    end;
    /

  2. #2
    Join Date
    Sep 2002
    Location
    UK
    Posts
    5,171
    Provided Answers: 1
    Actually, what your script does is update all rows where to_Char(SV_START_DATE,'YYMMDD') = '060201' once for every row the cursor fetches, i.e. 3 million times! The UPDATE inside the loop isn't restricted to the row you just fetched (there's no WHERE CURRENT OF clause), so each iteration runs against the whole table again. No wonder it runs for quite a while...

    Try just running this instead:

    Code:
    begin
    	Update schema_a.table_a s
    	set s.SV_START_DATE = to_DATE('060101','YYMMDD')
    	Where to_Char(SV_START_DATE,'YYMMDD') = '060201';
    	commit;
    end;
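
    If you genuinely need to commit every 10,000 rows (for example to keep undo usage down), one common pattern is to repeat a ROWNUM-limited UPDATE until nothing is left: because the SET changes the date, rows that have already been updated no longer match the WHERE clause on the next pass. A minimal, untested sketch along those lines, reusing your table and column names:

    Code:
    declare
    	l_chunk pls_integer;
    	l_total pls_integer := 0;
    begin
    	loop
    		-- touch at most 10,000 not-yet-updated rows per pass
    		update schema_a.table_a s
    		   set s.SV_START_DATE = to_DATE('060101','YYMMDD')
    		 where to_Char(SV_START_DATE,'YYMMDD') = '060201'
    		   and rownum <= 10000;
    		l_chunk := sql%rowcount;
    		exit when l_chunk = 0;   -- nothing left to update
    		l_total := l_total + l_chunk;
    		commit;                  -- commit each 10,000-row chunk
    	end loop;
    	commit;
    	dbms_output.put_line(l_total || ' rows updated');
    end;
    /

    But unless a single transaction is too big for your undo/rollback segments, the plain one-statement UPDATE above is the simplest and fastest way to do it.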
