Results 1 to 5 of 5
  1. #1
    Join Date
    Mar 2009
    Posts
    24

Unanswered: SELECT returns 1 million records, how to process each record optimally in a stored proc

Hi, I am running this stored proc successfully, but I have a question: the first SELECT statement fetches, say, 1 million records. What is the ideal way to process such cases? For each record I am performing some operation. It looks OK if the SELECT returns a few thousand records, since processing doesn't take much time, but if the records fetched number in the millions, how should we process them? Say this query
SELECT USER_ID, USER_PWD FROM DB2ADMIN.LOGIN
returns 1 million records, then what?
Can a cursor hold 1 million records? Can I process them in batches within a single SP, say the first 10 thousand first, then the rest subsequently?
What's the optimal way to do this?


Here is a sample SP which illustrates my case:

CREATE PROCEDURE PRAGS.TESTSP ( )
------------------------------------------------------------------------
-- SQL Stored Procedure
------------------------------------------------------------------------
P1: BEGIN
    -- Declare variables
    DECLARE ENDTABLE INT DEFAULT 0;

    DECLARE uid VARCHAR(10);
    DECLARE pwd VARCHAR(10);

    -- Declare cursor
    DECLARE cursor1 CURSOR FOR
        SELECT USER_ID, USER_PWD FROM DB2ADMIN.LOGIN;

    -- Declare handler
    DECLARE CONTINUE HANDLER FOR NOT FOUND
        SET ENDTABLE = 1;

    OPEN cursor1;

    SET ENDTABLE = 0;
    FETCH FROM cursor1 INTO uid, pwd;
    WHILE ENDTABLE = 0 DO

        IF uid = 'Prashant' THEN
            UPDATE DB2ADMIN.LOGIN SET USER_PWD = 'Aggarwal' WHERE USER_ID = uid;
        END IF;

        FETCH FROM cursor1 INTO uid, pwd;
    END WHILE;
    CLOSE cursor1;
END P1

Please suggest some feasible options.
    Thanks
    Prashant

  2. #2
    Join Date
    May 2007
    Location
    somewhere in dbforums
    Posts
    221
    Dude whaasssuppp????

You are querying DB2ADMIN.LOGIN and then updating the same table for some condition after checking each row... Don't we have an UPDATE SQL statement just for that?

Why don't you directly issue the following UPDATE command?
UPDATE DB2ADMIN.LOGIN SET USER_PWD = 'Aggarwal' WHERE USER_ID = 'Prashant'
This statement is equivalent to your SP.


If I am not wrong, the example you have used is incorrect... it would be better if you explain exactly what you want to achieve.

On a side note, I would like you to have a look at GLOBAL TEMPORARY TABLES.
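For reference, a minimal sketch of a declared global temporary table in DB2, used to stage rows before processing them (the SESSION.LOGIN_WORK name is just an illustration; a USER TEMPORARY tablespace must already exist for this to work):

```sql
-- Declared global temporary table: visible only to this session,
-- dropped automatically when the session ends.
DECLARE GLOBAL TEMPORARY TABLE SESSION.LOGIN_WORK (
    USER_ID  VARCHAR(10),
    USER_PWD VARCHAR(10)
) ON COMMIT PRESERVE ROWS NOT LOGGED;

-- Stage the rows you want to process, then work from the copy
-- instead of cursoring over the live table.
INSERT INTO SESSION.LOGIN_WORK
    SELECT USER_ID, USER_PWD FROM DB2ADMIN.LOGIN;
```

NOT LOGGED keeps the staging inserts out of the transaction log, which matters at the million-row scale being discussed.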
    Last edited by nick.ncs; 03-28-09 at 12:29.
    IBM Certified Database Associate, DB2 9 for LUW

  3. #3
    Join Date
    Mar 2009
    Posts
    24
    Quote Originally Posted by nick.ncs
    Dude whaasssuppp????

You are querying DB2ADMIN.LOGIN and then updating the same table for some condition after checking each row... Don't we have an UPDATE SQL statement just for that?

Why don't you directly issue the following UPDATE command?
UPDATE DB2ADMIN.LOGIN SET USER_PWD = 'Aggarwal' WHERE USER_ID = 'Prashant'
This statement is equivalent to your SP.


If I am not wrong, the example you have used is incorrect... it would be better if you explain exactly what you want to achieve.

On a side note, I would like you to have a look at GLOBAL TEMPORARY TABLES.
Actually, the SP was just an example for the scenario I mentioned.

The actual SP is too complex to write as a single statement. My question: if the SELECT statement returns a large number of records and we have to process them one by one, how can we tackle this optimally? Should we fetch all the records in one go through the cursor? Say I have 1 million records; can't we process them in batches, with everything still handled in one SP?

  4. #4
    Join Date
    May 2003
    Location
    USA
    Posts
    5,737
    Quote Originally Posted by cprash.aggarwal
Actually, the SP was just an example for the scenario I mentioned.

The actual SP is too complex to write as a single statement. My question: if the SELECT statement returns a large number of records and we have to process them one by one, how can we tackle this optimally? Should we fetch all the records in one go through the cursor? Say I have 1 million records; can't we process them in batches, with everything still handled in one SP?
You can process them with one cursor in an SP, but do the following:

1. Declare the cursor using "WITH HOLD" and "WITH UR". This ensures the cursor is not closed when you do a COMMIT, and it minimizes lock contention problems. Do not put an ORDER BY in the cursor.

2. After the fetch, use a separate UPDATE statement to update the rows you want to change, using the primary key. Every 100-1,000 updates, do a COMMIT. This will reduce lock contention and ensure the transaction logs do not fill up. Make sure you do an extra COMMIT at the end of the program.

    This technique can be used on tables with billions of rows.
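The two steps above can be sketched against the original example like this (the procedure name and the commit interval of 1,000 are illustrative, not prescriptive):

```sql
CREATE PROCEDURE PRAGS.TESTSP_BATCHED ( )
P1: BEGIN
    DECLARE ENDTABLE INT DEFAULT 0;
    DECLARE counter  INT DEFAULT 0;
    DECLARE uid VARCHAR(10);
    DECLARE pwd VARCHAR(10);

    -- WITH HOLD keeps the cursor open across COMMITs;
    -- WITH UR reads uncommitted data, minimizing lock contention.
    DECLARE cursor1 CURSOR WITH HOLD FOR
        SELECT USER_ID, USER_PWD FROM DB2ADMIN.LOGIN WITH UR;

    DECLARE CONTINUE HANDLER FOR NOT FOUND
        SET ENDTABLE = 1;

    OPEN cursor1;
    FETCH FROM cursor1 INTO uid, pwd;
    WHILE ENDTABLE = 0 DO
        IF uid = 'Prashant' THEN
            UPDATE DB2ADMIN.LOGIN SET USER_PWD = 'Aggarwal' WHERE USER_ID = uid;
            SET counter = counter + 1;
        END IF;

        -- Commit every 1,000 updates so the transaction log cannot fill up.
        IF counter >= 1000 THEN
            COMMIT;
            SET counter = 0;
        END IF;

        FETCH FROM cursor1 INTO uid, pwd;
    END WHILE;

    -- Final COMMIT for the last partial batch.
    COMMIT;
    CLOSE cursor1;
END P1
```

The counter variable is the whole trick: count successful updates, COMMIT and reset when the threshold is reached, and issue one last COMMIT after the loop for whatever remains.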
    M. A. Feldman
    IBM Certified DBA on DB2 for Linux, UNIX, and Windows
    IBM Certified DBA on DB2 for z/OS and OS/390

  5. #5
    Join Date
    Mar 2009
    Posts
    24
    Quote Originally Posted by Marcus_A
You can process them with one cursor in an SP, but do the following:

1. Declare the cursor using "WITH HOLD" and "WITH UR". This ensures the cursor is not closed when you do a COMMIT, and it minimizes lock contention problems. Do not put an ORDER BY in the cursor.

2. After the fetch, use a separate UPDATE statement to update the rows you want to change, using the primary key. Every 100-1,000 updates, do a COMMIT. This will reduce lock contention and ensure the transaction logs do not fill up. Make sure you do an extra COMMIT at the end of the program.

    This technique can be used on tables with billions of rows.
How do I commit after every 1,000 records? How do I keep track of the count, and at what point should I issue the COMMIT?
