Unanswered: DB2 Cursors - Processing huge data - I APOLOGIZE
I apologize to all forum users because I found the problem, and it was an error in my code. Apologies to sinwar first of all, because he was right. I don't know if I'm strong enough to say what my error was... OK, I'll tell you, but please don't laugh at me until the end of time: it was a counter, declared as SMALLINT... I'm sorry, once again...
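For anyone who hits the same wall: SMALLINT in DB2 is a 16-bit signed integer, range -32768 to 32767, so a loop counter declared that way overflows exactly at record 32768. A minimal sketch of the fix (the variable name is mine, not from the original SP):

```sql
-- SMALLINT holds -32768 .. 32767, so incrementing past 32767 raises an
-- overflow error on the 32768th record.
-- DECLARE V_COUNTER SMALLINT DEFAULT 0;  -- the bug
DECLARE V_COUNTER INTEGER DEFAULT 0;      -- 32-bit: fine for 45 million rows
-- For counts beyond roughly 2.1 billion, use BIGINT instead.
```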
DB2 8.2.0 on WinXP
1 table with about 45 million records
1 cursor to process data
After 32768 records the cursor explodes
Committing the processed data closes the cursor, and if I re-open it without extra conditions I re-process already-processed data
Questions about possible solutions:
1. How to delete processed data using the same cursor? (Now declared FOR FETCH ONLY)
2. How to enlarge cursor capacity in order to fetch more than 32768 records at a time?
I think I didn't explain my problem very well...
I have a table with 45 million records: I have to process the data it contains using an SP, with a cursor declared like this:
DECLARE CURSOR_FLUSSO CURSOR FOR
SELECT ... -- the SELECT was omitted from the post
FOR FETCH ONLY;
Then, I have a loop that fetches from this cursor:
PARSE_LOOP: LOOP
FETCH CURSOR_FLUSSO INTO CURR_RECORD;
IF SQLCODE <> 0 THEN LEAVE PARSE_LOOP; END IF;
END LOOP PARSE_LOOP;
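Incidentally, the usual DB2 pattern for committing every N rows without losing the cursor position is to declare the cursor WITH HOLD. A sketch under that assumption (the counter name and the batch size are mine):

```sql
-- Assumes CURSOR_FLUSSO was declared WITH HOLD, so COMMIT does not close it.
DECLARE V_DONE INTEGER DEFAULT 0;     -- INTEGER, not SMALLINT!
PARSE_LOOP: LOOP
    FETCH CURSOR_FLUSSO INTO CURR_RECORD;
    IF SQLCODE <> 0 THEN
        LEAVE PARSE_LOOP;             -- end of data (or error)
    END IF;
    -- ... process CURR_RECORD here ...
    SET V_DONE = V_DONE + 1;
    IF MOD(V_DONE, 10000) = 0 THEN
        COMMIT;                       -- cursor position survives (WITH HOLD)
    END IF;
END LOOP PARSE_LOOP;
```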
1. Without any commit, my FETCH ... INTO fails after 32768 records have been processed.
2. With a commit every 1000 or 10000 records, the cursor is closed, and when I re-open it I read the same data I have just committed.
I found something about DELETE WHERE CURRENT OF <CURSOR> on the internet, but had no success using it. It could let me re-open the cursor without the old, already-processed data still in it.
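For the record, DELETE ... WHERE CURRENT OF only works on an updatable cursor: a cursor declared FOR FETCH ONLY is read-only, which is why it failed. A hedged sketch of a deletable declaration (table and column names are placeholders, not from the original post):

```sql
-- WITH HOLD keeps the cursor open across COMMIT; FOR UPDATE makes it
-- updatable so WHERE CURRENT OF is allowed (FOR FETCH ONLY would not be).
DECLARE CURSOR_FLUSSO CURSOR WITH HOLD FOR
    SELECT COL1 FROM MY_TABLE   -- MY_TABLE / COL1 are hypothetical names
    FOR UPDATE;

-- Inside the fetch loop, after processing a row, remove it:
DELETE FROM MY_TABLE WHERE CURRENT OF CURSOR_FLUSSO;
```

Note that deleting 45 million rows one at a time through a cursor is slow; with the SMALLINT bug fixed, a single pass over the FOR FETCH ONLY cursor may be all that's needed.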