It depends on your data, but the approach is to pick a column with a roughly even distribution of values. If you have one, divide (max - min) by the number of buckets you want: for example, with a million records and a target of roughly 10000 records per batch, you need about 100 buckets. In pseudocode:
select max(x), min(x) into max_x, min_x from table;
bucketsize := (max_x - min_x) / 100;
for i = min_x to max_x step bucketsize
    select ... from table where x between i and i + bucketsize - 1
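The loop above can be sketched as a runnable example. This uses Python with an in-memory SQLite table; the table name, column `x`, and bucket size are placeholders standing in for your real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER, payload TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 101)])
conn.commit()

bucket_size = 10  # aim for ~10 rows per batch

# One full-table scan to find the range, then one query per bucket.
min_x, max_x = conn.execute("SELECT MIN(x), MAX(x) FROM t").fetchone()

processed = 0
for lo in range(min_x, max_x + 1, bucket_size):
    hi = lo + bucket_size - 1
    rows = conn.execute(
        "SELECT x, payload FROM t WHERE x BETWEEN ? AND ?", (lo, hi)
    ).fetchall()
    processed += len(rows)
    # ...process the batch here, committing per batch if doing DML...

print(processed)  # 100 — adjacent buckets are disjoint, so each row is seen once
```

Note that `BETWEEN lo AND hi` with `hi = lo + bucket_size - 1` only partitions cleanly for integer keys; for non-integer columns use half-open ranges (`x >= lo AND x < lo + bucket_size`) instead.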
Assumptions: if you commit within the loop, you are running n separate transactions rather than one, so if the table is being updated concurrently, some records may not be processed even though they were inserted and committed before your procedure finished (a row that lands in a bucket you have already passed will never be seen).