  1. #1
    Join Date
    Jun 2003
    Posts
    1

Unanswered: bulk collect error

We bring in data from one server (one table in particular has two NUMBER columns and about 30 million records), then process it and pass it to another server (the 30 million records reduce to 21 million; the destination table has the same attributes/constraints/indexes).

I was trying to use the BULK COLLECT and FORALL features available in Oracle.

If I insert into a local table, BULK COLLECT works fine.
If I insert into a remote table (over a DB link, with the same columns/attributes/constraints/indexes), it returns an error:
    ---
    table v_fa
    iid number(7) not null;
    fid number(7) not null;


    table v_fa@my_schema
    iid number(7) not null;
    fid number(7) not null;


------------ Error Message ---

    Error -1400: ORA-01400: cannot insert NULL into ("my_schema"."V_FA"."E_ID")
    ORA-02063: preceding line from my_schema
    ORA-01400: cannot insert NULL into ("my_schema"."V_FA"."E_ID")
    ORA-02063: preceding line from my_schema
    ORA-06512: at line 15

    ------------
If I remove the constraints from the remote table, the record count matches the number of records that should have been inserted, but all the values are NULL.

    Here is the piece of code that I am working on...
    ---------------
OPEN c_selec;

LOOP
FETCH c_selec BULK COLLECT INTO e_records, d_records LIMIT 1000;

FORALL i IN e_records.FIRST..e_records.COUNT
INSERT INTO v_fa@my_schema (e_id, d_id)
VALUES (e_records(i), d_records(i));
EXIT WHEN c_selec%NOTFOUND;
COMMIT;
END LOOP;


    --------------- End of code
    Any solution to this problem?

    Thanks in advance,

    Vish

  2. #2
    Join Date
    Jun 2003
    Location
    West Palm Beach, FL
    Posts
    2,713
    Did you try moving the 'EXIT'?

OPEN c_selec;

LOOP
FETCH c_selec BULK COLLECT INTO e_records, d_records LIMIT 1000;
EXIT WHEN c_selec%NOTFOUND;

FORALL i IN e_records.FIRST..e_records.COUNT
INSERT INTO v_fa@my_schema (e_id, d_id)
VALUES (e_records(i), d_records(i));
COMMIT;
END LOOP;

  3. #3
    Join Date
    May 2003
    Posts
    87
Can you try doing a simple insert from SQL*Plus? Say, does this SQL statement do the proper inserts?

    Code:
    INSERT INTO v_fa@my_schema
      select e_id, d_id from v_fa
      where rownum < 10 ;

  4. #4
    Join Date
    May 2003
    Posts
    87
No, that won't work! Moving the EXIT to right after the FETCH in a bulk fetch loop means the last iteration of the fetch never gets inserted!

That's a common mistake when converting a simple fetch loop into a bulk fetch loop: with a LIMIT clause, %NOTFOUND becomes TRUE as soon as a fetch returns fewer rows than the limit, even though those rows still need to be processed. The "exit when" condition must be at the end, just before the END LOOP - see the sketch after the quoted code.

    Originally posted by LKBrwn_DBA
    Did you try moving the 'EXIT'?

OPEN c_selec;

LOOP
FETCH c_selec BULK COLLECT INTO e_records, d_records LIMIT 1000;
EXIT WHEN c_selec%NOTFOUND;

FORALL i IN e_records.FIRST..e_records.COUNT
INSERT INTO v_fa@my_schema (e_id, d_id)
VALUES (e_records(i), d_records(i));
COMMIT;
END LOOP;
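For reference, a minimal untested sketch of that safe shape, reusing the names from the original post (this addresses only the loop logic, not the DB-link problem):

Code:
OPEN c_selec;
LOOP
    FETCH c_selec BULK COLLECT INTO e_records, d_records LIMIT 1000;

    -- process whatever was fetched, including the final partial batch;
    -- 1..COUNT is safe even when the last fetch returns zero rows
    FORALL i IN 1 .. e_records.COUNT
        INSERT INTO v_fa@my_schema (e_id, d_id)
        VALUES (e_records(i), d_records(i));
    COMMIT;

    -- only now is it safe to test %NOTFOUND
    EXIT WHEN c_selec%NOTFOUND;
END LOOP;
CLOSE c_selec;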

  5. #5
    Join Date
    May 2003
    Posts
    87
Also, would it help to use the SQL*Plus COPY command?
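Something along these lines, for example (user names, passwords and connect strings are placeholders; the trailing hyphen is the SQL*Plus line-continuation character):

Code:
SET ARRAYSIZE 1000
SET COPYCOMMIT 10
COPY FROM scott/tiger@source_db -
     TO scott/tiger@dest_db -
     INSERT v_fa (iid, fid) -
     USING SELECT iid, fid FROM v_fa;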

  6. #6
    Join Date
    Sep 2002
    Location
    Austria
    Posts
    37

    Re: bulk collect error

    not really sure, but why

    FORALL i IN e_records.FIRST..e_records.COUNT

    and not:

    FORALL i IN e_records.FIRST..e_records.LAST
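For a collection just filled by BULK COLLECT the two ranges happen to coincide, since the collection is dense (FIRST = 1 and LAST = COUNT), but FIRST..LAST or 1..COUNT is the idiomatic form. A quick untested check:

Code:
SET SERVEROUTPUT ON
DECLARE
    TYPE t_num_tab IS TABLE OF NUMBER;
    v t_num_tab := t_num_tab(10, 20, 30);
BEGIN
    -- a densely filled collection: FIRST = 1 and LAST = COUNT
    DBMS_OUTPUT.PUT_LINE('FIRST=' || v.FIRST ||
                         ' LAST='  || v.LAST ||
                         ' COUNT=' || v.COUNT);  -- FIRST=1 LAST=3 COUNT=3
END;
/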

  7. #7
    Join Date
    Jun 2003
    Location
    Sydney, Australia
    Posts
    66

    Re: bulk collect error

    Vish,

there is an often-discussed limitation preventing BULK COLLECT from working over a remote link. Thomas Kyte proposes a workaround based on different types of arrays (the following URL should be a single line):
    http://asktom.oracle.com/pls/ask/f?p...1275507423677,


I realise your problem is different: in your case it is the FORALL operation going through the link, while in our project we had BULK COLLECT over the link. In other words, we were inserting on the local site, and the data originated from the remote site. Perhaps you could do the same, and then use the following simple workaround - it is different from what T. Kyte suggests.

The idea is to shift the driving site, i.e. the execution venue, of the bulk operation. For a SELECT, this involves joining the remote table with a dummy local table that has a single row. The effect is that the query is executed locally, and BULK COLLECT works just fine. Of course, we could only select from a single remote table, to avoid nested loops over the network. Off the top of my head, the code looked like:

SELECT rem.col1, rem.col2
BULK COLLECT INTO tbl1, tbl2
FROM dummy_local, remote@lnk.word rem
WHERE rem.conditions = such_and_such;

FORALL j IN tbl1.first..tbl1.last
INSERT INTO local_table (c1, c2)
VALUES (tbl1(j), tbl2(j));
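Filling in the declarations, a self-contained version of that shape might look like this (untested sketch; the link name, table names, columns and filter are all placeholders):

Code:
-- dummy_local is any one-row local table, e.g.:
-- CREATE TABLE dummy_local (dummy CHAR(1));
-- INSERT INTO dummy_local VALUES ('X');

DECLARE
    TYPE t_num_tab IS TABLE OF NUMBER;
    tbl1 t_num_tab;
    tbl2 t_num_tab;
BEGIN
    -- joining the one-row local table pulls the driving site to the
    -- local database, so the BULK COLLECT itself executes locally
    SELECT rem.col1, rem.col2
    BULK COLLECT INTO tbl1, tbl2
    FROM dummy_local, remote_tab@lnk rem
    WHERE rem.col1 IS NOT NULL;      -- placeholder condition

    -- the insert is then a purely local bulk operation
    FORALL j IN 1 .. tbl1.COUNT
        INSERT INTO local_table (c1, c2)
        VALUES (tbl1(j), tbl2(j));
    COMMIT;
END;
/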

    For a remote insert, the idea still needs more work, but I hope it helps,

    Andrew Schonberger
