  1. #1
    Join Date
    Jun 2011
    Posts
    4

    Unanswered: Fetch large result set from table

    Hi,

    I've got a table that contains 20+ million records, and I'm trying to dump its data to a file. I've been trying a cursor-based approach because of memory concerns.

    How do you keep looping on the cursor until it reaches the end of the result set?

    Here's my code:

    CREATE OR REPLACE FUNCTION gelargedata()
    RETURNS SETOF mytable AS $$
    DECLARE
        cur refcursor;
    BEGIN
        OPEN cur FOR SELECT * FROM mytable;
        LOOP
            FETCH FORWARD 1000 FROM cur;
            IF NOT FOUND THEN
                EXIT;
            ELSE
                RETURN NEXT;
            END IF;
        END LOOP;
        CLOSE cur;
    END;
    $$ LANGUAGE 'PLPGSQL';

    I get this error:

    ERROR: expected FROM or IN at or near "1000"....

    Can one use FETCH in a loop?

    Any help is very much appreciated.


    Eric

  2. #2
    Join Date
    Jun 2011
    Posts
    4
    Anything guys?

  3. #3
    Join Date
    Mar 2011
    Posts
    27
    Not very familiar with code like that, but can I ask why you don't just use pg_dump or psql to create a flat file?

  4. #4
    Join Date
    Jun 2011
    Posts
    4
    I did try using pg_dump:

    pg_dump -Oa -F p -U myuser -t mytable mydb
    But then I'd need to extract the data between the two lines matching "COPY mytable ..." and "\.".
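    (For reference, one way to do that extraction would be a sed filter along these lines; the patterns are just a guess at the plain-format dump layout, so treat it as a sketch:)
    Code:
    # keep everything from the "COPY mytable ..." line through the terminating "\." line
    pg_dump -Oa -F p -U myuser -t mytable mydb | sed -n '/^COPY mytable/,/^\\\.$/p' > mytable.dat
    The first and last lines of that output (the COPY header and the "\." terminator) would still have to be trimmed off to leave only the data rows.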

    I'd love to see how it's done using cursors.

    Eric

  5. #5
    Join Date
    May 2008
    Posts
    277
    Quote Originally Posted by waltari2001
    I've got a table that contains 20+ million records, and I'm trying to dump its data to a file. I've been trying a cursor-based approach because of memory concerns.
    Quote Originally Posted by waltari2001
    But then I'd need to extract the data between the two lines matching "COPY mytable ..." and "\.".
    It's difficult to figure out exactly what you're trying to do.

    If you're attempting to dump the database, pg_dump is the tool to use. The documentation provides guidance on how to deal with large databases. Dumping the database to a compressed or custom archive gives you lots of options for restoring the database using pg_restore.

    If you're attempting to "export" a specific table's data to a csv file, then you can write a copy script:
    Code:
    copy mytable to stdout with csv header;
    You can then pipe the output to a file with psql using the strategies outlined above.
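    For example, a minimal sketch (the user and database names are just the ones from the pg_dump command earlier in the thread):
    Code:
    psql -U myuser -d mydb -c "copy mytable to stdout with csv header" > mytable.csv
    COPY ... TO STDOUT streams the rows to psql's standard output, so redirecting that to a file gives you the CSV export without building the whole result set in memory.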

    Anything with a cursor or custom function, I'm guessing, would be orders of magnitude slower.

  6. #6
    Join Date
    Jun 2011
    Posts
    4
    Thanks for the feedback, futurity!

    I'm still very interested in figuring out a way to navigate a large table using cursors.
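    For anyone finding this thread later, here's a sketch of what I think the cursor version has to look like. In PL/pgSQL, FETCH reads a single row into a variable (the multi-row FETCH FORWARD 1000 form only exists at the SQL level, hence the error above), so the loop becomes:
    Code:
    CREATE OR REPLACE FUNCTION gelargedata()
    RETURNS SETOF mytable AS $$
    DECLARE
        cur refcursor;
        rec mytable%ROWTYPE;
    BEGIN
        OPEN cur FOR SELECT * FROM mytable;
        LOOP
            FETCH cur INTO rec;      -- PL/pgSQL FETCH: one row at a time into a variable
            EXIT WHEN NOT FOUND;     -- FOUND goes false once the cursor is exhausted
            RETURN NEXT rec;         -- append the row to the function's result set
        END LOOP;
        CLOSE cur;
        RETURN;
    END;
    $$ LANGUAGE plpgsql;
    One caveat, though: a SETOF function like this still materialises its whole result set (in a tuplestore that can spill to disk) before the caller sees any rows, so for a plain dump to a file the COPY approach above is likely the better tool.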
