  1. #1
     Join Date: Sep 2003
     Posts: 2

    Unanswered: Handling large selects

    I'm using Perl DBI on Linux with PostgreSQL. The problem is that when I SELECT a large number of records (50 million or more), the system runs out of memory and swap, which crashes the machine. The box already has 1 GB of RAM, and adding more won't really help, since the data can grow beyond that anyway.

    I've watched the processes with top, and I've also found documentation saying that the driver returns the entire result set to DBI before you can begin processing it. This is what is killing the RAM.
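    For illustration, this is roughly the pattern that runs into that behavior (the DSN, credentials, and table name here are made up). Even though the Perl loop fetches one row at a time, the driver has already pulled the whole result set into client memory by the time execute() returns:

        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                               { RaiseError => 1 });

        my $sth = $dbh->prepare('SELECT * FROM big_table');

        # execute() buffers the ENTIRE result set on the client side,
        # so the row-at-a-time loop below saves no memory.
        $sth->execute;

        while (my $row = $sth->fetchrow_arrayref) {
            # all rows are already in RAM by the time we get here
        }

        $dbh->disconnect;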

    How do you handle this situation? Do I just have to break it into smaller SELECTs until the system can cope?

    Thanks.

  2. #2
     Join Date: Sep 2001
     Location: Vienna
     Posts: 400

    Cursors

    We are working on the same problem right now.
    Working with cursors is your only solution; a sketch follows below.
    http://www.postgresql.org
    -- PostgreSQL is the only kind of thing --
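    A minimal sketch of the cursor approach, assuming a made-up table big_table, made-up connection details, and a batch size of 10,000 rows. Cursors only exist inside a transaction, so AutoCommit is turned off, and rows are pulled in batches with FETCH so that only one batch sits in client memory at a time:

        use strict;
        use warnings;
        use DBI;

        # Hypothetical DSN, credentials, and table name.
        my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                               { AutoCommit => 0, RaiseError => 1 });

        # Cursors only live inside a transaction (AutoCommit => 0 above).
        $dbh->do('DECLARE big_cur CURSOR FOR SELECT * FROM big_table');

        my $sth = $dbh->prepare('FETCH 10000 FROM big_cur');
        while (1) {
            $sth->execute;
            last if $sth->rows == 0;    # cursor exhausted
            while (my $row = $sth->fetchrow_arrayref) {
                # process one row; at most 10000 rows are in memory
            }
        }

        $dbh->do('CLOSE big_cur');
        $dbh->commit;
        $dbh->disconnect;

    The batch size trades memory for round trips: a larger FETCH count means fewer queries but more rows held in RAM at once, so tune it to your row size.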
