Re: out of memory for query result

From my googling, it seems the Perl DBD driver for Postgres does *not* support cursors (see below). I hope someone can refute this!

I am otherwise looking for code that implements Postgres cursors in Perl. I cannot find "DECLARE CURSOR" documented in the Perl DBI documentation either. Thanks Martijn for your reply, it helped me dig deeper.

The following code does not work, but I'll keep trying! (I just added the DECLARE phrase.)
    $dbh = DBI->connect("DBI:Pg:dbname=$dbName;host=$host",
      $dbUser, $dbPassword,
      { RaiseError => 0, AutoCommit => 0, PrintError => 1 });
    $sth = $dbh->prepare("declare csr cursor for $sqlstatement");
    $sth->execute(@statement_parms) or die $DBI::errstr;
    while (my $hr = $sth->fetchrow_hashref) {
      # do something wonderful
    }
    $sth->finish();
DBD::Pg::st fetchrow_hashref failed: no statement executing
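The error above happens because DECLARE itself returns no rows, so there is nothing for fetchrow_hashref to fetch from that statement handle. A sketch of what might work instead, fetching through a separate FETCH statement in batches (untested here; the cursor name csr and the batch size 1000 are illustrative, and the connection parameters are the same placeholders as in the code above):

```perl
use DBI;

# Cursors only live inside a transaction, so AutoCommit must be off.
my $dbh = DBI->connect("DBI:Pg:dbname=$dbName;host=$host",
    $dbUser, $dbPassword,
    { RaiseError => 0, AutoCommit => 0, PrintError => 1 });

# DECLARE produces no result set, so run it with do() ...
$dbh->do("DECLARE csr CURSOR FOR $sqlstatement");

# ... and pull the rows through FETCH, a batch at a time.
my $sth = $dbh->prepare("FETCH 1000 FROM csr");
while (1) {
    $sth->execute() or die $DBI::errstr;
    last if $sth->rows == 0;              # cursor exhausted
    while (my $hr = $sth->fetchrow_hashref) {
        # do something wonderful
    }
}
$dbh->do("CLOSE csr");
$dbh->commit;
$dbh->disconnect;
```

This way only one batch of rows is ever held in the frontend at a time, instead of the full five-million-row result.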

From the DBD-Pg-1.32 module documentation on CPAN...
http://search.cpan.org/~rudy/DBD-Pg-1.32/Pg.pm#Cursors
"Although PostgreSQL has a cursor concept, it has *not* been used in the current implementation. Cursors in PostgreSQL can only be used inside a transaction block. Because only one transaction block at a time is allowed, this would have implied the restriction, not to use any nested SELECT statements. Hence the execute method fetches all data at once into data structures located in the frontend application. This has to be considered when selecting large amounts of data!"

Is this a Perl-only restriction? What about Python or Ruby?

Thanks,
Allen

Martijn van Oosterhout wrote:
On Sat, Oct 22, 2005 at 03:46:18PM -0400, Allen wrote:

I am trying to select a result set from a 2-table join, which should be returning 5,045,358 rows. I receive this error:

   DBD::Pg::st execute failed: out of memory for query result


AFAIK, DBD::Pg never uses a cursor unless you ask it to. So you probably
want to code a loop like:

DECLARE blah CURSOR FOR ...
while( FETCH 1000 FROM blah )
{
  process rows...
}

If you don't use a cursor in the backend, then DBI will try to pull the
*entire* result and store it in memory, which is why you run out...

Hope this helps,

