Re: 8.2.4 signal 11 with large transaction

Bill Moran <wmoran@xxxxxxxxxxxxxxxxxxxxxxx> writes:
> Oddly, the query succeeds if it's fed into psql.

> I'm now full of mystery and wonder.  It would appear as if the
> underlying problem has something to do with PHP, but why should this
> cause a backend process to crash?

Ah, I see it.  Your PHP script is sending all 30000 INSERT commands
to the backend *in a single PQexec*, ie, one 37MB command string.
psql won't do that, it splits the input at semicolons.

Unsurprisingly, this runs the backend out of memory.  (It's not the
command string that's the problem, so much as the 30000 parse and plan
trees...)
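
For comparison, the pattern psql effectively ends up with is one command
per PQexec() call.  A minimal libpq sketch of that (the connection
string, table, and statement list here are invented for illustration,
not taken from Bill's script):

    #include <stdio.h>
    #include <libpq-fe.h>

    int
    main(void)
    {
        PGconn     *conn = PQconnectdb("dbname=test");  /* hypothetical DSN */

        if (PQstatus(conn) != CONNECTION_OK)
        {
            fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
            PQfinish(conn);
            return 1;
        }

        /* A handful of statements standing in for the real 30000 INSERTs. */
        const char *statements[] = {
            "INSERT INTO t (x) VALUES (1)",
            "INSERT INTO t (x) VALUES (2)",
            "INSERT INTO t (x) VALUES (3)",
        };
        int         nstmts = (int) (sizeof(statements) / sizeof(statements[0]));

        PQclear(PQexec(conn, "BEGIN"));   /* one transaction, many small commands */

        for (int i = 0; i < nstmts; i++)
        {
            PGresult   *res = PQexec(conn, statements[i]);

            if (PQresultStatus(res) != PGRES_COMMAND_OK)
                fprintf(stderr, "INSERT failed: %s", PQerrorMessage(conn));
            PQclear(res);
        }

        PQclear(PQexec(conn, "COMMIT"));
        PQfinish(conn);
        return 0;
    }

Each command string stays small, so the backend only ever has to parse
and plan one INSERT at a time.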

Unfortunately, in trying to prepare the error message, it tries to
attach the command text as the STATEMENT field of the log message.
All 37MB worth.  And of course *that* gets an out-of-memory error.
Presto, infinite recursion, broken only by stack overflow (= SIGSEGV).

It looks like 8.1 and older are also vulnerable to this; it's just that
they don't try to log error statement strings at the default logging
level, whereas 8.2 does.  If you cranked up log_min_error_statement
I think they'd fail too.
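
(For reference, that's the postgresql.conf setting shown below; lowering
the threshold to "error" on 8.1 should make it log the statement for
ordinary errors too, which is roughly what 8.2 now does out of the box.)

    # postgresql.conf -- attach the offending statement to any message
    # of severity ERROR or higher
    log_min_error_statement = error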

I guess what we need to do is hack the emergency-recovery path for
error-during-error-processing such that it will prevent trying to print
a very long debug_query_string.  Maybe we should just not try to print
the command at all in this case, or maybe there's some intermediate
possibility like only printing the first 1K or so.  Thoughts?
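
To illustrate the "first 1K" idea (purely a sketch, not a patch against
elog.c -- the helper name and the exact cutoff are made up for the
example): copy the statement into a fixed-size buffer with no
allocation, so that reporting the out-of-memory error can't itself run
out of memory.

    #include <stdio.h>
    #include <string.h>

    #define STMT_EMERGENCY_LEN 1024     /* "the first 1K or so" */

    /*
     * Truncated copy of the current command text, for use while already
     * recovering from an error.  Uses a static buffer and no allocation,
     * so it cannot fail again the way attaching the full 37MB string does.
     */
    static const char *
    emergency_statement(const char *query)
    {
        static char buf[STMT_EMERGENCY_LEN];

        if (query == NULL)
            return NULL;
        strncpy(buf, query, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        return buf;
    }

    int
    main(void)
    {
        /* Stand-in for a multi-megabyte debug_query_string. */
        static char big[37 * 1024];

        memset(big, 'x', sizeof(big) - 1);
        big[sizeof(big) - 1] = '\0';

        printf("logged statement length: %zu\n",
               strlen(emergency_statement(big)));
        return 0;
    }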

			regards, tom lane

