
Re: problem with lost connection while running long PL/R query

"David M. Kaplan" <david.kaplan@xxxxxx> writes:
> Thanks for the help.  You have definitely identified the problem, but I 
> am still looking for a solution that works for me.  I tried setting 
> vm.overcommit_memory=2, but this just made the query crash quicker than 
> before, though without killing the entire connection to the database.  I 
> imagine that this means that I really am trying to use more memory than 
> the system can handle?

> I am wondering if there is a way to tell postgresql to flush a set of
> table rows out to disk so that the memory they are using can be freed.
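For reference, the overcommit setting mentioned above is controlled through sysctl; a minimal sketch of inspecting and changing it (Linux only, root required; the mode semantics below are the standard kernel behavior):

```shell
# Inspect the current overcommit policy:
#   0 = heuristic overcommit (the default)
#   1 = always overcommit
#   2 = strict accounting -- allocations fail with ENOMEM up front
#       instead of the OOM killer terminating a backend later
cat /proc/sys/vm/overcommit_memory

# Switch to strict accounting (the mode tried above).  With mode 2 the
# committable total is roughly swap + overcommit_ratio percent of RAM,
# so the ratio may need raising as well.
sysctl -w vm.overcommit_memory=2
sysctl -w vm.overcommit_ratio=80
```

With mode 2, a query that genuinely needs more memory than the system can provide fails with an ordinary "out of memory" error rather than having the whole backend killed, which matches the behavior described above.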

Assuming you don't have work_mem set to something unreasonably large,
it seems likely that the excessive memory consumption is inside your
PL/R function, and not the fault of Postgres per se.  You might try
asking in some R-related forums about how to reduce the code's memory
usage.
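Checking the server-side setting is quick; a sketch, assuming a local psql session with suitable privileges:

```shell
# work_mem is the per-sort/per-hash memory budget; note a single query
# can use several multiples of it, one per sort or hash operation
psql -c "SHOW work_mem;"

# maintenance operations (VACUUM, CREATE INDEX) use a separate knob
psql -c "SHOW maintenance_work_mem;"
```

Values in the low tens of megabytes are typical; if work_mem is set to hundreds of megabytes, that alone can explain runaway memory use on a busy server.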

Also, if by "crash" this time you meant you got an "out of memory" error
from Postgres, there should be a memory map in the postmaster log
showing all the memory consumption Postgres itself is aware of.  If that
doesn't add up to a lot, it would be pretty solid proof that the problem
is inside R.  If there are any memory contexts that seem to have bloated
unreasonably, knowing which one(s) would be helpful information.
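To capture that memory map, something along these lines against the postmaster log works (the log path is an assumption; it varies by packaging and by the logging_collector configuration):

```shell
# The per-context memory dump is written to the log immediately before
# the "out of memory" ERROR line, so grab the preceding lines too
grep -B 100 "out of memory" /var/log/postgresql/postgresql-*.log | less
```

Each line of the dump names a memory context and its allocated/used bytes; a context whose total dwarfs the rest is the one worth reporting.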

			regards, tom lane


-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general



