
Re: is there a way to firmly cap postgres worker memory consumption?

On Tue, Apr 8, 2014 at 6:32 PM, Tom Lane <tgl@xxxxxxxxxxxxx> wrote:
> Well, here's the problem:
>
>>         ExprContext: 812638208 total in 108 blocks; 183520 free (171
>> chunks); 812454688 used
>
> So something involved in expression evaluation is eating memory.
> Looking at the query itself, I'd have to bet on this:
>
>>            ARRAY_TO_STRING(ARRAY_AGG(MM.ID::CHARACTER VARYING), ',')
>
> My guess is that this aggregation is being done across a lot more rows
> than you were expecting, and the resultant array/string therefore eats
> lots of memory.  You might try replacing that with COUNT(*), or even
> better SUM(LENGTH(MM.ID::CHARACTER VARYING)), just to get some definitive
> evidence about what the query is asking to compute.
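
For anyone who finds this thread later, that diagnostic substitution would
look something like the following; the FROM clause and grouping column are
invented here, since I can't post the full query:

    -- The aggregate from the problem query: builds one big string per group.
    SELECT ARRAY_TO_STRING(ARRAY_AGG(MM.ID::CHARACTER VARYING), ',')
    FROM messages MM              -- hypothetical table behind the MM alias
    GROUP BY MM.batch_id;         -- hypothetical grouping column

    -- Diagnostic variants: same grouping, but instead of materializing the
    -- string, report how many rows and how many bytes each group would have
    -- fed into the aggregate.
    SELECT COUNT(*)
    FROM messages MM
    GROUP BY MM.batch_id;

    SELECT SUM(LENGTH(MM.ID::CHARACTER VARYING))
    FROM messages MM
    GROUP BY MM.batch_id;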

The devs have moved on and want to stick with their new query, so I'll just chalk this up to a bad query and let it go. But I'm glad to have learned a few new tricks, thanks.
 
> Meanwhile, it seems like ulimit -v would provide the safety valve
> you asked for originally.

Thank you Amador and Tom for the ulimit solution, that's exactly what I needed.
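
For the archives, here is roughly how I'm applying it; the limit value and
data directory below are examples for my setup, not recommendations:

    # Set the cap in the shell that launches the postmaster, so every
    # forked backend inherits it. ulimit -v takes kilobytes:
    # 4194304 kB = 4 GB of virtual address space per process.
    ulimit -v 4194304
    pg_ctl start -D /var/lib/postgresql/data    # example data directory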
