Re: Connections dropping while using Postgres backend DB with Ejabberd

Thanks Michael for the recommendation and clarification.

Will try with 32 MB on my next run.

BR,
Dipanjan

On Tue, Feb 25, 2020 at 10:51 PM Michael Lewis <mlewis@xxxxxxxxxxx> wrote:
work_mem can be used many times per connection, since it applies per sort, hash, or other operation, and as mentioned that can be multiplied if the query is handled by parallel workers. I am guessing the server has 16GB of memory total given shared_buffers and effective_cache_size, so a more reasonable work_mem setting might be on the order of 32-64MB.
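As a concrete sketch of that suggestion, the setting could look like this in postgresql.conf; the 32MB figure is the starting point recommended above, and the shared_buffers/effective_cache_size values are illustrative assumptions for a ~16GB server, not values confirmed in this thread:

```
# postgresql.conf -- example values, assuming a ~16GB server (assumption)
shared_buffers = 4GB            # roughly 25% of RAM is a common starting point
effective_cache_size = 12GB     # planner hint only, not an allocation
work_mem = 32MB                 # per sort/hash operation, per connection,
                                # and per parallel worker
```

The worst case to keep in mind is roughly work_mem multiplied by the number of concurrent sort/hash operations, so a complex query running across several parallel workers can consume many multiples of work_mem at once.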

Depending on the type of work being done and how quickly the application releases the db connection once it is done, I would expect max_connections on the order of 4-20x the number of cores. If more simultaneous users need to be serviced, a connection pooler like pgbouncer or pgpool will allow those connections to be re-used quickly.
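A minimal pgbouncer configuration along those lines might look like the following; the database name, ports, and pool sizes here are placeholders for illustration, not values from this thread:

```
; pgbouncer.ini -- illustrative sketch only
[databases]
ejabberd = host=127.0.0.1 port=5432 dbname=ejabberd

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
pool_mode = transaction      ; server connection is returned after each transaction
default_pool_size = 20       ; sized per the ~4-20x cores guideline above
max_client_conn = 1000       ; many app connections share the small server pool
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
```

The application then connects to port 6432 instead of 5432, and pgbouncer multiplexes those client connections over the small pool of real PostgreSQL connections.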

These numbers are generalizations based on my experience. Others with more experience may have different configurations to recommend.
