Re: Monitoring number of backends

On Tue, Oct 22, 2013 at 09:45:24PM -0500, Andy Colson wrote:
> On 10/22/2013 12:59 PM, Stephen Frost wrote:
> >Andy,
> >
> >* andy (andy@xxxxxxxxxxxxxxx) wrote:
> >>My website is about to get a little more popular.  I'm trying to add
> >>in some measurements to determine an upper limit of how many
> >>concurrent database connections I'm currently using.
> >
> >PG is really *much* happier if you have only one backend per CPU in your
> >system.  The way to get there is by using a connection pooler like
> >PgBouncer and configuring it based on how many CPUs you have.
> >
> >PgBouncer can also provide stats for you.
> >
> >	Thanks,
> >
> >		Stephen
> >
> 
> Ahh, bummer, man.  PgBouncer doesn't work so well when you have lots of databases.  I have about 90 databases, and the website could connect to any one of them on any request.  (They are all about equally likely to be hit.)
> 
> In my pgbouncer.ini I set
> [databases]
> * =
> [pgbouncer]
> max_client_conn = 200
> default_pool_size = 20
> 
> but that made each database its own pool, so 90*20 = 1800 possible server connections (though capped at max_client_conn = 200, I assume).  Postgres has a max_connections of only 40, so I started getting "FATAL: sorry, too many clients already" errors.
> 
> I set max_client_conn = 40
> and default_pool_size = 3
> 
> so it should stop erroring out, but is this type of setup even going to help me?  If pgbouncer has 40 connections open to 40 different databases, and then a request comes in for a db it doesn't have, it'll have to drop one and build a new connection, yes?  Won't that be slower than just connecting directly to postgres in the first place?
> 
> I need a Most-Recently-Used pool, not a per-database pool.  Is there a way to do this with pgbouncer?  (With a large number of databases)

PgBouncer does not do any cross-pool limiting, so you can still have
3*90 server connections open.

max_client_conn is meant as a rescue limit for when something is
completely broken; it should not be hit in normal operation.  40000
would be a reasonable number...

With that many databases, you just need to accept keeping a few
connections open to each db; otherwise you won't get much benefit from
pooling.  So Postgres max_connections should indeed be >= 3*90 for you.
And you can limit idle server connections via server_idle_timeout.
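Put together, that advice might look something like the sketch below in pgbouncer.ini terms. The pool size and max_client_conn values come from this thread; the server_idle_timeout value is an illustrative assumption, not a recommendation:

```ini
[databases]
; wildcard entry: every database gets its own pool
* =

[pgbouncer]
; rescue limit only -- set well above anything normal traffic can reach
max_client_conn = 40000
; per-database pool: with ~90 databases, total server connections can
; reach 90 * default_pool_size, so keep this small and make sure
; Postgres max_connections >= 90 * default_pool_size (here, >= 270)
default_pool_size = 3
; close server connections idle longer than this many seconds, so
; rarely-hit databases release their slots (value is an assumption)
server_idle_timeout = 60
```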

-- 
marko
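On the thread's original question of measuring concurrent backends, a snapshot count from pg_stat_activity is a simple starting point (a sketch; pg_stat_activity and current_setting are standard PostgreSQL, and a pooler like PgBouncer will make these counts reflect pooled server connections rather than client connections):

```sql
-- Total backends versus the configured ceiling
SELECT count(*) AS backends,
       current_setting('max_connections')::int AS max_connections
FROM pg_stat_activity;

-- Per-database counts, useful when many databases share one pooler
SELECT datname, count(*) AS backends
FROM pg_stat_activity
GROUP BY datname
ORDER BY backends DESC;
```

Sampling these periodically (e.g. from cron) would give the upper bound on concurrent connections that the original post was after.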



-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general



