Re: Running PostgreSQL as fast as possible no matter the consequences

"Lello, Nick" <nick.lello@xxxxxxxxxxxxxxx> writes:
> A bigger gain can probably be had if you have a tightly controlled
> suite of queries that will be run against the database and you can
> spend the time to tune each to ensure it performs no sequential scans
> (ie: Every query uses index lookups).
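(As a side note, a minimal sketch of how one might verify that, assuming a
hypothetical "users" table with an index on "email"; look for "Index Scan"
rather than "Seq Scan" in the output:

  -- hypothetical index, for illustration only
  CREATE INDEX users_email_idx ON users (email);

  -- EXPLAIN prints the chosen plan without running the query
  EXPLAIN SELECT * FROM users WHERE email = 'nick@example.com';

Bear in mind the planner will still pick a sequential scan when it
estimates that to be cheaper, e.g. on very small tables.)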

Given a fixed pool of queries, you can prepare them in advance so that
you don't pay the parsing and planning costs on every execution. I've
found that planning is easily more expensive than execution when all
the data fits in RAM.

Enter pgbouncer and preprepare:
  http://wiki.postgresql.org/wiki/PgBouncer
  http://preprepare.projects.postgresql.org/README.html
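
For illustration, a minimal sketch of what preparing a statement by hand
looks like at the SQL level (the statement name, table and column are made
up; preprepare's point is to load such statements automatically when the
connection is opened):

  -- parse and plan once, under a chosen statement name
  PREPARE get_user (int) AS
    SELECT * FROM users WHERE id = $1;

  -- later executions reuse the stored statement, skipping parse/plan
  EXECUTE get_user(42);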

Regards,
-- 
Dimitri Fontaine
http://2ndQuadrant.fr     PostgreSQL : Expertise, Formation et Support

-- 
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance


