Re: Setting up of a large database

On Fri, May 16, 2008 at 11:50 AM, Roberto Edwins
<robertoedwins@xxxxxxxxx> wrote:
> What are the values that should go in postgresql.conf for a large
> database, a couple of whose tables will contain over 100 million
> records?

Well, that really kind of depends.  Is this a database handling lots
of users with small, bank-style transactions?  Or is it designed to
hold text-type content for millions of users, but with only a handful
accessing it at once?  Or maybe it's a reporting database full of
statistical data that will be accessed by one or two people at a time
running huge, ugly queries.

It really all kind of depends.

A good idea is to keep max_connections no higher than you have to.
Set shared_buffers to around 25% of total memory to start, and set
work_mem to about 8MB or so.  Note that it's easy to use up a lot of
memory fast with higher work_mem settings, because each sort / hash
aggregate etc. can use up to work_mem worth of memory.  100 users
running 2 sorts each would mean 200 * 8MB = 1600MB with work_mem set
to 8MB.
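As a concrete starting point, here's a minimal postgresql.conf sketch
for a hypothetical dedicated server with 8 GB of RAM -- the numbers
are assumptions to illustrate the sizing above, not recommendations
for any particular workload:

```ini
# Assumes a dedicated box with 8 GB of RAM -- adjust for your hardware.
max_connections = 100     # keep this no higher than you actually need
shared_buffers = 2GB      # ~25% of total memory as a starting point
work_mem = 8MB            # per sort/hash op; 100 clients * 2 sorts
                          # each = 200 * 8MB = 1600MB worst case
```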

Most importantly, make small incremental changes and benchmark with a
realistic load; it's the only way to be sure.
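For the benchmarking step, pgbench (in contrib) is one way to get a
repeatable load, though nothing beats replaying your real queries.
The database name, scale factor, and client counts below are just
placeholders:

```shell
# Initialize a test database at scale factor 100
# (~10 million rows in pgbench_accounts, roughly 1.5 GB)
pgbench -i -s 100 testdb

# Run 10 concurrent clients, 1000 transactions each,
# and compare the reported tps between config changes
pgbench -c 10 -t 1000 testdb
```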

