Ok so the solution is to limit the number of connections. But there seems
to be no good way to choose the ideal number of connections, as I don't
know how much RAM a connection will use.
If a connection takes 3MB just after creation (on Windows I see the
process in Process Monitor; on Linux the RSS is more like 5MB), can I
limit its growth so that I know it will not reach 6, 10, or 20MB under
some circumstances?
Not knowing how much RAM the server can take is annoying. You have to be
extra careful and scale the server down, since you don't know what will happen.
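
For what it's worth, here is a rough back-of-envelope way to size
max_connections for a fixed budget. The per-backend figure is an
assumption based on the RSS numbers above plus some work_mem headroom,
not a measurement:

    per_backend     ~ baseline RSS + work_mem headroom ~ 5MB + 4MB = 9MB
    max_connections ~ (total budget - shared_buffers) / per_backend
                    ~ (250MB - 64MB) / 9MB ~ 20

Note that work_mem is allocated per sort/hash step, so a complex query
can use it more than once; this gives an estimate, not a hard ceiling.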
Richard Huxton wrote:
> James Im wrote:
>> Richard Huxton wrote:
>>> Is there a particular problem you're trying to solve?
>>
>> Yes, I'm trying to know how many connections I can open to the database
>> without running out of memory. Ideally I would like to optimize stuff
>> so that I can open the maximum number of connections/sessions.
>
> What - just as many idle connections as you can? You're not interested
> in running queries with them?
>
>> In total I can give 250MB of RAM to PostgreSQL. It should not eat more
>> memory. This is what I'm trying to do.
>
> It hasn't got that sort of hard limit facility. The best you can do is
> to limit the maximum number of connections and then restrict the various
> memory settings, per-backend and shared. That should let you keep it in
> the range you want.
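
To make that suggestion concrete, here is a minimal postgresql.conf
sketch for a ~250MB budget. All the numbers are illustrative assumptions
(and assume a PostgreSQL version that accepts memory units in the config
file); real values have to come from measuring your own workload:

    max_connections = 20         # hard cap on the number of backends
    shared_buffers = 64MB        # shared by all backends
    work_mem = 4MB               # per sort/hash step; one query may use it several times
    maintenance_work_mem = 32MB  # VACUUM, CREATE INDEX, etc.
    temp_buffers = 8MB           # per-backend cap for temporary tables

With these numbers the rough worst case is 64MB shared plus 20 backends
at roughly 9MB each, i.e. about 244MB. Since work_mem is not a hard
per-connection limit, this stays an estimate rather than a guarantee.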