
Re: Setting up a database for 10000 concurrent users

> I think you're being horribly optimistic if you actually want 10,000 concurrent connections with users all doing things. Even if you only allow 1MB for each connection, that's 10GB of RAM you'd want, plus a big chunk more to actually cache your database files and do work in. Then, if you had 10,000 concurrent queries, you'd probably want a mainframe to handle all the concurrency, or perhaps a 64-CPU box would suffice...

> You probably want to investigate connection pooling, but if you say what you want to achieve, then people will be able to suggest the best approach.


I know I'm on thin ice :)

Actually, that was meant as an upper limit; I want to test how far I can tweak the server.
The clients are doing almost nothing most of the time, maybe one insert
every 2 minutes each. Of course, across 10,000 clients that still adds up to more than 80 inserts per second.
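For reference, the arithmetic behind that figure:

    10,000 clients x 1 insert / 120 s  ~=  83 inserts/s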

I'm connecting to the database via JDBC, where connection pooling is possible and is also being considered.
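To make that more concrete, this is roughly the kind of pooling I have in mind; a minimal hand-rolled sketch using only standard JDK classes (the URL, credentials, and pool size are placeholders, and a real deployment would use an established pooling library instead):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // A fixed-size pool: many clients share a few physical connections
    // instead of each client holding its own server backend.
    public class SimplePool {
        private final BlockingQueue<Connection> pool;

        public SimplePool(String url, String user, String pass, int size)
                throws SQLException {
            pool = new ArrayBlockingQueue<Connection>(size);
            for (int i = 0; i < size; i++) {
                pool.add(DriverManager.getConnection(url, user, pass));
            }
        }

        // Blocks until a connection is free.
        public Connection take() throws InterruptedException {
            return pool.take();
        }

        // Callers must hand the connection back when done with it.
        public void release(Connection c) {
            pool.add(c);
        }
    }

With something like 50 pooled connections, 10,000 mostly idle clients would only ever touch a handful of real server backends at a time.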

I haven't been able to find out how much memory I can expect each connection to consume, so I thought testing would be more accurate than calculating.
Does one connection really need 1MB of RAM?
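If it helps, here is the rough test I had in mind; a sketch assuming the PostgreSQL JDBC driver, with the URL and credentials as placeholders. It simply holds N idle connections open so the per-backend memory can be read off the server with ps or top (the server's max_connections would have to be raised first for large N):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.ArrayList;
    import java.util.List;

    public class ConnectionMemoryTest {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; older JDBC setups may also
            // need Class.forName("org.postgresql.Driver") first.
            String url = "jdbc:postgresql://localhost/testdb";
            int n = Integer.parseInt(args[0]);
            List<Connection> conns = new ArrayList<Connection>();
            for (int i = 0; i < n; i++) {
                conns.add(DriverManager.getConnection(url, "user", "pass"));
                if (i % 100 == 0)
                    System.out.println("open connections: " + i);
            }
            System.out.println("Holding " + n + " connections open; "
                    + "measure backend memory on the server now.");
            Thread.sleep(Long.MAX_VALUE); // keep connections alive
        }
    }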


Poul

