Per Jessen wrote:
Chris wrote:
Err yes - a query answered out of the cache will take less time to
complete, therefore the connection will be given up faster, therefore
fewer _concurrent_ connections.
I guess my idea of concurrency is different to yours.
http://en.wiktionary.org/wiki/concurrent
"Happening at the same time; simultaneous."
100 people come to your website - that's still going to be 100
connections to the database, regardless of where the results come
from.
But if that is one every hour for a hundred hours, your max concurrency
is 1.
I'm talking about "now" - not over any time span. 100 connections at the
same time. If you get 100 people to your website at exactly the same
time, the query cache may be used - but you're still going to have 100
connections to your database.
The OP wanted to cut that down, in which case the answer is to cache
things server-side using something like memcached or even one of the
PEAR caching modules.
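
A rough sketch of what I mean, assuming a memcached daemon on
localhost:11211 and a hypothetical fetch_users_from_db() helper - only
a cache miss ever touches the database, so repeat visitors don't each
open a connection:

<?php
// Serve repeated reads from memcached; only cache misses hit the DB.
// Assumes memcached is running on 127.0.0.1:11211.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key   = 'users:active';
$users = $cache->get($key);

if ($users === false && $cache->getResultCode() === Memcached::RES_NOTFOUND) {
    // Cache miss: query the database once, then cache for 5 minutes.
    $users = fetch_users_from_db(); // hypothetical DB helper
    $cache->set($key, $users, 300);
}

// $users is usable here whether it came from the cache or the database.
?>

Same idea works with the PEAR Cache_Lite package writing to local
files, if installing memcached isn't an option.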
--
Postgresql & php tutorials
http://www.designmagick.com/