
Re: Concurrent Connection Limit


 



Have you checked your file descriptors? Each socket uses one file 
descriptor, and on Linux the default limit is 1024 per process. You can 
increase it if you need to.
Take a look here:
http://www.onlamp.com/pub/a/onlamp/2004/03/25/squid.html?page=2 
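For reference, a quick way to check and raise the per-process descriptor limit from the shell before starting Squid (the 8192 value below is just an illustrative example, not a recommendation):

```shell
# Show the current soft limit on open file descriptors for this shell;
# every TCP connection Squid accepts consumes one descriptor.
ulimit -Sn

# Show the hard limit -- the ceiling an unprivileged process may raise to.
ulimit -Hn

# As root (or within the hard limit), raise both soft and hard limits
# in the shell that launches Squid; 8192 is only an example value.
# ulimit -HSn 8192
```

Note that the limit is inherited from the shell that starts the daemon, so it must be raised in the init script (or system-wide) rather than in a login shell after the fact.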

Hope that helps.

Rafael Sarres de Almeida
Network Management Section
Superior Tribunal de Justiça
Tel: (61) 319-9342





Jeffrey Ng <jeffreyn@xxxxxxxxx> 
11/07/2005 15:12
Please reply to
Jeffrey Ng <jeffreyn@xxxxxxxxx>


To
squid-users@xxxxxxxxxxxxxxx
cc

Subject
Re:  Concurrent Connection Limit






Hello? Does anybody know what's wrong?

On 7/10/05, Joshua Goodall <joshua@xxxxxxxxxxxxxx> wrote:
> On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > Hi, I have a problem with the squid web accelerator on my site. My site is a
> > photo sharing site like webshots. It has a pretty busy load, so I
> > decided that squid might be able to ease the load on my image server by
> > caching some of the images. We have set everything up and it uses 1GB
> > RAM. It was fine at first. But suddenly all the images stopped loading
> > after 6 hours. I checked netstat and found that there are 1000
> > connections from outside, and squid stops responding whenever the
> > connections hit that number. I am pretty sure that squid has a
> > concurrent connection limit of 1000. How could I increase that limit?
> > Any help is appreciated. Thank you!
> 
> Sounds like you're running out of filedescriptors.
> See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
> 
> - Joshua.
> 
> --
> Joshua Goodall                           "as modern as tomorrow afternoon"
> joshua@xxxxxxxxxxxxxx                                       - FW109
>



