Re: Concurrent Connection Limit

I recompiled squid and reinstalled, and cache.log showed the 2048 file
descriptors there, but z19 still didn't work quite right - it showed about
1400 network connections.

So I rebuilt for 25088 file descriptors (way overkill, heh).

Recompiled again and now it's live. I don't know if it's perfect, but
right now (and I'm not kidding) it's showing 29322 network sockets open.


I'm baffled. Something doesn't jibe. And the CPU load spiked to 6.0 with
that many sockets open.

So I set ulimit to 999999 (1 million, basically) and rebuilt squid again.

According to cache.log it caps at 32768 file descriptors - it won't load
the 9999999, so it appears to be a hard limit indeed! Anyway, netstat -vatn
shows only 3352 open sockets right now, but z19 isn't responding well, and
within a few minutes I stopped it and went back to http mode...

What should I do next?
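For reference, a minimal sketch of checking the limits in play here (assuming a Linux box and a bash shell; note that a hard ceiling like the 32768 above is usually a compile-time maximum baked into the squid binary, which ulimit alone can't raise):

```shell
# Soft and hard per-process limits on open files in this shell;
# a daemon started from this shell inherits the soft limit.
ulimit -Sn
ulimit -Hn

# System-wide ceiling on open file handles (Linux).
cat /proc/sys/fs/file-max

# Raise the soft limit up to the hard limit (allowed without root),
# then start squid from this shell so it inherits the new value.
ulimit -Sn "$(ulimit -Hn)"
ulimit -Sn
```

Whatever the shell reports, the descriptor count that squid logs in cache.log at startup is the number it was actually built and started with, so that is the one to trust.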

On 7/12/05, Rafael.Almeida@xxxxxxxxxx <Rafael.Almeida@xxxxxxxxxx> wrote:
> Have you checked your file descriptors? Each socket uses one file
> descriptor, and on Linux the default limit is 1024. You can increase it
> if you need to.
> Take a look here:
> http://www.onlamp.com/pub/a/onlamp/2004/03/25/squid.html?page=2
> 
> Hope that helps.
> 
> Rafael Sarres de Almeida
> Seção de Gerenciamento de Rede
> Superior Tribunal de Justiça
> Tel: (61) 319-9342
> 
> 
> 
> 
> 
> Jeffrey Ng <jeffreyn@xxxxxxxxx>
> 11/07/2005 15:12
> Please reply to
> Jeffrey Ng <jeffreyn@xxxxxxxxx>
> 
> 
> To
> squid-users@xxxxxxxxxxxxxxx
> cc
> 
> Subject
> Re:  Concurrent Connection Limit
> 
> 
> 
> 
> 
> 
> Hello? Does anybody know what's wrong?
> 
> On 7/10/05, Joshua Goodall <joshua@xxxxxxxxxxxxxx> wrote:
> > On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > > Hi, I have a problem with the squid web accelerator on my site. My site
> > > is a photo-sharing site like webshots. It has a pretty busy load, so I
> > > decided that squid might be able to ease the load on my image server by
> > > caching some of the images. We have set everything up and it uses 1GB of
> > > RAM. It was fine at first, but suddenly all the images stopped loading
> > > after 6 hours. I checked netstat and found that there are 1000
> > > connections from outside, and squid stops responding whenever the
> > > connections hit that number. I am pretty sure that squid has a
> > > concurrent connection limit of 1000. How could I increase that limit?
> > > Any help is appreciated. Thank you!
> >
> > Sounds like you're running out of file descriptors.
> > See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
> >
> > - Joshua.
> >
> > --
> > Joshua Goodall                           "as modern as tomorrow afternoon"
> > joshua@xxxxxxxxxxxxxx                                       - FW109
> >
> 
> 
>

