Re: Concurrent Connection Limit

I tried your suggestion and double-checked my server just now. The
server hits roughly 1030 total network sockets and then squid stops
responding on port 80 [as if its network socket limit is 1024].

File descriptors would only be the bottleneck if we had cache objects
being written as fast as network sockets are opened (we don't), but I
did raise the file descriptor limit to 2048 to test whether that
became the new magic number that halts port 80 traffic, and it still
locked up at 1030, I'm afraid. What else could be the problem?

Thank you!

Jeffrey Ng

On 7/10/05, Joshua Goodall <joshua@xxxxxxxxxxxxxx> wrote:
> On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > Hi, I have a problem with squid web accelerator on my site. My site is a
> > photo sharing site like webshots. It has a pretty busy load, so I
> > decided that squid may be able to soothe the load on my image server by
> > caching some of the images. We have set everything up and it uses 1GB
> > RAM. It was fine at first. But suddenly all the images stopped loading
> > after 6 hours. I checked netstat and found that there are 1000
> > connections from outside, and squid stops responding whenever the
> > connections hit that number. I am pretty sure that squid has a
> > concurrent connection limit of 1000. How could I increase that limit?
> > Any help is appreciated. Thank you!
> 
> Sounds like you're running out of file descriptors.
> See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
> 
> - Joshua.
> 
> --
> Joshua Goodall                           "as modern as tomorrow afternoon"
> joshua@xxxxxxxxxxxxxx                                       - FW109
>

