Re: All url_rewriter processes are busy x Too many open files


 



On Wed, Apr 2, 2008 at 1:18 AM, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:

> > Is there any way to increase SQUID_MAXFD from 8192 to 65536, so I can
> > try using the suggested number of url_rewriter processes?
> >
>
>  Squid 2.6: --with-maxfd=65536
>  Squid 3.x: --with-filedescriptors=65536

At the time I was not sure this would work, but I recompiled Squid
with this option and it is working now.
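For anyone else hitting the same limit, the rebuild looks roughly like this on Squid 2.6 (paths, prefix, and the exact value are assumptions; adjust to your setup). Raising the shell's own limit first matters, since configure can otherwise clamp the compiled-in table back down:

```shell
# Raise the hard and soft fd limit for this shell before configuring,
# so the build environment matches the target (65536 is the value
# discussed in this thread)
ulimit -HSn 65536

# Squid 2.6: override the compiled-in SQUID_MAXFD
./configure --with-maxfd=65536
# Squid 3.x would use: ./configure --with-filedescriptors=65536

make && make install
```

Note that the runtime process must also be started under a matching ulimit (e.g. via the init script), or the larger compiled-in table goes unused.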

>  For our info, you say you are handling thousands of users;
>   and what release of squid is it?

Gentoo Linux 2007.0
Kernel 2.6.20.14
TProxy 2.0.6
Squid 2.6.STABLE17

>   what request/sec load is your squid maxing out at?

   Number of clients accessing cache:      3493
   Average HTTP requests per minute since start:   1882.1
   client_http.requests = 367.345547/sec
   Maximum number of file descriptors:   16384
   Largest file desc currently in use:   13115
   Number of file desc currently in use: 12885
   Available number of file descriptors: 3499
   cpu_usage = 20.541035%
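Those counters come from Squid's cache manager (`squidclient mgr:info`). The descriptor figures can be cross-checked against the kernel directly; a small sketch (the process name `squid` is an assumption about how the binary is installed):

```shell
# Kernel-wide ceiling on open files across all processes
cat /proc/sys/fs/file-max

# Per-process soft limit that child processes inherit from this shell
ulimit -n

# Descriptors actually held open by a running Squid, if one is up
pid=$(pidof squid 2>/dev/null | awk '{print $1}')
if [ -n "$pid" ]; then
    ls "/proc/$pid/fd" | wc -l
fi
```

Comparing the last number against `ulimit -n` shows how close the process is to exhaustion, which is what the "Too many open files" errors in the subject line indicate.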

>  Please use Squid 2.6STABLE19 or 3.0STABLE4

I had lots of problems matching kernel, Squid, and TProxy versions, but
I will try to upgrade to 2.6.STABLE19.

Thanks for your help,
Marcio.
