
Re: error: url_rewriter


 



On Tue, 13 Oct 2009 23:18:07 -0400, Ross Kovelman
<rkovelman@xxxxxxxxxxxxxxxx> wrote:
> Hello,
> I have an error showing:
> 2009/10/13 22:41:05| WARNING: All url_rewriter processes are busy.
> 2009/10/13 22:41:05| WARNING: up to 1 pending requests queued
> 2009/10/13 22:43:22| WARNING: All url_rewriter processes are busy.
> 2009/10/13 22:43:22| WARNING: up to 6 pending requests queued
> 2009/10/13 22:43:22| Consider increasing the number of url_rewriter
> processes to at least 7 in your config file.
> 2009/10/13 22:43:54| WARNING: All url_rewriter processes are busy.
> 2009/10/13 22:43:54| WARNING: up to 1 pending requests queued
> 2009/10/13 22:43:54| Consider increasing the number of url_rewriter
> processes to at least 2 in your config file.
> 2009/10/13 22:44:42| WARNING: All url_rewriter processes are busy.
> 2009/10/13 22:44:42| WARNING: up to 3 pending requests queued
> 2009/10/13 22:44:42| Consider increasing the number of url_rewriter
> processes to at least 4 in your config file.
> 2009/10/13 23:11:16| WARNING: All url_rewriter processes are busy.
> 2009/10/13 23:11:16| WARNING: up to 1 pending requests queued
> 
> How do I resolve this?  I saw some things from the list and on other
> websites, but those people had a lot more people accessing the server
> than I do.  I am on Squid 2.7.

They also probably had dozens of helpers running. You have precisely one
by the looks of it.

You have hit the same problem anyway. The solutions are the same.

 * Find out whether the re-writer is working too slowly and can be sped up.
 * Follow Squid's advice and increase the number of running helpers.
 * Add some url_rewrite_access rules to bypass the re-writer for requests
where it is not actually needed.
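For the second and third points, a minimal squid.conf sketch might look like the following. The helper path and the ACL domain are placeholders, not taken from the original message; adjust them for the actual re-writer and traffic.

```
# Assumed re-writer path -- replace with the real helper program
url_rewrite_program /usr/local/bin/my-rewriter.pl

# Run more helper instances so requests do not queue behind one busy helper
url_rewrite_children 10

# Hypothetical ACL: only pass requests for these domains to the re-writer,
# letting everything else skip the helper entirely
acl needs_rewrite dstdomain .example.com
url_rewrite_access allow needs_rewrite
url_rewrite_access deny all
```

After editing squid.conf, `squid -k reconfigure` applies the change; watching cache.log afterwards shows whether the "All url_rewriter processes are busy" warnings stop.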

Amos
