
Re: error: url_rewriter


 



On Wed, 14 Oct 2009 00:33:58 -0400, Ross Kovelman
<rkovelman@xxxxxxxxxxxxxxxx> wrote:
> Sounds good, but what if this is for all sites?  I was only testing
> Facebook, MySpace, CNN, etc.  I did not go to YouTube at all.
> 
> Thanks

The YouTube bit is irrelevant ...

> 
>> From: Chudy Fernandez <chudy_fernandez@xxxxxxxxx>
>> Date: Tue, 13 Oct 2009 21:26:42 -0700 (PDT)
>> To: Ross Kovelman <rkovelman@xxxxxxxxxxxxxxxx>
>> Subject: Re:  error: url_rewriter
>> 
>> About the concurrency, this should give you the idea:
>> http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube/Discussion
>> 

The better reference is:

http://wiki.squid-cache.org/Features/Redirectors#How_do_I_make_it_concurrent.3F
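
(For illustration only: a minimal concurrency-aware rewriter could look
like the Python sketch below. It assumes url_rewrite_concurrency is set
in squid.conf, so every request line carries a numeric channel ID that
must be echoed back in front of the reply; the rewrite() function is a
hypothetical placeholder, not something from this thread.)

  #!/usr/bin/env python
  # Sketch of a concurrency-capable url_rewrite helper.
  # With url_rewrite_concurrency > 0, Squid prefixes each input line with
  # a channel ID and expects that same ID in front of the reply.
  import sys

  def rewrite(url):
      # Hypothetical placeholder: return the URL unchanged.
      return url

  for line in sys.stdin:
      fields = line.split()
      if len(fields) < 2:
          continue
      channel, url = fields[0], fields[1]  # "ID URL ip/fqdn ident method ..."
      sys.stdout.write("%s %s\n" % (channel, rewrite(url)))
      sys.stdout.flush()  # reply immediately; never buffer output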


>> 
>> ----- Original Message ----
>>> From: Ross Kovelman <rkovelman@xxxxxxxxxxxxxxxx>
>>> To: Chudy Fernandez <chudy_fernandez@xxxxxxxxx>;
>>> "squid-users@xxxxxxxxxxxxxxx" <squid-users@xxxxxxxxxxxxxxx>
>>> Sent: Wed, October 14, 2009 12:21:57 PM
>>> Subject: Re:  error: url_rewriter
>>> 
>>> This is what I have related to "children"
>>> 
>>> redirect_children 1


The redirect_* options are obsolete; they have been renamed to
url_rewrite_*.

That should now be "url_rewrite_children 1", though 5 is the default and
you may want to use that instead.
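
For illustration, the relevant squid.conf lines could look like this
(the numbers are arbitrary examples, not recommendations from the thread):

  # Either run more helper processes ...
  url_rewrite_children 5

  # ... or keep fewer processes and let each one handle several
  # requests at once (requires a concurrency-aware helper):
  url_rewrite_children 1
  url_rewrite_concurrency 10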

Amos


>>> auth_param basic children 5
>>> 
>>> Thanks
>>> 
>>>> From: Chudy Fernandez
>>>> Date: Tue, 13 Oct 2009 20:49:50 -0700 (PDT)
>>>> To: Ross Kovelman
>>>> Subject: Re:  error: url_rewriter
>>>> 
>>>> There are two ways to solve that: increase the number of children, or
>>>> change your rewriter to accept parallel requests and increase the
>>>> concurrency (which is better).
>>>> 
>>>> 
>>>> 
>>>> ----- Original Message ----
>>>>> From: Ross Kovelman
>>>>> To: "squid-users@xxxxxxxxxxxxxxx"
>>>>> Sent: Wed, October 14, 2009 11:18:07 AM
>>>>> Subject:  error: url_rewriter
>>>>> 
>>>>> Hello,
>>>>> I have an error showing:
>>>>> 2009/10/13 22:41:05| WARNING: All url_rewriter processes are busy.
>>>>> 2009/10/13 22:41:05| WARNING: up to 1 pending requests queued
>>>>> 2009/10/13 22:43:22| WARNING: All url_rewriter processes are busy.
>>>>> 2009/10/13 22:43:22| WARNING: up to 6 pending requests queued
>>>>> 2009/10/13 22:43:22| Consider increasing the number of url_rewriter
>>>>> processes to at least 7 in your config file.
>>>>> 2009/10/13 22:43:54| WARNING: All url_rewriter processes are busy.
>>>>> 2009/10/13 22:43:54| WARNING: up to 1 pending requests queued
>>>>> 2009/10/13 22:43:54| Consider increasing the number of url_rewriter
>>>>> processes to at least 2 in your config file.
>>>>> 2009/10/13 22:44:42| WARNING: All url_rewriter processes are busy.
>>>>> 2009/10/13 22:44:42| WARNING: up to 3 pending requests queued
>>>>> 2009/10/13 22:44:42| Consider increasing the number of url_rewriter
>>>>> processes to at least 4 in your config file.
>>>>> 2009/10/13 23:11:16| WARNING: All url_rewriter processes are busy.
>>>>> 2009/10/13 23:11:16| WARNING: up to 1 pending requests queued
>>>>> 
>>>>> How do I resolve this?  I saw some things from the list and on other
>>>>> websites, but those people had a lot more users accessing the server
>>>>> than I do.  I am on Squid 2.7.
>>>>> 
>>>>> Thanks
>>>> 
>>>> 
>>>> 
>>>>      
>> 
>> 
>> 
>>
