
Re: High CPU Utilization


 



> From: Amos Jeffries <squid3@xxxxxxxxxxxxx>
> Date: Tue, 20 Oct 2009 00:14:58 +1300
> To: Ross Kovelman <rkovelman@xxxxxxxxxxxxxxxx>
> Cc: "squid-users@xxxxxxxxxxxxxxx" <squid-users@xxxxxxxxxxxxxxx>
> Subject: Re:  High CPU Utilization
> 
> Ross Kovelman wrote:
>>> From: Amos Jeffries <squid3@xxxxxxxxxxxxx>
>>> Date: Mon, 19 Oct 2009 18:14:33 +1300
>>> Cc: "squid-users@xxxxxxxxxxxxxxx" <squid-users@xxxxxxxxxxxxxxx>
>>> Subject: Re:  High CPU Utilization
>>> 
>>> Ross Kovelman wrote:
>>>> Any reason why I would have high CPU utilization, avg around 90%?  I did
>>>> build it for PPC although I do have a large dstdomain list which contains
>>>> URL's that are not allowed on the network.  It is a Mac G4 dual 1.33.  This
>>>> is with no load, or I should say no users on the network.
>>>> 
>>>> Thanks
>>>> 
>>> Could be a few things:
>>> 
>>>   * bug 2541 (except latest 3.0 and 3.1 releases)
>>> 
>>>   * lots of regex patterns
>>> 
>>>   * garbage collection of the various caches
>>> 
>>>   * UFS storage system catching up after a period of load
>>> 
>>>   * memory swapping
>>> 
>>>   * RAID
>>> 
>>>   * ... any combination of the above.
>>> 
>>> If you have the strace tool available you can look inside Squid and see.
>>>   Or use "squid -k debug" to toggle full debug on/off for a short
>>> period and trawl the cache.log afterwards.
> 
> 
>> Amos,
>> 
>> I am not using RAID, although my single-drive performance might be slow;
>> I will need to check the I/O. Whenever I run squid or make any changes to
>> the config I get a lot of:
>> 
>> 2009/10/16 14:44:08| WARNING: You should probably remove 'xxx.com' from the
>> ACL named 'bad_url'
>> 2009/10/16 14:44:08| WARNING: 'xxx.com' is a subdomain of 'xxx.com'
>> 2009/10/16 14:44:08| WARNING: because of this 'xxx.com' is ignored to keep
>> splay tree searching predictable
>> 2009/10/16 14:44:08| WARNING: You should probably remove 'xxx.com' from the
>> ACL named 'bad_url'
>> 
>> Would this by chance do it?  There are about 22,000 sites in the bad_url
>> file.
> 
> I don't think so. Those warnings are produced as Squid prunes the
> duplicate entries out of the ACL by itself.
> 
> You can get rid of the duplicates and sub-domains manually to reduce the
> warnings.
> 
> 
> Amos
> -- 
> Please be using
>    Current Stable Squid 2.7.STABLE7 or 3.0.STABLE19
>    Current Beta Squid 3.1.0.14

Amos,

It looked to be a permission issue, as Squid would crash and restart.

Thanks
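
The manual pruning Amos suggests (dropping exact duplicates and entries already covered by a parent domain) can be scripted. Below is a minimal sketch; the one-domain-per-line file format is taken from the thread, but the function name and the handling of leading dots are my own assumptions, and real dstdomain lists may need leading-dot entries preserved for Squid's subdomain matching.

```python
# Hypothetical sketch: prune exact duplicates and entries covered by a
# parent domain from a dstdomain list, so Squid stops warning about
# subdomains it would ignore anyway. Leading dots are stripped here as a
# simplification; Squid itself treats ".example.com" specially.

def prune_dstdomains(domains):
    """Return a sorted list with duplicates and subdomains of other
    entries removed (e.g. 'ads.example.com' is covered by 'example.com')."""
    cleaned = set(d.strip().lstrip('.').lower() for d in domains if d.strip())
    kept = []
    # Sorting by label count keeps parent domains ahead of their subdomains.
    for d in sorted(cleaned, key=lambda d: d.count('.')):
        if not any(d == p or d.endswith('.' + p) for p in kept):
            kept.append(d)
    return sorted(kept)

if __name__ == '__main__':
    sample = ['example.com', 'ads.example.com', 'example.com', 'other.net']
    print(prune_dstdomains(sample))  # ['example.com', 'other.net']
```

Run against the bad_url file and written back out, this would silence the "is a subdomain of" warnings without changing which requests the ACL blocks.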


