
Re: High Memory Usage

Sekar wrote:
Hello All,

I have a list of URLs to be blocked in a text file, about 4 MB in size. Squid has been configured to use this file to match requested URLs; the configuration is given below.

Hi Sekar,

A 4 MB text file! At roughly 100 characters per URL, that is well over 30 thousand entries.


acl blocked url_regex '/usr/local/squid/blocked.txt'
http_access deny blocked

This works as expected, but Squid uses a high percentage of memory. Because of that, there is a lot of swapping going on, and it also consumes considerable CPU. I suppose it has something to do with "url_regex".

Using dstdomain in place of url_regex, if applicable, will probably reduce your overall CPU usage, since every request has to be matched against each regex one by one, whereas dstdomain does an indexed lookup on the destination domain.
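For example, if your list can be reduced to plain domain names (one per line, with a leading dot to include subdomains), something like the following should work; the file name here is just an illustration:

acl blocked_domains dstdomain "/usr/local/squid/blocked_domains.txt"
http_access deny blocked_domains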



Please guide me to solve this memory usage issue.

Which OS and Squid version are you using? How much physical memory does your server have?

What is the cache_dir setting in your squid.conf?
You can try reducing your L1 and L2 directory counts.
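For reference, the directive has the form "cache_dir ufs Directory-Name Mbytes L1 L2"; the path and values below are only placeholders, not a recommendation:

cache_dir ufs /usr/local/squid/var/cache 2000 16 256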

As far as I know, you can reduce memory usage slightly by lowering your cache_mem setting.

What's the cache_mem setting in your squid.conf?

I generally use the following on a P4 machine with 1 GB of memory:

cache_mem 64 MB

but your needs and mileage may differ.

You can try using the following directives:

client_persistent_connections off
half_closed_clients off
client_db off

You can also try reducing memory usage with the tcp_recv_bufsize directive.
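For example, something like the following caps the per-socket TCP receive buffer; 64 KB is just an illustrative value (the default of 0 leaves the OS default in place):

tcp_recv_bufsize 64 KB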

For further details on memory usage, have a look at the Squid FAQ section below:

http://wiki.squid-cache.org/SquidFaq/SquidMemory

By the way, if nothing helps, you may need to add more physical memory. :)

Thanking you...




Thanks in Advance,
Sekar




