RE: Memory Error when using large acl files

Tue 2006-02-28 at 11:18 +0100, Carsten Jensen wrote:
> you can see my config here below.
> As to why: well, I don't want my users to surf porn.
> I use dstdomain_regex because the file contains e.g. sex.com,
> but the homepage can be www.sex.com.

sex.com is NOT a regex that matches just the sex.com domain. Writing a
regex that matches exactly the sex.com domain is quite complex.
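
If you really need a regex form, something along these lines is closer
(an untested sketch: escape the dot, anchor the end of the hostname,
and add -i for a case-insensitive match):

  acl porn_domains dstdom_regex -i (^|\.)sex\.com$

This matches sex.com and its subdomains, and nothing else.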

The dstdom_regex pattern sex.com matches any host name having the
letters s e x, followed by any single character (the unescaped dot),
followed by the letters c o m, anywhere in the hostname component of
the URL. For example, it also matches www.middlesex.com.

The dstdomain pattern .sex.com, on the other hand, is unambiguous and
matches only the domain sex.com and all its subdomains (www.sex.com
etc).
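
In squid.conf that is simply:

  acl porn_domains dstdomain .sex.com

No regex metacharacters to get wrong, and much cheaper to evaluate.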

> 
> If I use .sex.com in the file domains (path below) with the acl dstdomain
> I won't even be able to access www.google.com (where the browser then shows
> page not found)

That should not happen. You should look into why the browser says "Page
not found" with this config.

Anything in cache.log?

> acl porn_domains dstdomain_regex "/usr/local/squid/blacklists/porn/domains"
> http_access deny all porn_domains

This list should be kept pretty short, and take your time to study the
regex language when writing these patterns. Regex lists are quite
memory hungry and very CPU demanding once the list gets large.

The bulk of the list, assuming it is a list of undesired domains,
should go into a dstdomain acl, which is much much friendlier to both
CPU and memory.
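
For example (a sketch only; the second file path and acl name are made
up here, adjust them to your layout):

  acl porn_domains dstdomain "/usr/local/squid/blacklists/porn/domains"
  acl porn_regex dstdom_regex -i "/usr/local/squid/blacklists/porn/regex"
  http_access deny porn_domains
  http_access deny porn_regex

Put the plain .sex.com style entries in the domains file and keep the
regex file as short as you possibly can.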

Regards
Henrik

Attachment: signature.asc
Description: This is a digitally signed message part

