On Mon, 21 Nov 2011 16:44:04 +0100, Leonardo wrote:
<snip>
I'd also look at what Squirm is doing and try to reduce a few things:

* the number of helper lookups, with url_rewrite_access directive ACLs.

* the work Squid does handling responses, by sending an empty response
  back for "no-change", and using 3xx redirect responses instead of
  re-write responses.
You may also be able to remove some uses of Squirm entirely by using
deny_info redirection.
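[Editorial sketch, not from the original mail: assuming the classic
line-based Squid redirector protocol, the two response formats Amos
mentions can be illustrated like this. An empty line tells Squid "no
change" (the cheapest reply for it to process), and a "302:" prefix asks
Squid to send the client an HTTP redirect instead of silently
re-writing the request.]

```python
def helper_response(original_url, rewritten_url):
    """Build a reply line for the classic Squid redirector protocol.

    Empty string = "no change" (Squid does no extra work).
    "302:<url>" = tell the client to redirect, instead of a re-write.
    """
    if rewritten_url == original_url:
        return ''                    # empty response: no change
    return '302:' + rewritten_url    # 3xx redirect instead of a re-write
```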
I use Squirm uniquely to force SafeSearch on Google via these regex
patterns:
regexi ^(http://www\.google\..*/search\?.*) \1&safe=active
regexi ^(http://www\.google\..*/images\?.*) \1&safe=active
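[Editorial sketch, not from the original mail: what those two Squirm
"regexi" rules do, expressed in Python. "regexi" matches
case-insensitively, and \1 is the captured URL, so a matching search or
images URL simply gets "&safe=active" appended.]

```python
import re

# The same two patterns as the Squirm rules above, case-insensitive.
PATTERNS = [
    re.compile(r'^(http://www\.google\..*/search\?.*)', re.IGNORECASE),
    re.compile(r'^(http://www\.google\..*/images\?.*)', re.IGNORECASE),
]

def force_safe_search(url):
    """Return the rewritten URL, or the original if no rule matches."""
    for pattern in PATTERNS:
        m = pattern.match(url)
        if m:
            return m.group(1) + '&safe=active'
    return url
```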
Also in the squid.conf:
acl toSquirm url_regex ^http://www\.google\..*/(search|images)\?
url_rewrite_access allow toSquirm
url_rewrite_access deny all
... will make Squid not even bother to send URLs to Squirm if they won't
be changed, meaning your total traffic can be higher without
bottlenecking at the URL-rewrite step.
Hmmm... now I am wondering whether I could achieve the same effect
with a Perl script called via redirect_program...
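[Editorial sketch, not from the original mail: if you did write your own
redirect_program helper, it could look something like this. Python is
used here purely for illustration, and the classic line-based redirector
protocol is assumed: one request per line with the URL as the first
field, one reply line per request, and an explicit flush after every
reply so Squid is never left waiting on a buffered answer. The
"safe=active" check is an extra safeguard, added here to avoid a
redirect loop once the rewritten URL itself matches the pattern.]

```python
import re
import sys

SEARCH = re.compile(r'^(http://www\.google\..*/(?:search|images)\?.*)',
                    re.IGNORECASE)

def handle_line(line):
    """One request per line; the URL is the first whitespace field."""
    url = line.split()[0]
    m = SEARCH.match(url)
    if m and 'safe=active' not in url:
        # 302 redirect, with SafeSearch forced on.
        return '302:' + m.group(1) + '&safe=active'
    return ''  # empty reply: leave the URL unchanged

if __name__ == '__main__':
    for request in sys.stdin:
        print(handle_line(request))
        sys.stdout.flush()  # never buffer: Squid waits for each reply
```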
You could. Squirm is faster than the Perl alternatives, IIRC.
Amos