Re: limiting connections


On 26/03/2012 23:13, Carlos Manuel Trepeu Pupo wrote:

#!/bin/bash
# Count the active requests whose URI matches the first argument.
result=$(squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1")
if [ "$result" -eq 0 ]; then
	echo 'OK'
else
	echo 'ERR'
fi
The code should be something like this:

#!/bin/bash
# An external ACL helper must run in a loop: Squid writes one request
# per line on stdin and expects one OK/ERR answer per line on stdout.
while read -r line; do
	result=$(squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line")
	if [ "$result" -eq 0 ]; then
		echo 'OK'
	else
		echo 'ERR'
	fi
done

But as I was looking at mgr:active_requests I noticed that Squid responds very slowly, and it can sometimes take a while to get an answer from it.

Regards,
Eliezer




# If I have the same URI then I deny it. I made a few tests and it works
for me. The problem is when I add the rule to Squid. I did this:

acl extensions url_regex "/etc/squid3/extensions"
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where the extensions file contains:

\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, Squid stops working.

What can be happening???
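A hedged sketch of a safer configuration for this setup (the children/ttl values below are illustrative assumptions, not from the thread): starting several helper children lets lookups overlap so one slow helper does not stall Squid, and disabling result caching makes each lookup reflect the current active-request list:

```
# Sketch only: assumes Squid 3.x external_acl_type option syntax.
# children=5 lets lookups run in parallel; ttl=0 / negative_ttl=0 stop
# Squid from caching per-URL answers that must stay current.
external_acl_type one_conn children=5 ttl=0 negative_ttl=0 %URI /home/carlos/script
acl extensions url_regex "/etc/squid3/extensions"
acl limit external one_conn
http_access deny extensions limit
```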


* The helper needs to be running in a constant loop.
You can find an example at
http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer; for an external ACL you do need to keep the
OK/ERR responses.

Sorry, this is my first helper. I do not understand what "running in a
constant loop" means; in the example I see something like what I do.
Making some tests I found that without this line:
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
the helper does not crash. It does not work either, but it does not crash,
so I consider that line to be the problem in some way.
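The "constant loop" requirement can be sketched as follows: Squid starts the helper process once, writes one request line per lookup, and reads one OK/ERR answer per line; the helper must keep reading until Squid closes the pipe, never exiting after a single answer. In this sketch, handle_request is a hypothetical placeholder for the real squidclient check, not the original script:

```shell
#!/bin/bash
# Minimal external ACL helper skeleton: Squid starts this process once
# and feeds it one line per lookup; it must answer OK/ERR per line and
# keep looping until stdin is closed.

handle_request() {
    # Placeholder decision logic; the real helper would query
    # mgr:active_requests here and count matching URLs.
    case "$1" in
        *blocked*) echo ERR ;;
        *)         echo OK  ;;
    esac
}

while read -r url; do
    handle_request "$url"
done
```

A one-shot script (read one argument, print one answer, exit) makes Squid see the helper die after the first lookup, which matches the "helper crashes" symptom described above.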


* "eq 0" - there should always be 1 request matching the URL, which is the
request you are testing. You want to deny the case where there are *2*
requests in existence.
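That point can be put in code form: the request being checked is itself listed in mgr:active_requests, so one match means "no duplicate" and the comparison should be against 1, not 0. A hedged sketch, where decide() stands in for the if/else in the helper and the count would come from the squidclient | grep -c pipeline:

```shell
#!/bin/bash
# Decide OK/ERR from the number of active requests matching a URL.
# The request under test is itself one match, so a single match means
# no duplicate; deny only when a second concurrent request exists.
decide() {
    local count=$1
    if [ "$count" -le 1 ]; then
        echo OK
    else
        echo ERR
    fi
}

decide 1   # only the URL's own request -> prints OK
decide 2   # a second, duplicate request -> prints ERR
```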

That is true, but the way I saw it was: "if the URL does not exist, it
cannot be duplicated". I don't think that is wrong!!


* ensure you have manager requests from localhost not going through the ACL
test.
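That suggestion follows the standard squid.conf ordering; a sketch, assuming the stock manager/localhost ACL definitions of Squid 3.1-era configs:

```
# Sketch: let cache-manager requests from localhost bypass the limiting
# ACLs by matching them before any deny rule that calls the helper.
acl manager proto cache_object
acl localhost src 127.0.0.1/32
http_access allow manager localhost
http_access deny manager
# ...only after these lines should the limiting rule appear:
# http_access deny extensions limit
```

Otherwise each squidclient call from the helper is itself tested by the external ACL, which calls the helper again, and the lookups deadlock.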

I was doing this wrong: localhost was going through the ACL, but
I just changed it!!! The problem persists. What can I do???



Amos



--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer <at> ngtech.co.il

