Re: Filters in terminal but not in Browser

On 2/12/2011 8:49 a.m., Paul Crown wrote:
On 11/30/2011 06:41 PM, Amos Jeffries wrote:
On Wed, 30 Nov 2011 17:07:54 -0600, Paul Crown wrote:
Greetings,

I feel I am missing something simple.  I have installed squid3 on
Ubuntu.  I added

acl allow_domains dstdomain "/etc/squid3/always_direct.acl"
always_direct allow allow_domains

acl denied_domains dstdomain "/etc/squid3/denied_domains.acl"
http_access deny denied_domains

and populated both files accordingly, and restarted squid3.

Now from a terminal, curl good-url and it works.  curl bad-url and it
gives me the blocked message.

Try it in firefox, and good-url and bad-url both work fine.  Neither is
blocked.

What did I forget?

Thanks.

Paul
You are missing two details:

Firstly, http_access and always_direct are completely unrelated controls.
  - http_access determines whether Squid is allowed to service the request at all.
  - always_direct determines whether Squid MUST (versus MAY) service the request itself, using DNS lookups and going directly to the public origin server(s) rather than via a cache_peer.
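
In terms of the config lines you quoted, the distinction is roughly this (the directives are yours, only the comments are added for illustration):

  # http_access is the filter: it decides whether Squid serves the request at all
  http_access deny denied_domains

  # always_direct only affects routing: it forces matching requests to go straight
  # to the origin server instead of through any cache_peer; it never blocks anything
  always_direct allow allow_domains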

Also, you are missing some minor details about the URLs being tested, i.e.:
- whether the browser is automatically adding "www." in front of the domain, or not
- whether curl is setting the HTTP/1.1 Host: header correctly, or not
- whether the browser and the terminal tools were run on the same machine, or not
- whether you have any other access controls affecting the requests (e.g. a browser-type ACL allowing Mozilla/* agents through before these controls)
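
A quick way to rule out the "www." and same-machine points from the terminal is to send curl through the proxy explicitly, so it takes the same path the browser does (substitute your Squid box's address for 192.168.0.1, and one of your listed domains for the example):

  # the request the browser would make, sent via the proxy
  curl -x http://192.168.0.1:3128 http://www.facebook.com/
  # the same request without the "www." prefix, to see whether the dstdomain match differs
  curl -x http://192.168.0.1:3128 http://facebook.com/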

Amos

Thanks Amos.

That makes sense.

I got the browser working by configuring proxy settings in the browser
to port 3128.

I was trying to do transparent interception without changing the browser (otherwise some employees are going to change it back). So, I am still showing my lack of understanding regarding transparent HTTP access. Must I also redirect port 80 to 3128, for example with iptables, so that I don't have to configure the browser?

To not configure the browser *at all*: yes, you have to lie to the browser, with NAT or TPROXY trickery.
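
As a minimal sketch of the NAT approach, assuming Squid runs on the gateway box itself, eth0 is the LAN-facing interface, and the interception http_port is 3128 as in your current config:

  # redirect web traffic arriving from the LAN to Squid's interception port
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128

Squid's own outgoing requests leave through the OUTPUT chain rather than PREROUTING, so they are not caught and looped back by this rule.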

However, there is the WPAD protocol and there are PAC scripts to automatically configure the browser (and other background software!) as needed, without the bother of visiting N machines to do the setup. Details are here: http://wiki.squid-cache.org/SquidFaq/ConfiguringBrowsers

This still requires the "auto-detect network settings" or similar option to be turned on in the browser. The Squid langpack now provides several error page templates explaining to end users how to configure their browser themselves, which saves the admin some work :) (it requires a 3.1 or later proxy though).
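
For reference, a minimal PAC script of the kind that page describes might look like the following; the proxy address is only an example, and the file would be served as wpad.dat / proxy.pac as explained on the wiki:

  function FindProxyForURL(url, host) {
      // send web traffic through Squid; fall back to a direct connection if it is unreachable
      return "PROXY 192.168.0.1:3128; DIRECT";
  }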


Paul

For ref:

squid3 3.0.STABLE19-1
Ubuntu 10.04.2 LTS 64-bit

/etc/squid3/squid.conf
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl SSL_ports port 443
acl Safe_ports port 80		# http
acl Safe_ports port 21		# ftp
acl Safe_ports port 443		# https
acl Safe_ports port 70		# gopher
acl Safe_ports port 210		# wais
acl Safe_ports port 1025-65535	# unregistered ports
acl Safe_ports port 280		# http-mgmt
acl Safe_ports port 488		# gss-http
acl Safe_ports port 591		# filemaker
acl Safe_ports port 777		# multiling http
acl CONNECT method CONNECT
acl denied_domains dstdomain "/etc/squid3/denied_domains.acl"
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny denied_domains
http_access allow localhost
http_access deny all

Um, depending on how you have done your NAT rules, the above http_access rules would either block everything or allow everything, with almost no control in Squid.

Make sure NAT is being performed on the Squid box and the firewall rules are locked down securely to prevent other traffic arriving at the "transparent"-flagged Squid port.
Details on how to do that can be found here:
http://wiki.squid-cache.org/ConfigExamples/Intercept/LinuxDnat
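
A rough sketch of the lockdown idea from that page, using 3129 as an example dedicated interception port (see the next point about keeping 3128 for normal proxy traffic): because the mangle table is traversed before the nat REDIRECT happens, a mangle rule on the interception port only catches clients trying to connect to it directly.

  # drop attempts to connect straight to the interception port;
  # legitimately intercepted packets still show dport 80 at this stage
  iptables -t mangle -A PREROUTING -p tcp --dport 3129 -j DROP
  # the actual interception, done afterwards in the nat table
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3129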


For traffic from configured browsers and your management access you should have a second http_port in Squid without the "transparent" flag set, for normal proxy access.

I recommend using 3128 for the normal proxy traffic, since it is a well-known port (meaning attackers routinely try to scan for access to it, which is a bit dangerous combined with all the security holes added by NAT), and dedicating some randomly chosen port number to the NAT traffic.
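
In squid.conf that would look something like this, with 3129 again only an example for the dedicated interception port (3.0 uses the "transparent" flag; later releases call it "intercept"):

  http_port 3128                 # normal proxy traffic from configured browsers and management tools
  http_port 3129 transparent     # NAT-intercepted traffic only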


When this is working, Squid will see all traffic arriving on the NAT port as coming from the LAN client which made the request, which means the "allow localhost" rule will not match and permit them access. You will need to alter that to "localnet" or similar, with the acceptable LAN subnet ranges permitted.
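
For example, assuming a LAN of 192.168.0.0/24 (adjust to your real ranges):

  acl localnet src 192.168.0.0/24
  # goes where "http_access allow localhost" currently sits, before the final "deny all"
  http_access allow localnet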


NP: if you have some other software before Squid handling the traffic, that is a very different setup and "transparent" is entirely the wrong way.

icp_access deny all
htcp_access deny all
http_port 3128 transparent
hierarchy_stoplist cgi-bin ?
access_log /var/log/squid3/access.log squid
refresh_pattern ^ftp:		1440	20%	10080
refresh_pattern ^gopher:	1440	0%	1440
refresh_pattern (cgi-bin|\?)	0	0%	0

We know a better pattern for this now:
  refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

refresh_pattern .		0	20%	4320
icp_port 3130

You have "deny all" configured for ICP. You can set "icp_port 0" as a safer way to configure the same thing.

error_directory /var/www/squid3
acl data_urls dstdomain "/etc/squid3/always_direct.acl"
always_direct allow data_urls
always_direct deny all

You don't have any cache_peer relays, so the always_direct rules, which prevent such relays from being used, are not useful. You can remove those lines entirely.

/etc/squid3/always_direct.acl
.amazonaws.com
.google.com

/etc/squid3/denied_domains.acl
.evony.com
.myspace.com
.pogo.com
.facebook.com
.twitter.com
.zynga.com



