
Re: Squid applying acls differently when transparent & non transparent proxy


 



On 5/09/2013 2:40 a.m., Andrew Wood wrote:
My Squid proxy, which is used to block access to inappropriate sites and to show a session splash / AUP page to public visitors on the public wifi VLAN subnet, works great when transparently intercepting traffic via NAT/iptables, but it intermittently fails to block things when the client is set to explicitly use the proxy. Does Squid see the source or destination IP differently in this case?


Is it possible to stop Squid from accepting traffic which hasn't been transparently intercepted, so clients can't manually set the proxy to circumvent the ACLs?

If I block non-transparently-intercepted traffic I have a further issue...
I need to allow HTTPS through Squid somehow, and as I understand it there are three ways to do it:

1. Transparently intercept port 443 and bump client-first (man in the middle)

2. Configure clients to explicitly use the proxy for HTTPS via a CONNECT tunnel

3. Transparently intercept port 443 and bump server-first with dynamic certificate generation

Option 1 is ruled out, as visitors will be spooked by the browser warnings.

Option 2 requires the client to be explicitly configured, which with BYOD means a PAC file set via DHCP or DNS. That is problematic with many browsers, and it means Squid will need to accept non-intercepted traffic, which, as mentioned at the start, is causing problems with the ACLs.
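(For reference, the DHCP side of the PAC approach would be something along these lines on an ISC dhcpd server; the option name and URL are just placeholders for wherever we end up hosting the wpad.dat:)

    # define DHCP option 252 (WPAD) and hand the PAC URL to clients
    option wpad-url code 252 = text;
    option wpad-url "http://proxy.example.lan/wpad.dat";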

Option 3 is promising, but how transparent is the dynamic cert generation? Do browsers still need to be configured to accept our gateway as a CA, or is the remote server cert passed through verbatim?


Hope this makes sense. I've experimented with many things, but it's looking increasingly like I'm going to have to block non-intercepted traffic (how?) and go with option 3.

Option #1 is recommended against for far more serious reasons than just client annoyance: when the server certificate is hijacked or just plain broken, client-first bumping erases the client's ability to validate it and to handle or accept it appropriately. All they see is a "working" website - even if working means some attacker (other than your Squid, which is an attacker in its own right) is getting full access to their data.

Regarding option 3 (and option 2 with server-first bumping): the certificate generation can be non-transparent when just the generator is used, or very transparent if you use http://wiki.squid-cache.org/Features/MimicSslServerCert as well.
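
A rough squid.conf sketch of the server-first setup (the port, CA certificate path and helper location are only examples and vary by build/distro):

    # HTTPS interception port with dynamic certificate generation
    https_port 3130 intercept ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/ssl_cert/myCA.pem

    # helper that signs the generated certificates
    sslcrtd_program /usr/lib/squid/ssl_crtd -s /var/lib/squid/ssl_db -M 4MB
    sslcrtd_children 5

    # contact the real server first so its certificate details can be mimicked
    ssl_bump server-first all

Note the generated certificates are still signed by your own CA, so browsers which have not imported that CA will still show an untrusted-issuer warning; mimicking copies the real certificate's details into the generated one, it does not pass the original through.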

You can apply to port 443 the same rules recommended for port 80 protection. Are you using iptables?
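
If so, something along these lines (eth1, the subnet and the ports are only placeholders; adjust to your network):

    # divert guest HTTP and HTTPS to Squid's interception ports
    iptables -t nat -A PREROUTING -i eth1 -s 192.168.10.0/24 -p tcp --dport 80 -j REDIRECT --to-ports 3129
    iptables -t nat -A PREROUTING -i eth1 -s 192.168.10.0/24 -p tcp --dport 443 -j REDIRECT --to-ports 3130

with squid.conf offering only interception ports to the guests, so a manually configured browser has nothing to point at:

    # intercepted HTTP arrives here (matches the REDIRECT rule above)
    http_port 3129 intercept
    # keep any forward-proxy port bound to localhost only
    http_port 127.0.0.1:3128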

Amos





