On Sun, 14 Feb 2010 13:36:46 -0800 (PST), tcygne <tcygne@xxxxxxxxxxxxxxxx> wrote:
> I'm using a transparent squid 2.6 and dansguardian on my network for a
> little webcaching and for filtering content. The filter is working. And
> the cache is caching. Problem is, either the filter or the proxy is
> killing both apt-get (on my linux boxes) and spybot SD updates (on my
> windows boxes). How can I get this traffic to pass through?

How are Squid and DansGuardian chained together? How does that fit with
the firewall interception rules?

> Here is my current squid.conf
>
> http_port 3128
> #2 Lines Below To make transparent and dansguardian work
> http_port 192.168.1.102:3128 transparent

NP: Squid will only open 3128 once. Since the intercepted traffic only
exists between your intercepting firewall and Squid, there is zero reason
to have two "transparent" ports open. I think you should kill the line
above and correct your firewall rules to only send traffic to
192.168.1.102:8080.

> http_port 192.168.1.102:8080 transparent
> redirect_rewrites_host_header off
> cache_replacement_policy lru
> acl all src all
> acl localnet src 192.168.0.0/255.255.0.0

acl localnet src 192.168.0.0/16

> acl localhost src 127.0.0.1
> acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
> acl Safe_ports port 80 81 443 210 119 70 21 1025-65535
> acl SSL_Ports port 443
> acl CONNECT method CONNECT
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_Ports
> http_access allow localnet
> http_access allow localhost
> http_access deny all
> icp_port 0
> refresh_pattern \.jpg$ 3600 50% 60
> refresh_pattern \.gif$ 3600 50% 60
> refresh_pattern \.css$ 3600 50% 60
> refresh_pattern \.js$ 3600 50% 60
> refresh_pattern \.html$ 300 50% 10

NP: just as an aside. With the abundance of sites passing query strings
it is well worth adjusting those file matches to cope with the "?.*" URL
parts. i.e. your .html match becomes:

refresh_pattern \.html(\?.*)?$ 300 50% 10

Amos
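[For reference, a minimal sketch of the single interception rule Amos describes, assuming a Linux firewall using iptables with the LAN on eth0 (the interface name and the use of iptables are assumptions; whether interception should land on DansGuardian or directly on Squid depends on how the two are chained, which is the open question above):]

```
# Assumed iptables NAT rule: redirect plain HTTP from the LAN (eth0)
# to the single intercept port 192.168.1.102:8080.
# Skip traffic originating from the proxy host itself to avoid a loop.
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
  ! -s 192.168.1.102 \
  -j DNAT --to-destination 192.168.1.102:8080
```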
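[Applying the same "?.*" adjustment Amos shows for .html to all of the file matches in the posted config would give the following squid.conf lines (same numbers as the original, only the regexes changed):]

```
# Match the extension with or without a trailing query string
refresh_pattern \.jpg(\?.*)?$  3600 50% 60
refresh_pattern \.gif(\?.*)?$  3600 50% 60
refresh_pattern \.css(\?.*)?$  3600 50% 60
refresh_pattern \.js(\?.*)?$   3600 50% 60
refresh_pattern \.html(\?.*)?$  300 50% 10
```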