Re: CentOS Linux 7 / squid-3.5.20-2.el7.x86_64 / LDAP / ECAP / squidGuard blacklisting

On 21/12/2016 11:24 p.m., bjoern wahl wrote:
> Hello!
> 
> Just for those who would like to have a:
> 
> Squid with LDAP user auth against eDirectory, with an eCAP (watch out! It
> is not ICAP!) virus check, and squidGuard for blacklisting.
> 
> One thing not working for me so far is the redirect to a virus info site
> if eCAP/clamd finds a virus. For now the user is informed that the
> access was "denied" but not why, which I do not like about this setup
> right now. (Still working on this!)

You have missed out the most important part of this tutorial...
   Where to get the eCAP adapter.


> 
> The working squid.conf looks like this:
> 
> =================================================================
> cache_mgr xxx@xxxxxxx
> http_port IPADDRESSOFSERVER:3128

Or just use the default "http_port 3128" config line provided. If you
hard-code IP addresses unnecessarily into configs you just make yourself
do extra work maintaining them.

> acl localnet src 10.0.0.0/8    # RFC1918 possible internal network
> acl localnet src 172.16.0.0/12    # RFC1918 possible internal network
> acl localnet src 192.168.0.0/16    # RFC1918 possible internal network
> acl localnet src fc00::/7       # RFC 4193 local private network range
> acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
> acl SSL_ports port 443
> acl Safe_ports port 80        # http
> acl Safe_ports port 21        # ftp
> acl Safe_ports port 443        # https
> acl Safe_ports port 70        # gopher
> acl Safe_ports port 210        # wais
> acl Safe_ports port 1025-65535    # unregistered ports
> acl Safe_ports port 280        # http-mgmt
> acl Safe_ports port 488        # gss-http
> acl Safe_ports port 591        # filemaker
> acl Safe_ports port 777        # multiling http
> acl CONNECT method CONNECT
> auth_param basic program /usr/lib64/squid/basic_ldap_auth -b o=XXXX -h IPOFEDIRSERVER -D cn=XXX,o=XXX -w PASSWORDOFUSER -f "(&(objectclass=User)(cn=%s))"
> auth_param basic children 5
> auth_param basic realm WHATEVER-YOU-LIKE-TO-TELL-THE-USER
> auth_param basic credentialsttl 2 hours
> ecap_enable on
> loadable_modules /usr/local/lib/ecap_clamav_adapter.so
> ecap_service clamav_service_req reqmod_precache uri=ecap://e-cap.org/ecap/services/clamav?mode=REQMOD bypass=off
> ecap_service clamav_service_resp respmod_precache uri=ecap://e-cap.org/ecap/services/clamav?mode=RESPMOD bypass=on

With bypass=on, if the eCAP service has any error (such as finding a
virus, perhaps?) the eCAP adapter will stop being used for some minutes.

If you want scanners like this to filter all traffic, you need to set
bypass=off and fix anything and everything that causes service outages.
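
For illustration only, a sketch of that response-side service line with
bypass=off, reusing the service name and URI from your config above:

  # same service line as above, but fail hard instead of bypassing on adapter errors
  ecap_service clamav_service_resp respmod_precache uri=ecap://e-cap.org/ecap/services/clamav?mode=RESPMOD bypass=off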


> adaptation_access clamav_service_req allow all
> adaptation_access clamav_service_resp allow all
> acl ediruser proxy_auth REQUIRED
> http_access allow ediruser
> http_access deny all

Sigh. What's the point of the security rules below if you are going to
bypass them completely for all traffic?

The http_access lines above this should all be down ...

> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access allow localhost manager
> http_access deny manager

 ... here.

At which point you will find yourself looking at two "deny all" rules in
a row. Do the obvious to fix that.
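
That is, something along these lines (a sketch reusing your existing ACL
names; double-check it against your own policy):

  # port/method sanity checks first
  http_access deny !Safe_ports
  http_access deny CONNECT !SSL_ports
  # cache manager access
  http_access allow localhost manager
  http_access deny manager
  # then the authenticated users, and one final deny
  http_access allow ediruser
  http_access deny all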

> http_access deny all
> http_port 3128
> coredump_dir /var/spool/squid
> refresh_pattern ^ftp:        1440    20%    10080
> refresh_pattern ^gopher:    1440    0%    1440
> refresh_pattern -i (/cgi-bin/|\?) 0    0%    0
> refresh_pattern .        0    20%    4320
> url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
> url_rewrite_children 15
> url_rewrite_access allow all

Which is the default setting for the "url_rewrite_access" directive.

And BTW, SquidGuard cannot cope with many of the HTTP extension request
methods in modern traffic. You will at the very least have to prevent it
from seeing the CONNECT messages.
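
A rough sketch of doing that, reusing the CONNECT ACL you already define
(untested here, adjust as needed):

  # keep CONNECT requests away from the url_rewrite helper
  url_rewrite_access deny CONNECT
  url_rewrite_access allow all

With the deny line first, CONNECT requests never reach SquidGuard while
everything else still does.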

You should then realize that any user doing HTTPS can easily bypass
your SG URL mangling "control". If you are lucky, right now SG is
"blocking" HTTPS by breaking the Squid transaction on each attempt to
use it.

Otherwise, what this config actually does is cause clients to send
their user credentials in clear text across the network, while possibly
letting any client that can see and re-use another user's credentials
create tunnels through the proxy. A hacker's paradise.

Amos

_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users



