Re: block user agent

________________________________
From: Amos Jeffries <squid3@xxxxxxxxxxxxx>
>
> If you place that after the default "deny CONNECT !SSL_ports", and 
> before your UA checks, AND if you are using ssl_bump on the allowed 
> tunnels then you can relatively safely use "allow CONNECT".
> 
> Just be careful that the CONNECT allowed by that are always handled 
> safely by the ssl_bump rules you have.
>   Meaning that you either bump or terminate traffic you are not sure is 
> okay, splice if you are reasonably sure, etc. It is a balancing act 
> between the "splice as much as possible" and "terminate if unsure of the 
> traffic" advice.


As you say, I placed "allow CONNECT" after the default "deny CONNECT !SSL_ports", and before my UA checks. I'm also using:
ssl_bump stare all
ssl_bump bump all
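
Following your advice on balancing splice against terminate, I suppose those two lines could eventually grow into something like this sketch (step1 and splice_ok are names I'd have to define myself, and the domain is just a placeholder):

# peek at the SNI first, splice only servers I'm reasonably sure of,
# stare/bump everything else
acl step1 at_step SslBump1
acl splice_ok ssl::server_name .example.org
ssl_bump peek step1
ssl_bump splice splice_ok
ssl_bump stare all
ssl_bump bump all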


Considering the following (taken from a previous e-mail):

http_access deny intercepted !localnet
http_access deny interceptedssl !localnet
http_access deny explicit !ORG_all
http_access deny explicit SSL_ports

Would it be "safer", or would it make no difference, to use the following right before the UA checks?

http_access allow CONNECT interceptedssl SSL_ports
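
In full, the ordering I have in mind would look roughly like this (bad_UA stands in for my actual User-Agent ACLs):

http_access deny CONNECT !SSL_ports
http_access allow CONNECT interceptedssl SSL_ports
# UA checks follow; bad_UA is a placeholder name
http_access deny bad_UA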


> Just FYI, you would be a huge amount better off dropping the UA 
> fingerprinting. It's a _really_ simplistic idea about the HTTP world, 
> and it is partly because of that overly-simplistic nature and its 
> dependence on unreliable values that you are having so much more 
> trouble than normal admins face.


I'm aware that UA checks are not fully reliable, but in a big corporate environment they can reveal a lot of interesting information.

I also know that some HTTP clients mimic other clients' user-agent strings or substrings, and can even change them dynamically.

However, in my particular case I could define a custom UA for the corporate browser that is allowed to go through Squid. Firefox, for instance, makes that easy; other browsers such as Edge seem not to.
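
For the record, in Firefox a custom UA is just one pref in about:config or user.js (the UA string below is a made-up example):

// user.js in the Firefox profile
user_pref("general.useragent.override", "CorpBrowser/1.0");
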
In any case, it is not my intention to do this long-term. In the short term, I found out that:

1) Squid logic *can* be understood :-)

2) some hosts may run HTTP clients that should be blocked even though the rest of my Squid rules were not written with them in mind (so I couldn't have known about them). A simple example: we may allow traffic to all Microsoft sites, but some software may not be properly installed or configured. I found that Microsoft Office can connect to an MS site to download or update software through a utility/service called OfficeClickToRun. Of course, generic rules in squid.conf already block unauthorized downloads by MIME type or file type. However, some clients may be whitelisted and allowed to download (e.g. from all MS sites), and in that case I would not necessarily want OfficeClickToRun to update. That could be done by identifying the destination domains, but those may change over time and would in any case require more digging.
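
Since Squid's "browser" ACL type matches the User-Agent request header, a block rule could be as simple as the sketch below, assuming the UA string really contains "OfficeClickToRun" (worth verifying in access.log first):

# match the User-Agent of the Click-to-Run updater (regex is an assumption)
acl officec2r browser -i OfficeClickToRun
http_access deny officec2r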


Adobe software shows similar HTTP client behavior.


Anyway, it's informative to say the least, and can be used to improve the rest of the "standard" Squid ACL access rules.

I was also thinking of using a custom HTTP header such as X-MyCustomHeader: Whatever instead of UA strings. Custom headers can easily be added in Firefox, and other browsers such as Edge also seem to support that.
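
Squid could then match that header with a req_header ACL, roughly like this (the ACL name and header value are made up, and this naturally only works on requests whose headers Squid can inspect, i.e. plain HTTP or bumped HTTPS):

# allow only clients that send the agreed custom header
acl corp_browser req_header X-MyCustomHeader -i ^Whatever$
http_access deny !corp_browser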

All in all, I had a great time fiddling with Squid.
Thank you for your assistance.

Vieri
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users



