
Re: Trouble filtering/denying HTTPS traffic

On 18 October 2012 08:58, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:
> On 18.10.2012 01:03, Marcus Kool wrote:
>>
>> On 10/17/2012 02:18 AM, Amos Jeffries wrote:
>>>
>>> On 17/10/2012 4:08 p.m., Cameron Charles wrote:
>>>>
>>>> Hi all,
>>>>
>>>> I am currently trying to set up basic "url/domain level" filtering on
>>>> HTTPS traffic using an external ACL. I can see clearly in the access
>>>> log that the information I require is there, and the external ACL
>>>> finds and filters it as desired, returning the correct response for
>>>> deny/allow. I can successfully browse HTTPS sites that are allowed;
>>>> however, sites that deny_info should redirect to the error page fail,
>>>> and only a browser-based error is returned. The error is as follows...
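
A rough sketch of the kind of squid.conf setup being described here, for
reference; the helper path, ACL name and error-page URL below are assumed
for illustration only and are not the actual configuration from this thread:

    # hypothetical external ACL helper that answers OK/ERR per destination
    external_acl_type policy_check ttl=60 %DST /usr/local/bin/policy_check_helper
    acl policy_check_acl external policy_check

    # send denied requests to a local information page instead of the stock error
    deny_info http://proxy.example.local/denied.html policy_check_acl
    http_access deny !policy_check_acl
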
>>>
>>>
>>> Two facts you need to be aware of ... (I have re-ordered your listed
>>> facts so the explanation makes sense)
>>>
>>>> For the failed denies the access.log shows the following (here trying
>>>> the HTTPS version of Facebook):
>>>> 1350442727 17/Oct/2012-13:58:47-EST 770 10.0.1.103 TCP_DENIED 307 408 CONNECT www.facebook.com:443 student1-2008 - text/html
>>>> A successful HTTPS browse to an allowed site looks like the following:
>>>> 1350442986 17/Oct/2012-14:03:06-EST 9058 10.0.1.103 TCP_MISS 200 24489 CONNECT play.google.com:443 student1-2008 play.google.com
>>>
>>>
>>> ... 1) These are CONNECT requests. They are not HTTPS, nor do the
>>> resulting tunnels necessarily contain HTTPS requests even if they are
>>> going to port 443.
>>>
>>> They simply tell Squid to open a TCP connection to the named server and
>>> port. Just a TCP connection.
>>>
>>> Since you are using Chrome, it is more likely to send the SPDY protocol
>>> than HTTPS - but either one, or something else entirely, might flow
>>> through that tunnel. What the client and server negotiate between
>>> themselves in the packets going through Squid *after* the CONNECT setup
>>> is outside Squid's control and knowledge.
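
As an illustration of that point, the default squid.conf ships rules that
restrict CONNECT purely by destination port, since the request line is the
only thing Squid can test; roughly:

    acl SSL_ports port 443
    acl CONNECT method CONNECT
    # Squid can only match on the CONNECT line itself (method, host, port);
    # whatever the two ends speak inside the established tunnel is opaque to it.
    http_access deny CONNECT !SSL_ports
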
>>>
>>>
>>>> In Firefox this is all that is displayed:
>>>> Unable to connect - Firefox can't establish a connection to the
>>>> server at www.facebook.com.
>>>> Google Chrome is a little more descriptive, giving this error:
>>>> Error 111 (net::ERR_TUNNEL_CONNECTION_FAILED): Unknown error.
>>>
>>>
>>> ... 2) This is Chrome's way of reporting to the user that something
>>> (anything!) other than complete end-to-end success happened. Friendly, no?
>>>
>>> Squid successfully performed the checks and deny_info redirection
>>> (TCP_DENIED/307 got logged), but Chrome is not handling the 307 status in
>>> any useful way.
>>
>>
>> This is not just Chrome.  All modern versions of MSIE/Chrome/Firefox
>> give an error like "cannot connect" or "proxy refusing connection".
>> It does not matter what HTTP error code Squid sends to the browser
>> since the browsers ignore the returned HTTP-based error messages
>> when sending a CONNECT and simply complain with "cannot connect".
>> FYI: old versions (4.x) of Firefox did accept an HTTP error message
>> but the latest ones do not.
>>
>> Marcus
>
>
> This is NOT about 4xx/5xx errors though. The reasonable arguments against
> displaying 4xx/5xx content (including auth login pages) are all about how
> the browser is to convey that the address bar and the displayed page are
> not related. When browsers do display such content it is trivial to break
> security and fool users who see an attacked site's https:// login URL in
> the address bar while a fake login page from some intermediary posts the
> credentials back to the attacker's server.
>
> This is specifically Chrome ("Error 111 (net::ERR_TUNNEL_CONNECTION_FAILED):
> Unknown error.") responding to a 307 redirection.
>
> What they are supposed to do is present a popup asking the user to
> confirm repeating the request at the new location.
> RFC 2616:
> "If the 307 status code is received in response to a request other
>    than GET or HEAD, the user agent MUST NOT automatically redirect the
>    request unless it can be confirmed by the user, since this might
>    change the conditions under which the request was issued."
>
> Instead they just bark some obscure error code at the user.
>
>
> Cameron, I'm reminded of another method. Have you tried the 303 code
> instead? That one specifically tells the browser to switch to GET and
> fetch the new location as a regular page. Useful for portal login pages
> and error display pages for non-GET requests.
>
> Amos


Amos, I just gave the 303 code a try using the following deny_info line:
        deny_info 303:http://10.0.1.26/policy/denied/request/%A request_policy_check_acl

which the access log notes with the following:
      1350516342 18/Oct/2012-10:25:42-EST 275 10.0.1.103 TCP_DENIED 303 398 CONNECT www.youtube.com:443 student1-2008 - text/html

Same error/outcome in the browser.

Cameron

