Re: RE: Fwd: Fwd: SSLBUMP Issue with SSL websites

On 10.07.2012 00:58, Muhammed Shehata wrote:
Hi All,
    hope you are all doing well.
I followed the replies on the squid-users mailing list about sslbump issues where some websites display inline without images or CSS style sheets,
like https://gmail.com and https://facebook.com.
I do use a broken_sites ACL to exclude them,
however I see that the method is CONNECT for those excluded websites, not GET as for all other bumped sites, and the result is still the same.

All sites, bumped or not, should log a CONNECT request. We are still sorting out exactly how the bumped ones get displayed, though.

For sites where you *prevent* bumping (broken_sites), of course you will never see the GET. The GET is *inside* the encrypted portion of the request, which "ssl_bump deny ..." prevents Squid from decrypting.


1341837646.893 45801 x.x.x.x TCP_MISS/200 62017 CONNECT twitter.com:443 - DIRECT/199.59.150.7
I'm using squid 3.1.19
acl broken_sites dstdomain .twitter.com
acl broken_sites dstdomain .facebook.com
ssl_bump deny broken_sites
ssl_bump allow all
http_port 192.168.0.1:3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=40MB cert=/etc/pki/tls/certs/sslintercept.crt key=/etc/pki/tls/certs/sslintercept.key



Both sites include embedded objects (images, scripts, CSS) served from domains which never appear in the browser address bar. You may need to add those alternative domains into your broken_sites ACL to keep the same behaviour for the main HTML portion of the site and for its other objects.
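As a rough sketch only (the CDN hostnames change over time, so treat the extra domains below as examples and verify the actual hosts your clients fetch from in your own access.log), the ACL could be extended along these lines:

  # extra domains the pages pull objects from (examples, not a definitive list)
  acl broken_sites dstdomain .twitter.com .twimg.com
  acl broken_sites dstdomain .facebook.com .fbcdn.net .akamaihd.net

  ssl_bump deny broken_sites
  ssl_bump allow all

The ordering matters: the deny rule has to come before the allow all, exactly as in your existing config.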


Other issues:

Twitter has some serious cacheability problems, and a number of smaller annoying bugs affecting proxy responses:

  http://redbot.org/?descend=True&uri=https://twitter.com/

 * A gzip-compressed response was sent when it wasn't asked for.
 * The resource doesn't send Vary consistently.
 * The ETag doesn't change between negotiated representations.
 * There was a problem checking for Last-Modified validation support.
 * A ranged request returned another representation.
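If you want to see the negotiation problems in the list above directly rather than through redbot, a quick check from the command line is to compare the response headers with and without gzip being requested (this assumes curl is available; the grep pattern is just for convenience):

  # GET the page and show only the interesting headers
  curl -s -D - -o /dev/null https://twitter.com/ | grep -iE 'content-encoding|vary|etag'

  # same request, explicitly asking for gzip; Content-Encoding should only
  # appear here, and ETag/Vary should differ between the two representations
  curl -s -D - -o /dev/null -H 'Accept-Encoding: gzip' https://twitter.com/ | grep -iE 'content-encoding|vary|etag'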


Facebook has improved noticeably over the last year or so, but the main site still has one major Vary issue that breaks cacheability:

  http://redbot.org/?descend=True&uri=https://www.facebook.com/

 * This response is negotiated, but doesn't have an appropriate Vary header.
 * The response body is different when content negotiation happens.
 * Content negotiation for gzip compression makes the response 19% larger.
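That last point is easy to confirm for yourself; a minimal check, again assuming curl is available, is to compare the downloaded size of the two representations:

  # size of the response when gzip is not requested
  curl -s -o /dev/null -w '%{size_download}\n' https://www.facebook.com/

  # size when gzip is negotiated; counter-intuitively this comes back larger
  curl -s -o /dev/null -w '%{size_download}\n' -H 'Accept-Encoding: gzip' https://www.facebook.com/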


These problems affect both HTTP and HTTPS access to the sites.

Amos


