On 10/02/2014 10:36 a.m., Darren Breeze wrote:
> Hi
>
> I am trying to build a squid that runs ssl_bump and icap to allow me to
> write a keyword filter for the kids that will cover ssl pages.
>
> I have ssl_bump working and my icap client is also happily talking to
> squid.
>
> For testing, I currently have the icap disabled and am just focusing on
> the ssl_bump functions.
>
> I have built squid as follows:
>
> Squid Cache: Version 3.4.3
> configure options: '--prefix=/usr' '--includedir=/usr/include'
> '--datadir=/usr/share' '--bindir=/usr/sbin' '--libexecdir=/usr/lib/squid3'
> '--localstatedir=/var' '--sysconfdir=/etc/squid3' '--disable-snmp'
> '--enable-delay-pools' '--enable-ssl' '--enable-ssl-crtd'
> '--enable-linux-netfilter' '--enable-eui' '--enable-icap-client'
> '--enable-gnuregex'
>
> and set up the conf file as shown at the end of the message.
>
> When I use the proxy and go to http://news.google.com
> everything is fine and it all works OK.
>
> When I go to https://news.google.com
> some elements (mainly graphics and thumbnails) fail to load and I get log
> entries like this:
>
> 2014/02/08 23:27:55.237| url.cc(386) urlParse: urlParse: Split URL
> 'ssl.gstatic.com:443' into proto='', host='ssl.gstatic.com', port='443',
> path=''
> 2014/02/08 23:27:55.237| HttpHeader.cc(407) HttpHeader: init-ing hdr:
> 0x9908538 owner: 2
> 2014/02/08 23:27:55.237| HttpRequest.cc(70) HttpRequest: constructed,
> this=0x9908528 id=56
> 2014/02/08 23:27:55.237| Address.cc(369) lookupHostIP: Given Non-IP
> 'ssl.gstatic.com': Name or service not known
> 2014/02/08 23:27:55.237| HttpHeader.cc(557) parse: parsing hdr: (0x9908538)
> User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; rv:25.0) Gecko/20100101
> Firefox/25.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ssl.gstatic.com
>

This is just Squid parsing the header and determining that the request
URL contains a domain name instead of an IP address. lookupHostIP should
not be causing any actual DNS lookup at that stage, just a check of
whether the OS can parse the text as a numeric IP address.

>
> If I do a lookup on ssl.gstatic.com, my local DNS (dnsmasq on the squid
> host) returns a valid address straight away.
>
> It also fails on:
>
> 2014/02/08 23:27:54.623| peer_select.cc(265) peerSelectDnsPaths: Find IP
> destination for: news.google.com:443' via news.google.com
> 2014/02/08 23:27:54.623| ipcache.cc(647) ipcache_nbgethostbyname:
> ipcache_nbgethostbyname: Name 'news.google.com'.
> 2014/02/08 23:27:54.623| Address.cc(369) lookupHostIP: Given Non-IP
> 'news.google.com': Name or service not known
> 2014/02/08 23:27:54.623| ipcache.cc(695) ipcache_nbgethostbyname:
> ipcache_nbgethostbyname: MISS for 'news.google.com'
>
> And this is odd because it loads the page in the first place.

lookupHostIP determines (again) that the name being looked up is not a
raw IP address. ipcache_nbgethostbyname determines that the domain is
not yet in the DNS results cache.

>
> I have also tried a build with --disable-internal-dns and get the same
> result (but I still use the local dnsmasq).
>
> Does this hit any chords with anyone?

So far the logs shown are the normal process of parsing a request and
looking for stored DNS results. There is no sign of any DNS lookup
actually being performed.

Does your log mention idnsGrokResult? That is the code parsing DNS
lookup results.

Amos
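
[Editorial illustration] The "Given Non-IP ... Name or service not known" lines reflect
the kind of check Amos describes: asking the OS whether the text is already a numeric
IP literal, without sending any DNS query. The sketch below is a minimal, hypothetical
example of such a check using getaddrinfo() with AI_NUMERICHOST; it is not Squid's
actual Address.cc code, and the helper name isIpLiteral is made up for illustration.

#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <cstring>
#include <cstdio>

// Return true when `text` is already a numeric IPv4/IPv6 literal.
// AI_NUMERICHOST makes getaddrinfo() fail for hostnames instead of
// querying DNS, so this never performs an actual lookup.
static bool isIpLiteral(const char *text)
{
    struct addrinfo hints;
    std::memset(&hints, 0, sizeof(hints));
    hints.ai_family = AF_UNSPEC;      // accept IPv4 or IPv6
    hints.ai_flags = AI_NUMERICHOST;  // parse only, no DNS traffic

    struct addrinfo *res = nullptr;
    const int rc = getaddrinfo(text, nullptr, &hints, &res);
    if (res)
        freeaddrinfo(res);
    return rc == 0;
}

int main()
{
    std::printf("%d\n", isIpLiteral("192.0.2.1"));        // 1: numeric IP literal
    std::printf("%d\n", isIpLiteral("ssl.gstatic.com"));  // 0: domain name, no lookup sent
    return 0;
}

For a hostname the call fails with EAI_NONAME, whose error string is the same
"Name or service not known" text seen in the log above.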
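
[Editorial illustration] On the ipcache MISS lines: the contents of Squid's DNS results
cache can normally be inspected through the cache manager interface, for example with
a command along these lines (assuming the squidclient tool was installed with this build):

    squidclient mgr:ipcache

A domain that has genuinely been resolved should appear there with its addresses.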
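
[Editorial illustration] If cache.log never mentions idnsGrokResult, the DNS code may
simply not be logging at a high enough level. Assuming debug section 78 covers the
internal DNS client in this 3.4 build (an assumption; check the debug-sections list
shipped with the Squid source), a squid.conf line along these lines raises only the
DNS code to full verbosity:

    # everything at level 1, DNS lookups (assumed section 78) at level 9
    debug_options ALL,1 78,9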