So, as a test I set SSL_FLAG_DONT_VERIFY_PEER, then modified sslCreateClientContext() in ssl/support.cc to ignore SSL_FLAG_DONT_VERIFY_PEER so that it would always verify. My three test cases then worked just fine: the unsigned cert is refused (albeit with a rather imprecise error message to the user), and the two valid ones work.

Another quick hack: in ssl_verify_cb() I disabled domain checking, so that amazon.com worked despite its insufficient www.amazon.com cert. (Setting sslflags=DONT_VERIFY_DOMAIN on the http_port config line did not work.)

Obviously the above hacking is not the proper solution though. Should I move this conversation to the squid-dev list? What would you suggest as the next step, Amos?

Sean

On 21 December 2011 08:36, Sean Boran <sean@xxxxxxxxx> wrote:
> According to the doc, sslproxy_flags has only one other value,
> NO_DEFAULT_CA. That doesn't seem of much use... it does recognise and
> refuse the expired cert though:
>
> 2011/12/21 07:30:01.269| Self signed certificate:
> /C=--/ST=SomeState/L=SomeCity/O=SomeOrganization/OU=SomeOrganizationalUnit/CN=localhost.localdomain/emailAddress=root@localhost.localdomain
> 2011/12/21 07:30:01.269| confirming SSL error 18
> 2011/12/21 07:30:01.269| fwdNegotiateSSL: Error negotiating SSL
> connection on FD 29: error:14090086:SSL
> routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
> (1/-1/0)
>
> But it also refuses a well-known bank:
> Self signed certificate in certificate chain:
> /1.3.6.1.4.1.311.60.2.1.3=CH/2.5.4.15=Private
> Organization/serialNumber=CH-020.3.906.075-9/C=CH/postalCode=8001/ST=Zuerich/L=Zuerich/streetAddress=Paradeplatz
> 8/O=Credit Suisse Group AG/CN=www.credit-suisse.com
> 2011/12/21 07:32:47.859| confirming SSL error 19
>
> And amazon:
> Unable to get local issuer certificate:
> /C=US/ST=Washington/L=Seattle/O=Amazon.com Inc./CN=www.amazon.com
>
> I had expected it to mean "don't verify the peer if it is in the
> except acl".
> Hmm.
> Digging in the sources, in ssl/support.cc, there are more than two
> constants defined (I had just looked at the docs so far). There is
> no actual VERIFY_PEER though.
>
> Looking at the sources, it seems SSL_FLAG_DONT_VERIFY_PEER must not
> be set if this is to be called:
> SSL_CTX_set_verify(sslContext, SSL_VERIFY_PEER ...);
>
> So I compiled the latest HEAD and tried both VERIFY_CRL and
> VERIFY_CRL_ALL, which would presumably have done some additional CRL
> checking, but the example sites above fail on that too:
>
> Unable to get certificate CRL:
> /C=US/ST=Washington/L=Seattle/O=Amazon.com Inc./CN=www.amazon.com
>
> That looks like it requires the existence of a CRL for each destination?
> I tried setting capath to an empty directory, but it probably requires
> some standard CRLs.
>
> Squid pulls its standard CA list from openssl (/etc/ssl/certs?), but
> shouldn't it just accept an empty CRL list if there are none? Setting
> capath=/etc/ssl/certs and crlfile=/emptyfile does not help.
>
> I must still be missing something...
>
> As regards The Measurement Factory, their website looks interesting,
> but I don't see any relevant references. Is there a discussion or
> ticket on what they are planning, and how can I contact them? Should
> I ask on squid-dev?
>
> Thanks,
>
> Sean
>
> On 21 December 2011 01:02, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:
>> On 21/12/2011 3:34 a.m., Sean Boran wrote:
>>> Hi,
>>>
>>> sslbump allows me to intercept SSL connections and run an AV check
>>> on them. It generates a cert for the target domain (via sslcrtd),
>>> so that the user's browser sees a server cert signed by the proxy.
>>>
>>> If the target domain has a certificate that is expired, or is not
>>> signed by a recognised CA, it's important that the lack of trust is
>>> communicated to the end user.
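The flag and path attempts above correspond to squid.conf directives along these lines (a hypothetical sketch of the configuration being tried, not a known-working setup; the paths are the examples from this thread):

```
# Ask Squid to check CRLs in addition to the chain:
sslproxy_flags VERIFY_CRL
# CA directory and (empty) CRL file, as tried above:
sslproxy_capath /etc/ssl/certs
sslproxy_crlfile /emptyfile
```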
>>>
>>> Example: connecting directly (not via a proxy) to
>>> https://wiki.squid-cache.org, the certificate presented expired 2
>>> years ago and is not signed by a known CA.
>>> Next, connecting via an sslbump proxy (v3.2.0.14), the proxy creates
>>> a valid cert for wiki.squid-cache.org, and in the user's browser it
>>> looks like wiki.squid-cache.org has a valid cert signed by the proxy.
>>>
>>> So my question is:
>>> What ssl_bump settings would allow the proxy to handle such
>>> destinations with expired or untrusted certs by, for example:
>>> a) Not bumping the connection but piping it through to the user
>>> unchanged, so the user's browser notices the invalid certs?
>>> b) Refusing the connection with a message to the user, if the
>>> destination is not on an allowed ACL of exceptions?
>>
>> Pretty much. The Measurement Factory has a project underway to fix
>> this limitation. Please contact Alex about sponsoring their work to
>> make it happen faster, or to get access to the experimental code.
>>
>>> Looking at squid.conf, there are sslproxy_flags and sslproxy_cert_error:
>>> # TAG: sslproxy_flags
>>> #     DONT_VERIFY_PEER    Accept certificates that fail verification.
>>> #     NO_DEFAULT_CA       Don't use the default CA list built in
>>> #                         to OpenSSL.
>>> # TAG: sslproxy_cert_error
>>> #     Use this ACL to bypass server certificate validation errors.
>>>
>>> So, would the following config then implement scenario b) above?
>>>
>>> # Verify destinations: yes, but allow exceptions
>>> sslproxy_flags DONT_VERIFY_PEER
>>> #sslproxy_flags none
>>> # ignore cert errors for certain sites
>>> acl TrustedName url_regex ^https://badcerts.example.com/
>>> sslproxy_cert_error allow TrustedName
>>> sslproxy_cert_error deny all
>>>
>>> ==> But then, why does it not throw an error when connecting to
>>> https://wiki.squid-cache.org ?
>>
>> You configured it not to verify, therefore the error is not noticed
>> and cannot trigger any action.
>>
>> As for why no output is displayed, you will have to ask the OpenSSL
>> people. There are a few places in their API like this where errors
>> are silently dropped and seemingly no way is provided to check for
>> them externally (i.e. from Squid).
>>
>> Amos
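[Editor's note: following Amos's point that DONT_VERIFY_PEER prevents errors from ever being noticed, scenario b) needs verification left enabled so that sslproxy_cert_error can act on the errors. A hypothetical rework of the config from the thread, untested:]

```
# Verify destinations (do NOT set DONT_VERIFY_PEER, otherwise errors
# are never raised and sslproxy_cert_error has nothing to act on):
sslproxy_flags none
# Bypass cert errors only for an explicit exception list:
acl TrustedName url_regex ^https://badcerts.example.com/
sslproxy_cert_error allow TrustedName
sslproxy_cert_error deny all
```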