
Stuck Filtering HTTPS URL




I am trying to permit access to a single HTTPS path, for example https://www.example.com/world, without providing access to the rest of the site.

I have a basic configuration with the pertinent items as follows:

http_port 3128 ssl-bump \
  cert=/etc/squid/ssl_certs.d/myCA.pem \
  generate-host-certificates=on \
  dynamic_cert_mem_cache_size=16MB \
  options=SINGLE_DH_USE,SINGLE_ECDH_USE,NO_SSLv3,CIPHER_SERVER_PREFERENCE \
  cipher=ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA:ECDHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES128-SHA256:DHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA:!ECDHE-RSA-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!AES256-GCM-SHA384:!AES128-GCM-SHA256:!AES256-SHA256:!AES128-SHA256:!AES256-SHA:!AES128-SHA:!DES-CBC3-SHA:HIGH:!aNULL:!eNULL:!EXPORT:!DES:!3DES:!MD5:!PSK:!RC4 \
  sslflags=NO_SESSION_REUSE \
  tls-dh=prime256v1:/etc/squid/dhparams.d/dhparam.pem

sslcrtd_program /usr/lib64/squid/security_file_certgen -s /var/cache/squid/ssl_db -M 16MB
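
(In case it matters: the helper's certificate database was initialized before starting Squid. If I recall the flags correctly it was something along these lines, so treat the exact invocation as approximate:

/usr/lib64/squid/security_file_certgen -c -s /var/cache/squid/ssl_db -M 16MB)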

acl step1 at_step SslBump1

acl step2 at_step SslBump2

acl step3 at_step SslBump3

ssl_bump peek step1

ssl_bump bump  all
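
(My understanding of the staging, which is an assumption on my part, is that this pair makes Squid peek at the TLS client hello during step 1 to learn the SNI and then bump at step 2, i.e. it should behave like the more explicit form:

ssl_bump peek step1
ssl_bump bump step2)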


acl DSTDOMAIN_ALLOW dstdomain www.example.com

acl URLPATH_ALLOW urlpath_regex -i ^/world/*

http_access allow SrcSubnet DSTDOMAIN_ALLOW URLPATH_ALLOW

note ruleid Rule-10-GCP.conf  SrcSubnet DSTDOMAIN_ALLOW URLPATH_ALLOW

note ruletype ALLOW  SrcSubnet DSTDOMAIN_ALLOW URLPATH_ALLOW
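
Comparing this with the trace below, my working theory (possibly wrong) is that the allow line above is first evaluated against the faked CONNECT to www.example.com:443 during SslBump, where no URL path exists yet, so URLPATH_ALLOW can only come back indeterminate (the "= -1" in the log) and the rule never matches. If that is right, would the correct shape be to allow the CONNECT on source and destination domain alone, and apply the path check only to the decrypted requests? An untested sketch of what I mean, with SrcSubnet defined elsewhere in my config as above:

acl CONNECT method CONNECT
http_access allow SrcSubnet DSTDOMAIN_ALLOW CONNECT
http_access allow SrcSubnet DSTDOMAIN_ALLOW URLPATH_ALLOW
http_access deny all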


Dumping the log in debug mode, it appears that Squid can obtain the path but then fails the connection. If I am reading it properly, the SSL connection fails after decryption without passing through any ACLs. I've tried researching the "delated error" message from the log. Is there a better way to troubleshoot this error, or should I not expect to be able to filter a full URL via HTTPS?
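
For reference, the trace below was captured with raised debug levels roughly like the following (reconstructed from memory, so the exact section list is an assumption on my part):

debug_options ALL,1 11,5 23,3 28,5 33,5 83,5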


2019/08/12 10:40:29.053 kid1| 23,3| Uri.cc(371) parse: Split URL 'www.example.com:443' into proto='', host='www.example.com', port='443', path=''

2019/08/12 10:40:29.055 kid1| 28,5| Acl.cc(124) matches: checking DSTDOMAIN_ALLOW

2019/08/12 10:40:29.055 kid1| 28,3| DomainData.cc(110) match: aclMatchDomainList: checking 'www.example.com'

2019/08/12 10:40:29.055 kid1| 28,3| DomainData.cc(115) match: aclMatchDomainList: 'www.example.com' found

2019/08/12 10:40:29.055 kid1| 28,3| Acl.cc(151) matches: checked: DSTDOMAIN_ALLOW_1 = 1

2019/08/12 10:40:29.055 kid1| 28,5| Acl.cc(124) matches: checking URLPATH_ALLOW

2019/08/12 10:40:29.055 kid1| 28,3| Acl.cc(151) matches: checked: URLPATH_ALLOW = -1   

2019/08/12 10:40:29.055 kid1| 33,4| ServerBump.cc(26) ServerBump: will peek at www.example.com:443

2019/08/12 10:40:29.062 kid1| 83,3| Handshake.cc(497) parseSniExtension: host_name=www.example.com

….

2019/08/12 10:40:29.062 kid1| 28,3| DomainData.cc(115) match: aclMatchDomainList: 'www.example.com' found

2019/08/12 10:40:29.062 kid1| 28,3| Acl.cc(151) matches: checked: DSTDOMAIN_ALLOW = 1

2019/08/12 10:40:29.062 kid1| 28,5| Acl.cc(124) matches: checking URLPATH_ALLOW

2019/08/12 10:40:29.062 kid1| 28,3| Acl.cc(151) matches: checked: URLPATH_ALLOW = -1

2019/08/12 10:40:29.064 kid1| 33,5| client_side.cc(3023) getSslContextStart: SSL crtd request: new_certificate 2999 host=www.example.com

2019/08/12 10:40:29.065 kid1| 33,5| client_side.cc(2860) sslCrtdHandleReply: Certificate for www.example.com was successfully recieved from ssl_crtd

2019/08/12 10:40:29.081 kid1| 11,2| client_side.cc(1323) parseHttpRequest: HTTP Client REQUEST:

---------

GET /world HTTP/1.1

Host: www.example.com

User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:68.0) Gecko/20100101 Firefox/68.0

Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

Accept-Language: en-US,en;q=0.5

Accept-Encoding: gzip, deflate, br

DNT: 1

Connection: keep-alive

2019/08/12 10:40:29.081 kid1| 23,3| Uri.cc(371) parse: Split URL 'https://www.example.com/world' into proto='https', host='www.example.com', port='443', path='/world'

2019/08/12 10:40:29.081 kid1| 33,5| Http1Server.cc(188) buildHttpRequest: normalize 1 Host header using www.example.com

2019/08/12 10:40:29.081 kid1| 33,3| client_side.cc(641) clientSetKeepaliveFlag: http_ver = HTTP/1.1

2019/08/12 10:40:29.081 kid1| 33,3| client_side.cc(642) clientSetKeepaliveFlag: method = GET

2019/08/12 10:40:29.081 kid1| 33,4| client_side.cc(1471) quitAfterError: Will close after error: local=10.200.200.200:3128 remote=10.1.2.3:64913 FD 13 flags=1

2019/08/12 10:40:29.081 kid1| 33,5| client_side.cc(1492) serveDelayedError: Responding with delated error for https://www.example.com/world

2019/08/12 10:40:29.081 kid1| 11,5| HttpRequest.cc(459) detailError: current error details: 1/0

2019/08/12 10:40:29.081 kid1| 33,5| Stream.cc(109) pullData: 0 written 0 into local=10.200.200.200:3128 remote=10.1.2.3:64913 FD 13 flags=1

2019/08/12 10:40:29.081 kid1| 33,5| Stream.cc(133) getNextRangeOffset: range: 0; http offset 0; reply 0

2019/08/12 10:40:29.081 kid1| 33,5| store_client.cc(319) doCopy: store_client::doCopy: co: 0, hi: 3760

2019/08/12 10:40:29.081 kid1| 33,3| Pipeline.cc(35) front: Pipeline 0x2c6cb40 front 0x2c71fc0*4

2019/08/12 10:40:29.081 kid1| 33,3| Pipeline.cc(35) front: Pipeline 0x2c6cb40 front 0x2c71fc0*4

2019/08/12 10:40:29.081 kid1| 11,2| Stream.cc(266) sendStartOfMessage: HTTP Client local=10.193.161.197:3128 remote=10.63.200.153:64913 FD 13 flags=1

2019/08/12 10:40:29.081 kid1| 11,2| Stream.cc(267) sendStartOfMessage: HTTP Client REPLY:

---------

HTTP/1.1 403 Forbidden

 
