Is there any way to cache or forward https requests to an http proxy using Squid?

I currently have Squid set up with a self-signed certificate to MITM and
cache HTTPS requests, and this works. If an item is not in the cache, I want
to request it from an online proxy like Crawlera. Unfortunately, Crawlera
only offers an HTTP endpoint. When I try to forward to this endpoint,
everything works for HTTP, but for HTTPS I receive the error:

Handshake with SSL server failed: error:140770FC:SSL
routines:SSL23_GET_SERVER_HELLO:unknown protocol
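
(As I understand it, that OpenSSL error usually means the far end answered
in plaintext where a TLS ServerHello was expected. A quick sanity check,
assuming Crawlera's port 8010 really only speaks plain HTTP, should fail the
same way:

openssl s_client -connect proxy.crawlera.com:8010

A plain-HTTP listener never sends a ServerHello, so s_client reports the
same "unknown protocol" failure.)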

I'm using Squid 4.2. Is there a way to configure Squid so that I can specify
it as the proxy for an HTTPS request and have it either answer from its
cache or forward misses to an HTTP proxy (one that supports CONNECT)? If at
some point the traffic is transmitted in plain text, it doesn't matter at
all for this application.

The following is my configuration for Squid:

http_port 3128 ssl-bump \
  cert=/apps/server_crt.pem key=/apps/server_key.pem \
  generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
sslcrtd_program /apps/squid/libexec/security_file_certgen -s /apps/squid/var/lib/ssl_db -M 4MB
sslcrtd_children 8 startup=1 idle=1
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all
acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7       # RFC 4193 local private network range
acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl Safe_ports port 1025-65535  # unregistered ports
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
coredump_dir /apps/squid/var/cache
maximum_object_size 10 GB
cache_dir ufs /apps/squid/var/cache/squid 100 16 256
cache_mem 256 MB
maximum_object_size_in_memory 512 KB
cache_replacement_policy heap LFUDA
range_offset_limit -1
quick_abort_min -1 KB
offline_mode on
http_access allow localnet
http_access allow localhost
http_access deny all
refresh_pattern . 525600 100% 525600 ignore-reload ignore-no-store ignore-private ignore-auth ignore-must-revalidate store-stale

cache_peer proxy.crawlera.com parent 8010 0 ssl login=APIKEY:
never_direct allow all
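
(A guess I still need to rule out: the ssl flag on that cache_peer line
tells Squid to wrap its own connection to the peer in TLS, and "unknown
protocol" is what OpenSSL reports when a plaintext service answers a TLS
handshake. A sketch of the variant without the flag, assuming Crawlera's
8010 is plain HTTP:

cache_peer proxy.crawlera.com parent 8010 0 login=APIKEY:
never_direct allow all

I don't know yet whether Squid will then forward the bumped HTTPS requests
through that peer, but it should at least remove the failed TLS handshake
with the peer itself.)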

Update


If I change the ssl_bump directives above to the following:

acl step1 at_step SslBump1
acl step2 at_step SslBump2
acl step3 at_step SslBump3

ssl_bump stare step2
ssl_bump bump step3

An HTTPS request will tunnel all the way through both proxies to the target
and correctly return the response to the caller, but Squid no longer has
MITM access to cache the results, so subsequent requests for the same
resource CONNECT through to Crawlera again. HTTP, on the other hand, goes
through both proxies if the resource is not in the cache; otherwise it is
returned from the cache.

This is still not the solution I'm looking for, though; I would like to
cache HTTPS as well.
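
(For what it's worth, the way I'm judging hit/tunnel behaviour is the status
codes in access.log; assuming the default log location under my install
prefix, cached responses show up as TCP_HIT or TCP_MEM_HIT while tunnelled
CONNECTs show TCP_TUNNEL:

grep -E 'TCP_(MEM_)?HIT|TCP_TUNNEL' /apps/squid/var/logs/access.log

With the stare/bump variant above I'd expect the HTTPS requests to log only
TCP_TUNNEL entries.)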


