Hey Usama,

There are still some missing details about the system. If you provide the OS and Squid details, I might be able to provide a script that will pull most of the relevant details from the system. I don't know about this specific issue yet; it looks like an SSL-related issue, and it might not even be related to Squid. (@Alex or @Christos might know better than me.)

All the best,
----
Eliezer Croitoru
NgTech, Tech Support
Mobile: +972-5-28704261
Email: ngtech1ltd@xxxxxxxxx

From: squid-users <squid-users-bounces@xxxxxxxxxxxxxxxxxxxxx> On Behalf Of Usama Mehboob
Sent: Thursday, February 24, 2022 23:45
To: squid-users@xxxxxxxxxxxxxxxxxxxxx
Subject: Getting SSL Connection Errors

Hi, I have Squid running on a Linux box (about 16 GB RAM and 4 CPUs). It runs fine for the most part, but when I launch multiple jobs that connect to the Salesforce Bulk API, connections are sometimes dropped. It's not predictable and happens only when Squid is under heavy load. Can anyone shed some light on this? What can I do? Is it a file descriptor issue? I see only these error messages in the cache logs:

```
PeerConnector.cc(639) handleNegotiateError: Error (error:04091068:rsa routines:INT_RSA_VERIFY:bad signature) but, hold write on SSL connection on FD 109
```

----------------Config file ----------------
visible_hostname squid
#
# Recommended minimum configuration:
#
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7       # RFC 4193 local private network range
acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80          # http
###acl Safe_ports port 21      # ftp testing after blocking itp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
#http_access allow CONNECT SSL_ports
# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
# And finally deny all other access to this proxy
# Squid normally listens to port 3128
#http_port 3128
http_port 3129 intercept
https_port 3130 cert=/etc/squid/ssl/squid.pem ssl-bump intercept
http_access allow SSL_ports #-- this allows every https website
acl step1 at_step SslBump1
acl step2 at_step SslBump2
acl step3 at_step SslBump3
ssl_bump peek step1 all
# Deny requests to proxy instance metadata
acl instance_metadata dst 169.254.169.254
http_access deny instance_metadata
# Filter HTTP Only requests based on the whitelist
#acl allowed_http_only dstdomain .veevasourcedev.com .google.com .pypi.org .youtube.com
#acl allowed_http_only dstdomain .amazonaws.com
#acl allowed_http_only dstdomain .veevanetwork.com .veevacrm.com .veevacrmdi.com .veeva.com .veevavault.com .vaultdev.com .veevacrmqa.com
#acl allowed_http_only dstdomain .documentforce.com .sforce.com .force.com .forceusercontent.com .force-user-content.com .lightning.com .salesforce.com .salesforceliveagent.com .salesforce-communities.com .salesforce-experience.com .salesforce-hub.com .salesforce-scrt.com .salesforce-sites.com .site.com .sfdcopens.com .sfdc.sh .trailblazer.me .trailhead.com .visualforce.com
# Filter HTTPS requests based on the whitelist
acl allowed_https_sites ssl::server_name .pypi.org .pythonhosted.org .tfhub.dev .gstatic.com .googleapis.com
acl allowed_https_sites ssl::server_name .amazonaws.com
acl allowed_https_sites ssl::server_name .documentforce.com .sforce.com .force.com .forceusercontent.com .force-user-content.com .lightning.com .salesforce.com .salesforceliveagent.com .salesforce-communities.com .salesforce-experience.com .salesforce-hub.com .salesforce-scrt.com .salesforce-sites.com .site.com .sfdcopens.com .sfdc.sh .trailblazer.me .trailhead.com .visualforce.com
ssl_bump peek step2 allowed_https_sites
ssl_bump splice step3 allowed_https_sites
ssl_bump terminate step2 all
connect_timeout 60 minute
read_timeout 60 minute
write_timeout 60 minute
request_timeout 60 minute
## http filtering ###
#http_access allow localnet allowed_http_only
#http_access allow localhost allowed_http_only
http_access allow localnet allowed_https_sites
http_access allow localhost allowed_https_sites
# And finally deny all other access to this proxy
http_access deny all
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320
thanks
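[Editor's note: to answer both the "is it a file descriptor issue?" question and the request above for OS and Squid details, here is a minimal diagnostic sketch. It assumes a typical Linux host; the output headers and the 4096-style limits it reports are whatever the local system provides, nothing here is specific to this incident.]

```shell
#!/bin/sh
# Diagnostic sketch: gather OS/Squid details and file-descriptor usage.

echo "## OS details"
uname -a
[ -f /etc/os-release ] && cat /etc/os-release

echo "## Squid version and build options"
if command -v squid >/dev/null 2>&1; then
    squid -v
else
    echo "squid binary not in PATH"
fi

echo "## File-descriptor limits of the running squid process"
PID=$(pidof squid 2>/dev/null | awk '{print $1}')
if [ -n "$PID" ]; then
    grep 'Max open files' "/proc/$PID/limits"
    echo "FDs currently in use: $(ls "/proc/$PID/fd" 2>/dev/null | wc -l)"
else
    echo "squid process not found"
fi

echo "## Shell soft limit (open files)"
ulimit -n
```

If the in-use count approaches the "Max open files" value under load, descriptor exhaustion is a plausible cause of the dropped connections; the `error:04091068...bad signature` line itself, though, is an OpenSSL verification error, so both angles are worth checking.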
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
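[Editor's note: if the diagnostics above do point at descriptor exhaustion, a common remedy is to raise Squid's ceiling in squid.conf. This is a sketch only; the value 4096 is an illustrative assumption, not a number tuned for this workload.]

```
# Raise Squid's file-descriptor ceiling (directive available in Squid 3.5+).
# The OS limit must also allow it (ulimit -n, or LimitNOFILE= in the
# systemd unit); Squid cannot exceed what the kernel grants the process.
max_filedescriptors 4096
```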