Squid Error Then Can't Connect To Web Site

I have a Squid proxy configured as a reverse proxy
that serves about 10 web sites via SSL.
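For context, the relevant part of the setup looks roughly like this (a hypothetical sketch in Squid 2.6-style syntax; the hostnames, paths, and IP are placeholders, not my real config, with the peer name matching what shows up in access.log):

```
# Sketch only -- certificate paths, domain, and backend IP are placeholders.
https_port 443 accel cert=/etc/squid/cert.pem key=/etc/squid/key.pem defaultsite=www.domain.com

# SSL origin server behind the reverse proxy; "website_ssl" is the peer
# name that appears as FIRST_UP_PARENT/website_ssl in access.log.
cache_peer 1.2.3.4 parent 443 0 no-query originserver ssl name=website_ssl

# Route only this site's traffic to that peer.
acl site_domain dstdomain www.domain.com
cache_peer_access website_ssl allow site_domain
cache_peer_access website_ssl deny all
```

(The backend handshake can also be checked independently from the Squid host with `openssl s_client -connect 1.2.3.4:443`.)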

One of the web sites randomly becomes unavailable and
the only way to make it work again is to restart the
squid process.

The following error message is repeated many times in
cache.log when the problem occurs:

2007/09/28 10:06:09| fwdNegotiateSSL: Error
negotiating SSL connection on FD 148:
error:00000000:lib(0):func(0):reason(0) (5/-1/104)
2007/09/28 10:06:09| TCP connection to 1.2.3.4/443
failed

When I check the backend web server directly, it is
alive, so technically the cache_peer is available, but
Squid doesn't seem to think so.

The other odd item is in the access.log. Normally I
would expect to see a "TCP_MISS/200" with the matching
cache_peer name at the end of the request line:

1190984166.734     32 11.12.33.44 TCP_MISS/200 280 GET
https://www.domain.com/c3.gif -
FIRST_UP_PARENT/website_ssl -

But when the problem happens, I get "TCP_MISS/503" and
NO matching cache_peer:

1190983242.734     32 11.12.33.44 TCP_MISS/503 280 GET
https://www.domain.com/c3.gif - NONE

While this is happening, all the other web sites are
fine; the problem seems specific to this one site.

I restart the squid process, and the site resumes
working.

Any ideas?

Thanks.
