Drew Wrobel wrote:
I have a Squid 2.7.STABLE7 caching server in front of my Apache web servers. Every now and then, when I hit the main page of the company's web site, I get an Apache 403 Forbidden page. If I hit refresh, I get right in. The following is from the Squid access log:

www.company.com 172.21.84.170 - - [14/Jan/2010:14:28:26 -0500] "GET http://172.21.100.66/ HTTP/1.1" 403 584 "-" "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7" TCP_NEGATIVE_HIT:NONE
www.company.com 172.21.84.170 - - [14/Jan/2010:14:28:36 -0500] "GET http://172.21.100.66/ HTTP/1.1" 200 27034 "-" "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7" TCP_MISS:ROUNDROBIN_PARENT

Did Apache generate the 403 return code, or Squid?
Apache generated it. "TCP_NEGATIVE_HIT" means Squid served a saved error reply out of its cache.
Check that you have negative_ttl set to 0 seconds. If it is already 0, then the problem lies in the HTTP headers on the 403 reply, which tell Squid how long that error may be stored.
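[A minimal squid.conf sketch of that check. negative_ttl is a real Squid directive; in Squid 2.x it defaults to a non-zero value (5 minutes), so an explicit 0 is needed to stop error replies being cached at all:

    # squid.conf
    # Do not cache negative replies (e.g. 4xx errors), so a transient
    # 403 from Apache is never replayed later as a TCP_NEGATIVE_HIT.
    negative_ttl 0 seconds

You can also inspect the Cache-Control and Expires headers Apache puts on the 403 itself, since those can override how long Squid keeps it.]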
I know a change was made to the web servers to address problems with rogue search engine robots/spiders. An entry was put into .htaccess to block any visitors whose User-Agent string is blank. Could that be causing the problem? If I'm reading the log correctly, the User-Agent string is defined, but the Referer is blank. Here is a line from the Squid log where BOTH the Referer and the User-Agent string are blank (though the rule only checks for a blank User-Agent string):

www.company.com 68.142.243.85 - - [14/Jan/2010:05:47:24 -0500] "GET http://172.21.100.66/ HTTP/1.1" 403 576 "-" "-" TCP_MISS:ROUNDROBIN_PARENT
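[For reference, a sketch of the kind of .htaccess rule described, assuming it uses mod_rewrite; the actual rule on these servers may differ:

    # .htaccess (illustrative): return 403 when the User-Agent header
    # is empty or absent. The Referer header is not consulted.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^$
    RewriteRule .* - [F]

A rule like this explains the second log line above, where both fields are "-", but not the first line in the earlier log excerpt, where a Firefox User-Agent is present: there the 403 came out of Squid's negative cache rather than from a fresh Apache decision. Something like curl -A "" http://172.21.100.66/ should reproduce the blank-User-Agent case for testing.]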
This "TCP_MISS" is apache generating the page to start with. Amos -- Please be using Current Stable Squid 2.7.STABLE7 or 3.0.STABLE21 Current Beta Squid 3.1.0.15