
Re: remote 403 error through squid

On Sun, 11 Sep 2005, Merton Campbell Crockett wrote:

> Is there an inverse max-age dependency?  The behaviour of the VATLogic and
> Murfreesboro web sites occurs regardless of max-age.  Both sites return a
> 403 (Forbidden) status when the URL references DocumentRoot.

For me, VATLogic shows the exact same symptoms. With a high max-age, a 403 is returned; with a low max-age, with no max-age at all, or when combined with Pragma: no-cache, it works fine.

> There may be an inverse max-age dependency, but in these two instances I
> suspect that it is a "red herring".  There is a simpler answer.  Access is
> being denied because the request appears to be attempting to retrieve a
> directory listing.

I have no problem reaching this site directly, only via Squid. It also works fine via Squid if the refresh_pattern workaround mentioned earlier is used.


Works:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\n" /

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=0\n" /

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=20000\n" /

Fails:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" /

Works:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=20000\nPragma: no-cache\n" /



If you try different max-age values, you will find that the limit increases second by second.
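
For anyone who wants to reproduce this, a quick shell loop along these lines should do it. The max-age values below are arbitrary samples, and the head -1 simply trims the response squidclient prints down to the status line:

for age in 100000 150000 175000 190000 200000; do
    echo "max-age=$age:"
    squidclient -h www.vatlogic.com -p 80 \
        -H "Host: www.vatlogic.com\nCache-Control: max-age=$age\n" / | head -1
done

The value where the status flips from 200 to 403 is the limit; run the loop again a little later and it has moved.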


On all three sites the pattern is the same, and it only applies to "directory" URLs. If the URL is modified in any manner (query parameters, or even ending in a double //), the problem is not triggered.
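
For example, the max-age=200000 request that fails above goes through as soon as the URL is modified. Same style as the commands above; the query parameter name here is just a dummy picked for illustration:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" "/?dummy=1"

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" //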


Regards
Henrik
