
cache-control header causes site to fail

Hi,

I'm trying to access the site http://www.kinderopvang-plein.nl

The site works fine when I bypass Squid, but when I go through it I get a
403 Forbidden error.

I've tracked this down to the Cache-Control header that Squid adds (and,
more precisely, the value it specifies) by running the following wget
commands...

This command fails...
wget --spider --header='Cache-Control: max-age=259200' www.kinderopvang-plein.nl

This command works...
wget --spider --header='Cache-Control: max-age=1000' www.kinderopvang-plein.nl

I take it this is a problem with the web server not coping with the
higher max-age value, but is there a way to configure Squid so that it
sends a smaller value by default?
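For what it's worth, one workaround I've been considering (untested, and the directive names vary by Squid version; this is the Squid 2.6-style syntax, while Squid 3 uses request_header_access / request_header_replace) is to strip the header Squid would send and replace it with a fixed smaller value:

    # Drop the outgoing Cache-Control header and substitute a lower max-age.
    # Note: this applies to ALL requests, which may affect caching behaviour.
    header_access Cache-Control deny all
    header_replace Cache-Control max-age=1000

Also, 259200 seconds happens to equal 4320 minutes, which matches the max value in Squid's default "refresh_pattern . 0 20% 4320" line, so lowering that might change the value too, though I'm not certain that's where the header comes from.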

Thanks in advance,

Graeme
