RE: cache-control header causes site to fail

Ah, I've found another thread with the same problem...

http://www.squid-cache.org/mail-archive/squid-users/200509/0204.html

Is the refresh pattern still the only way to fix this issue?
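For reference, here is a sketch of what I think the refresh_pattern workaround from that thread would look like (untested, and assuming squid derives the outgoing Cache-Control: max-age from the matching refresh_pattern's max value, which is given in minutes -- the default of 4320 minutes is exactly the 259200 seconds we're seeing):

    # Site-specific rule first; 16 minutes = 960 seconds, below the
    # max-age=1000 that worked with wget.
    refresh_pattern -i kinderopvang-plein\.nl 0 20% 16
    # Catch-all default (4320 minutes = 259200 seconds)
    refresh_pattern . 0 20% 4320

The order matters, since squid uses the first refresh_pattern that matches.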

Thanks,

Graeme

> -----Original Message-----
> From: Graeme Bisset [mailto:gbisset@xxxxxxxxx]
> Sent: 17 November 2005 13:47
> To: squid-users@xxxxxxxxxxxxxxx
> Subject:  cache-control header causes site to fail
> 
> Hi,
> 
> I'm trying to access the site http://www.kinderopvang-plein.nl
> 
> It works without going through squid, but when I do I get a 403
> Forbidden error.
> 
> I've tracked this down to the Cache-Control header that squid adds (and
> more precisely the value it specifies) by running the following wget
> commands...
> 
> This command fails...
> wget --spider --header='Cache-Control: max-age=259200'
> www.kinderopvang-plein.nl
> 
> This command works...
> wget --spider --header='Cache-Control: max-age=1000'
> www.kinderopvang-plein.nl
> 
> I take it this is a problem with the web server not coping with higher
> max-age values, but is there a way to configure squid so that it uses
> smaller values by default?
> 
> Thanks in advance,
> 
> Graeme
> 

