
404s not cached despite being cacheable


Hi.

I have this URL: http://bmtu.livedefinition.com/context/1001/homepage
which returns a 404, but the response carries max-age and Expires headers.
I've checked the headers with REDbot, and it seems to agree that the page
is cacheable:
http://bmtu.livedefinition.com/context/1001/homepage

Still, I only get MISSes from Squid:
ubuntu@ip-10-0-0-144:~$ curl -vv -x localhost:3128
http://bmtu.livedefinition.com/context/1001
* About to connect() to proxy localhost port 3128 (#0)
*   Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 3128 (#0)
> GET http://bmtu.livedefinition.com/context/1001 HTTP/1.1
> User-Agent: curl/7.21.3 (i686-pc-linux-gnu) libcurl/7.21.3 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: bmtu.livedefinition.com
> Accept: */*
> Proxy-Connection: Keep-Alive
>
* HTTP 1.0, assume close after body
< HTTP/1.0 404 Not Found
< Cache-Control: max-age=3600
< Date: Sat, 24 Mar 2012 20:47:13 GMT
< Expires: Sat, 24 Mar 2012 21:47:13 GMT
< Server: Jetty(6.1.25)
< Content-Length: 27
< X-Cache: MISS from localhost
< X-Cache-Lookup: MISS from localhost:80
< Via: 1.0 localhost (squid/3.1.11)
* HTTP/1.0 connection set to keep alive!
< Connection: keep-alive

Here's the related access log entry:
1332622033.136      6 127.0.0.1 TCP_MISS/404 331 GET
http://bmtu.livedefinition.com/context/1001 - DIRECT/176.34.129.228 -

The config is the default, and the Squid version is 3.1.11.

How can I force Squid to cache those specific 404s?
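For what it's worth, here is the kind of squid.conf override I've been considering as a workaround. This is only a sketch: the directives (refresh_pattern options, negative_ttl) exist in 3.1's stock squid.conf, but whether they actually apply to negative responses like this 404 is exactly what I'm unsure about, and the values are illustrative.

```
# squid.conf sketch (untested; values illustrative)

# Try to force caching for this origin: override-expire lets
# refresh_pattern's max value win over the origin's Expires,
# and ignore-no-cache (still present in 3.1) ignores a
# Cache-Control: no-cache from the server.
refresh_pattern ^http://bmtu\.livedefinition\.com/ 60 20% 3600 override-expire ignore-no-cache

# Cache negative (error) responses that carry no explicit
# expiry for up to an hour; responses WITH explicit freshness
# info, like the 404 above, should be governed by their own
# max-age/Expires headers rather than this directive.
negative_ttl 1 hours
```

If anyone can confirm whether refresh_pattern is even consulted for 404 responses in 3.1, that would help.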


