Err, my mail client ate my formatting -- that refresh_pattern is all on
one line in my squid.conf file. Also, FYI, these logs are all from the
squid package straight from the Ubuntu repos, but I saw the same problem
when compiling from source with the latest stable version (3.1.10).
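
For reference, the unwrapped directive (copied from the config quoted
below) should read:

  refresh_pattern . 240 999999% 1440 override-expire override-lastmod ignore-no-cache ignore-no-store ignore-private ignore-must-revalidate

If I'm reading the refresh_pattern syntax right, that's the regex ".",
then min (240 minutes), percent (999999), and max (1440 minutes),
followed by the override/ignore options.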

On Thu, Dec 30, 2010 at 1:37 PM, Chas Williams <chas@xxxxxxxxxx> wrote:
> Hi Curl,
>
> I'm trying to configure squid to cache everything for a minimum
> amount of time, no matter what (I realize this is a bit of a
> bastardization of its purpose and of the HTTP spec -- I'm trying to
> use it as a cache for another program I'm working on, in a space where
> this is reasonable).
>
> I can't seem to make negative caching work -- below are (1) the
> send and receive headers from two curl requests for a URL that 404s
> (to show that the second request is still a cache miss), (2) version
> information (of squid and my box), (3) my squid.conf file, and (4) my
> access, cache, and store logs. These are all from a fresh install on a
> new box.
>
> Any ideas?
>
> Thanks,
>
> Chas Williams
>
> P.S. For bonus points, can anyone tell me if my squid.conf will really
> cache everything for at least 240 minutes?
>
> ----------------------------------------------------------------------------------------------------------------------------
> ubuntu@domU-12-31-39-0C-81-A1:~$ curl -x localhost:3128 -v "http://hotelburst.es/robots.txt" >/dev/null
> * About to connect() to proxy localhost port 3128 (#0)
> *   Trying 127.0.0.1...   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
>   0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0connected
> * Connected to localhost (127.0.0.1) port 3128 (#0)
>> GET http://hotelburst.es/robots.txt HTTP/1.1
>> User-Agent: curl/7.21.0 (x86_64-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
>> Host: hotelburst.es
>> Accept: */*
>> Proxy-Connection: Keep-Alive
>>
> * HTTP 1.0, assume close after body
> < HTTP/1.0 404 Not Found
> < Date: Thu, 30 Dec 2010 21:18:14 GMT
> < Server: Apache/2.2.3 (Debian) mod_jk/1.2.18 mod_python/3.2.10 Python/2.4.4 PHP/5.2.0-8+etch15 mod_ssl/2.2.3 OpenSSL/0.9.8c mod_perl/2.0.2 Perl/v5.8.8
> < Content-Length: 406
> < Content-Type: text/html; charset=iso-8859-1
> < X-Cache: MISS from localhost
> < X-Cache-Lookup: MISS from localhost:3128
> < Via: 1.0 localhost (squid/3.1.6)
> * HTTP/1.0 proxy connection set to keep alive!
> < Proxy-Connection: keep-alive
> <
> { [data not shown]
> 100   406  100   406    0     0   1303      0 --:--:-- --:--:-- --:--:--  1309* Connection #0 to host localhost left intact
> * Closing connection #0
> ubuntu@domU-12-31-39-0C-81-A1:~$ curl -x localhost:3128 -v "http://hotelburst.es/robots.txt" >/dev/null
> * About to connect() to proxy localhost port 3128 (#0)
> *   Trying 127.0.0.1...   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
>   0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0connected
> * Connected to localhost (127.0.0.1) port 3128 (#0)
>> GET http://hotelburst.es/robots.txt HTTP/1.1
>> User-Agent: curl/7.21.0 (x86_64-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
>> Host: hotelburst.es
>> Accept: */*
>> Proxy-Connection: Keep-Alive
>>
> * HTTP 1.0, assume close after body
> < HTTP/1.0 404 Not Found
> < Date: Thu, 30 Dec 2010 21:18:16 GMT
> < Server: Apache/2.2.3 (Debian) mod_jk/1.2.18 mod_python/3.2.10 Python/2.4.4 PHP/5.2.0-8+etch15 mod_ssl/2.2.3 OpenSSL/0.9.8c mod_perl/2.0.2 Perl/v5.8.8
> < Content-Length: 406
> < Content-Type: text/html; charset=iso-8859-1
> < X-Cache: MISS from localhost
> < X-Cache-Lookup: HIT from localhost:3128
> < Via: 1.0 localhost (squid/3.1.6)
> * HTTP/1.0 proxy connection set to keep alive!
> < Proxy-Connection: keep-alive
> <
> { [data not shown]
> 100   406  100   406    0     0   3992      0 --:--:-- --:--:-- --:--:--  4060* Connection #0 to host localhost left intact
> * Closing connection #0
> ---------------------------------------------------------------------------------------------------------------
> ubuntu@domU-12-31-39-0C-81-A1:~$ squid3 -v
> Squid Cache: Version 3.1.6
> configure options: '--build=x86_64-linux-gnu' '--prefix=/usr' '--includedir=${prefix}/include' '--mandir=${prefix}/share/man' '--infodir=${prefix}/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--libexecdir=${prefix}/lib/squid3' '--disable-maintainer-mode' '--disable-dependency-tracking' '--disable-silent-rules' '--srcdir=.' '--datadir=/usr/share/squid3' '--sysconfdir=/etc/squid3' '--mandir=/usr/share/man' '--with-cppunit-basedir=/usr' '--enable-inline' '--enable-async-io=8' '--enable-storeio=ufs,aufs,diskd' '--enable-removal-policies=lru,heap' '--enable-delay-pools' '--enable-cache-digests' '--enable-underscores' '--enable-icap-client' '--enable-follow-x-forwarded-for' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-basic-auth-helpers=LDAP,MSNT,NCSA,PAM,SASL,SMB,YP,DB,POP3,getpwnam,squid_radius_auth,multi-domain-NTLM' '--enable-ntlm-auth-helpers=smb_lm,' '--enable-digest-auth-helpers=ldap,password' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' '--enable-arp-acl' '--enable-esi' '--disable-translation' '--with-logdir=/var/log/squid3' '--with-pidfile=/var/run/squid3.pid' '--with-filedescriptors=65536' '--with-large-files' '--with-default-user=proxy' '--enable-linux-netfilter' 'build_alias=x86_64-linux-gnu' 'CFLAGS=-g -O2 -g -Wall -O2' 'LDFLAGS=-Wl,-Bsymbolic-functions' 'CPPFLAGS=' 'CXXFLAGS=-g -O2 -g -Wall -O2' --with-squid=/build/buildd/squid3-3.1.6
>
> ubuntu@domU-12-31-39-0C-81-A1:~$ cat /etc/issue
> Ubuntu 10.10 \n \l
> ---------------------------------------------------------------------------------------------------------------
> ubuntu@domU-12-31-39-0C-81-A1:~$ sudo cat /etc/squid3/squid.conf
> ######################################
> # Squid3 config for crawl v2 machines
> ######################################
> #Access control
> acl manager proto cache_object
> acl localhost src 127.0.0.1/32
> acl to_localhost dst 127.0.0.0/8
> acl SSL_ports port 443
> acl Safe_ports port 80          # http
> acl CONNECT method CONNECT
> http_access allow manager localhost
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access allow localhost
> http_access deny all
> icp_access deny all
> htcp_access deny all
> http_port 3128
> #Force minimum cache time for everything
> refresh_pattern . 240 999999% 1440 override-expire override-lastmod ignore-no-cache ignore-no-store ignore-private ignore-must-revalidate
> #Cache 404s as well
> negative_ttl 240 minutes
> #No neighbor caches
> icp_port 0
> cache_mem 256 MB
> cache_dir ufs /mnt/cache 1000 16 256
> access_log /var/log/squid3/access.log squid
> cache_log /var/log/squid3/cache.log
> cache_store_log /var/log/squid3/store.log
> coredump_dir /var/spool/squid3
>
> ---------------------------------------------------------------------------------------------------------------
>
> ubuntu@domU-12-31-39-0C-81-A1:~$ sudo cat /var/log/squid3/access.log
> 1293743891.380    309 127.0.0.1 TCP_MISS/404 822 GET http://hotelburst.es/robots.txt - DIRECT/85.25.120.108 text/html
> 1293743893.629     99 127.0.0.1 TCP_MISS/404 821 GET http://hotelburst.es/robots.txt - DIRECT/85.25.120.108 text/html
>
> ubuntu@domU-12-31-39-0C-81-A1:~$ sudo cat /var/log/squid3/cache.log
> 2010/12/30 21:17:45|     0 Objects expired.
> 2010/12/30 21:17:45|     0 Objects cancelled.
> 2010/12/30 21:17:45|     0 Duplicate URLs purged.
> 2010/12/30 21:17:45|     0 Swapfile clashes avoided.
> 2010/12/30 21:17:45|   Took 0.01 seconds (  0.00 objects/sec).
> 2010/12/30 21:17:45| Beginning Validation Procedure
> 2010/12/30 21:17:45|   Completed Validation Procedure
> 2010/12/30 21:17:45|   Validated 25 Entries
> 2010/12/30 21:17:45|   store_swap_size = 0
> 2010/12/30 21:17:46| storeLateRelease: released 0 objects
>
> ubuntu@domU-12-31-39-0C-81-A1:~$ sudo cat /var/log/squid3/store.log
> 1293743893.629 RELEASE -1 FFFFFFFF 9D744644E77923213EBD01054E48C903 404 1293743894 -1 -1 text/html 406/406 GET http://hotelburst.es/robots.txt
> ---------------------------------------------------------------------------------------------------------------
> --
> Chas Williams
> SEOmoz