
Re: Denying write access to the cache

Guillaume Smet wrote:
Hi all,

We've been using Squid as a reverse proxy for a couple of years now. We're
currently migrating to Squid 2.6 and we're really satisfied with all the
enhancements in this version, especially the fact that we can use a
lot of features previously reserved for the proxy configuration.

Our context is a bit special:
- we have _a lot_ of traffic generated by robots (googlebot and so on);
- the problem is that they visit a lot of pages on our site that we don't
really want to keep in cache (nobody else really visits them - for
instance, the events in a very small city);
- we want the robots to use the cache generated by the other users, so
we can't simply deny cache access to robots.

After this introduction, here is my question: is there any way to deny
write access to the cache? I'd like to be able to say: all these
user agents can use the cache, but they can't write anything to it.
I can't find anything that does this sort of thing.

Thanks for any ideas.

--
Guillaume

Make an ACL that matches the URLs you don't want cached, and deny caching based on that ACL.

acl CITY urlpath_regex city1 city2 city3
cache deny CITY
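
If it helps, you can check the edited configuration and reload it without a full restart using Squid's standard command-line options:

squid -k parse
squid -k reconfigure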

If you want to cache these infrequently visited URLs when regular people visit, add an ACL that matches the user-agent of the bots. Listing both ACLs on the same cache deny line means caching is denied only when both match, so requests from ordinary visitors still get stored.

acl CITY urlpath_regex city1 city2 city3
acl GOOGLEBOT browser -i googlebot
cache deny CITY GOOGLEBOT
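
If you have several robots to worry about (you mention googlebot "and so on"), the browser acl takes multiple patterns. The extra user-agent strings below are just examples; substitute whatever robots actually show up in your access.log:

acl CITY urlpath_regex city1 city2 city3
# browser matches against the User-Agent header; -i makes it case-insensitive
acl ROBOTS browser -i googlebot msnbot slurp
# both acls on one cache deny line must match: a small-city URL requested by a robot
cache deny CITY ROBOTS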

Chris
