Hi all,

We've been using Squid as a reverse proxy for a couple of years now. We're currently migrating to Squid 2.6 and are really satisfied with all the enhancements in this version, especially the fact that we can now use a lot of features previously reserved for the forward proxy configuration.

Our context is a bit special:
- we have _a lot_ of traffic generated by robots (Googlebot and so on);
- the problem is that they visit a lot of pages on our site that we don't really want to keep in cache (nobody else really visits them - for instance, the events in a very small city);
- we want the robots to use the cache generated by the other users, so we can't simply deny cache access to robots.

After this introduction, here is my question: is there any way to deny write access to the cache? I'd like to be able to say: all these user agents can use the cache, but don't write anything into it. I can't find anything that does this sort of thing.

Thanks for any ideas.

-- Guillaume
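
P.S. To show what I mean, here is a rough sketch of the closest thing I found in squid.conf (the ACL name "robots" and the pattern list are just examples; the "browser" ACL matches the User-Agent header, and "cache" is the 2.6 name for the old "no_cache" directive):

    # hypothetical ACL matching crawler User-Agents (case-insensitive regex)
    acl robots browser -i googlebot

    # this denies BOTH reading from and writing to the cache for matching
    # requests, which is not what we want - we only want to deny the write
    cache deny robots

So as far as I can tell, "cache deny" makes matching requests bypass the cache entirely: they are neither served from it nor stored into it. What I'm looking for is a way to split those two behaviours.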