Alex Rousskov wrote:
> It is possible to avoid caching duplicate content, but that only allows
> you to handle cache hits more efficiently. It does not help with cache
> misses (when the URL requested by the client has not been seen before).
>
> If content publishers start publishing content checksums and browsers
> automatically add those checksums to requests, then you would have the
> Utopia you dream about :-). This will not happen while content
> publishers benefit from getting client requests more than they suffer
> from serving those requests.

I mean the content that Squid is already aware of, i.e. content that Squid has accessed until now.

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Automatic-StoreID-tp4665140p4666002.html
Sent from the Squid - Users mailing list archive at Nabble.com.
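For context, deduplicating content that Squid has already seen is what the StoreID feature (Squid 3.4+) does: an external helper maps request URLs that point at the same object (e.g. CDN mirror hostnames) onto one canonical cache key, so later requests for any mirror hit the same cached object. Below is a minimal sketch of such a helper; the mirror hostnames are hypothetical examples, and it assumes the helper is started with concurrency enabled so each stdin line carries a channel ID before the URL:

```python
#!/usr/bin/env python3
"""Minimal Squid store_id helper sketch (hypothetical mirror layout).

With concurrency enabled, Squid sends one request per line on stdin:
    <channel-id> <URL> [extras]
and expects a reply on stdout:
    <channel-id> OK store-id=<canonical-key>   (rewrite the cache key)
    <channel-id> ERR                           (leave the URL as-is)
"""
import sys
from urllib.parse import urlsplit

# Hypothetical mirror hosts assumed to serve identical content.
MIRRORS = {"cdn1.example.com", "cdn2.example.com", "cdn3.example.com"}
CANONICAL = "cdn.example.com"


def store_id(url):
    """Return a canonical cache key for known mirror URLs, else None."""
    parts = urlsplit(url)
    if parts.hostname in MIRRORS:
        # Collapse all mirrors onto one canonical hostname.
        return parts._replace(netloc=CANONICAL).geturl()
    return None  # unknown host: no rewrite


def main():
    for line in sys.stdin:
        fields = line.split()
        if len(fields) < 2:
            continue
        channel, url = fields[0], fields[1]
        key = store_id(url)
        if key:
            print(f"{channel} OK store-id={key}", flush=True)
        else:
            print(f"{channel} ERR", flush=True)


if __name__ == "__main__":
    main()
```

Wired up in squid.conf with something like `store_id_program /usr/local/bin/storeid.py` and `store_id_children` (directive names from the Squid docs; the helper path is an example). Note this still only helps with hits, as Alex says: two mirror URLs map to one object only because the admin told Squid they are the same, not because Squid verified the content.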