I've done the first cut of the feature set required to cache CDN "dynamic" content that's really static content at heart.

This store URL rewrite work lets one "rewrite" the URLs used only in store lookups and storage, preserving the original URL for forwarding decisions and ACL lookups. It allows a crafty administrator to teach Squid that multiple URLs actually map to the same resource, so if it sees a request for a URL that's equivalent to one in storage it'll return the cached object. This is specifically targeted at undoing the "damage" done by CDNs that distribute content via multiple origin hosts for the same content.

More information is available in the Wiki:

  http://wiki.squid-cache.org/Features/StoreUrlRewrite

The wiki has an example config file and helper for caching Google Maps/Earth and a little bit of YouTube. (There's also a rough sketch of the moving parts at the bottom of this mail.)

I'd really, really appreciate it if people gave this stuff a whirl and kept some before-and-after statistics (or just the logfiles would be great!) to compare the hit rates for Google Maps and Google Earth. It'll require an upgrade to the very recent Squid-2.HEAD snapshot.

Once this stuff is stable and I've nudged out another Squid-2 release, I'll start incorporating rules to cache Microsoft updates and the rest of YouTube.

I warn you - it could be buggy. It could blow up in your face. I'll try to fix whatever bugs people see crop up. I'll offer a little help on the mailing list if the Wiki instructions aren't enough for people to get it going. (If you'd like to use this in a commercial setup and would like proper support for it, please contact me in private.)

Adrian

--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
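
For anyone who'd like to see the shape of it before clicking through, here's a rough, untested sketch of the sort of setup the wiki describes. The directive names are the ones added by the storeurl work in Squid-2.HEAD; the Google Maps hostname pattern and the ".SQUIDINTERNAL" store-key suffix are illustrative guesses at the idea rather than the exact rules on the wiki page, and the helper below is a Python stand-in for the wiki's example helper - treat the wiki page as authoritative.

  # squid.conf fragment: send only matching requests through the helper
  storeurl_rewrite_program /usr/local/bin/storeurl_helper.py
  storeurl_rewrite_children 5
  acl store_rewrite_list url_regex -i ^http://mt[0-9]+\.google\.com/
  storeurl_access allow store_rewrite_list
  storeurl_access deny all

  # storeurl_helper.py: reads one request per line on stdin (first
  # whitespace-separated field is the URL), writes the URL to use as the
  # store key on stdout, one reply per line.
  import re
  import sys

  # Hypothetical example: Google Maps tiles are served from several
  # equivalent hosts (mt0.google.com, mt1.google.com, ...); fold them into
  # one fake host so a tile cached via any of them is a hit for all of them.
  TILE_HOSTS = re.compile(r'^http://mt[0-9]+\.google\.com/(.*)')

  def store_key(url):
      m = TILE_HOSTS.match(url)
      if m:
          # ".SQUIDINTERNAL" is just a made-up suffix so the rewritten key
          # can't collide with a URL a client could actually request.
          return 'http://mt.google.com.SQUIDINTERNAL/' + m.group(1)
      return url  # anything else: hand the URL back unchanged

  for line in sys.stdin:
      fields = line.split()
      url = fields[0] if fields else ''  # later fields (client, method, ...) ignored
      sys.stdout.write(store_key(url) + '\n')
      sys.stdout.flush()                 # Squid expects one prompt reply per request

The point of the fake canonical host is simply that requests for the "same" object via different CDN hostnames all hash to one store key, while the original URL is still the one used for forwarding and ACL checks.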