Re: Duplicate files, content distribution networks

On 14/06/12 03:33 AM, Amos Jeffries wrote:
On 14/06/2012 8:53 p.m., Jack Bates wrote:
Another idea is to exploit RFC 3230, Instance Digests in HTTP. Given a
response with a "Location: ..." header and a "Digest: ..." header, if
the "Location: ..." URL isn't already cached, the proxy checks the
cache for content with a matching digest and, if it finds a match,
rewrites the "Location: ..." header to point at the cached URL.

I am working on a proof-of-concept plugin for Apache Traffic Server as
part of the Google Summer of Code. The code is up on GitHub [2].

If this is a reasonable approach, would it be difficult to build
something similar for Squid?

Please contact Alex Rousskov at measurement-factory.com; he was
organising a project to develop Digest handling and de-duplication a
while back.

Thank you, Amos, for this info. I will definitely contact Alex Rousskov.

