
Re: Caching huge files in chunks?

On 09/16/2010 06:21 PM, Guy Bashkansky wrote:
Here is the problem description; what solution might Squid or other
cache tools provide?

Some websites serve huge files, usually movies or binary distributions.
Typically a client issues byte-range requests, which Squid cannot cache
as separate objects.
Waiting for the whole file to be pulled into the cache takes far too
long, and a single monolithic object is not granular enough for optimizations.

A possible solution would be if Squid (or other tool/plugin) knew how
to download huge files *in chunks*.
Then the tool would cache these chunks and transform them into
arbitrary ranges when serving client requests.
There are some possible optimizations, like predictive chunk caching
and cold chunks eviction.
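The chunk-to-range transformation described above can be sketched as follows. This is an illustration only, not an existing tool: the function name and the 1 MiB chunk size are assumptions, and a real proxy would still need to fetch, store, and evict the chunks themselves.

```python
CHUNK_SIZE = 1024 * 1024  # hypothetical fixed chunk size (1 MiB)

def chunks_for_range(start, end, chunk_size=CHUNK_SIZE):
    """Map an arbitrary client byte range [start, end] (inclusive, as in
    an HTTP Range header) onto fixed-size cached chunks.

    Returns a list of (chunk_index, offset_lo, offset_hi) tuples: for each
    chunk touched by the request, the inclusive slice of that chunk that
    must be copied into the response body.
    """
    first = start // chunk_size
    last = end // chunk_size
    plan = []
    for idx in range(first, last + 1):
        chunk_start = idx * chunk_size
        lo = max(start, chunk_start) - chunk_start
        hi = min(end, chunk_start + chunk_size - 1) - chunk_start
        plan.append((idx, lo, hi))
    return plan

# A request for bytes 1048570-1048585 straddles a chunk boundary and
# touches chunk 0 (its last 6 bytes) and chunk 1 (its first 10 bytes).
```

Serving a client range then reduces to fetching only the listed chunks (from cache or origin) and concatenating the indicated slices.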

Does anybody know how to put together such a solution from existing tools?

Caching of partial responses is allowed by HTTP/1.1 but is not yet supported by Squid. It is a complex feature which can, indeed, be a useful optimization in some environments. For more information, please see

    http://wiki.squid-cache.org/Features/PartialResponsesCaching


http://wiki.squid-cache.org/SquidFaq/AboutSquid#How_to_add_a_new_Squid_feature.2C_enhance.2C_of_fix_something.3F
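As an aside, until partial-response caching exists, a commonly used partial workaround is to make Squid fetch the whole object whenever a client sends a range request, so that the full file becomes cacheable. A squid.conf sketch (the directives are real Squid directives; the values are illustrative and costly for very large files, since every range request can trigger a full download):

```
# When a client requests a byte range, fetch the entire object
# from the origin instead of forwarding the range request:
range_offset_limit -1

# Keep downloading even if the client aborts, so the object
# still lands in the cache for later hits:
quick_abort_min -1 KB
```

Note this is the opposite of chunked caching: it trades upstream bandwidth and first-hit latency for cacheability of the whole object.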

Thank you,

Alex.


