On 17/09/10 12:21, Guy Bashkansky wrote:
Here is the problem description; what solution might Squid or other
cache tools provide?
Some websites serve huge files, usually movies or binary distributions.
Typically a client issues byte range requests, which are not cacheable
as separate objects in Squid.
Waiting for the whole file to be brought into the cache takes way too
long, and is not granular enough for optimizations.
A possible solution would be for Squid (or another tool/plugin) to know
how to download huge files *in chunks*.
Then the tool would cache these chunks and transform them into
arbitrary ranges when serving client requests.
There are some possible optimizations, like predictive chunk caching
and cold chunks eviction.
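The chunk-to-range mapping described above can be sketched roughly as follows. This is a hypothetical illustration, not Squid code; the chunk size and helper names are assumptions:

```python
# Hypothetical sketch of the chunking idea: map an arbitrary client byte
# range onto fixed-size chunks that could be fetched and cached
# independently, then reassemble the exact requested bytes from them.

CHUNK_SIZE = 1 << 20  # assumed 1 MiB chunks


def chunks_for_range(start, end):
    """Return the chunk indices covering bytes start..end (inclusive)."""
    return range(start // CHUNK_SIZE, end // CHUNK_SIZE + 1)


def slice_from_chunks(chunks, start, end):
    """Reassemble the requested range from cached chunk payloads.

    `chunks` maps chunk index -> the bytes of that chunk, and must
    contain exactly the indices from chunks_for_range(start, end).
    """
    data = b"".join(chunks[i] for i in sorted(chunks))
    offset = start - min(chunks) * CHUNK_SIZE
    return data[offset:offset + (end - start + 1)]
```

For example, a request for bytes just straddling the 1 MiB boundary needs only chunks 0 and 1, and the predictive-caching optimization would amount to prefetching the chunk indices following those returned by `chunks_for_range`.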
Does anybody know how to put together such a solution based on any existing tools?
Ah, the BitTorrent-to-HTTP conversion. :)
Still blocked to a large degree by Squid not fully supporting range
requests.
A server module for Squid, like the FTP, WAIS and Gopher ones, is
possibly achievable. As is an eCAP/ICAP module.
HTTP still has the fixed basic requirements that one request equals one
reply and, most annoyingly, that the requested ranges must not be
re-ordered or optimized.
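To illustrate that ordering constraint: a Range header such as `bytes=500-999,0-499` must be answered with the parts in exactly that order, so a chunk cache cannot silently sort or coalesce them. A minimal parser sketch (hypothetical, and ignoring suffix forms like `-500` and open-ended forms like `500-` for brevity):

```python
# Hypothetical sketch: parse a simple multi-range Range header while
# preserving the client's ordering, which HTTP obliges the server to
# honour. Suffix ranges ("-500") and open-ended ranges ("500-") are
# deliberately not handled in this simplified example.

def parse_range_header(value):
    """Parse 'bytes=a-b,c-d,...' into (start, end) pairs, order preserved."""
    unit, _, ranges = value.partition("=")
    if unit != "bytes":
        raise ValueError("only 'bytes' ranges supported in this sketch")
    out = []
    for spec in ranges.split(","):
        a, _, b = spec.strip().partition("-")
        out.append((int(a), int(b)))
    return out
```

A chunk-based backend would then have to fetch whatever chunks each pair needs and emit the multipart reply in the parsed order, even when a different order would be cheaper.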
Amos
--
Please be using
Current Stable Squid 2.7.STABLE9 or 3.1.8
Beta testers wanted for 3.2.0.2