On Aug 12, 2008, at 23:15, Shawn O. Pearce wrote:
> Geert Bosch <bosch@xxxxxxxxxxx> wrote:
>> I've always felt that keeping largish objects (say anything >1MB)
>> loose makes perfect sense. These objects are accessed infrequently,
>> and are often binary or otherwise poor candidates for the delta
>> algorithm.
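
(As a rough illustration of that point, and not anything git actually
does: below is a minimal sketch of a size-plus-compressibility check
that could decide whether a blob is worth feeding to the delta
machinery at all. The 1MB cutoff and the probe heuristic are just
assumptions taken from the paragraph above.)

```python
import zlib

# Hypothetical cutoff from the paragraph above: anything over 1MB stays loose.
LOOSE_THRESHOLD = 1024 * 1024

def worth_deltifying(data: bytes) -> bool:
    """Sketch of a policy, not git's real logic: small blobs always go
    through the normal pack/delta path, while large blobs are probed for
    compressibility and skipped when they look like opaque binary data."""
    if len(data) <= LOOSE_THRESHOLD:
        return True
    sample = data[:64 * 1024]                      # probe only a prefix
    ratio = len(zlib.compress(sample, 6)) / len(sample)
    return ratio < 0.9                             # near 1.0 => already dense
```
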
> Sadly this causes huge problems with streaming a pack, because the
> loose object has to be inflated and then deflated again to fit into
> the pack stream.
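
(To make that round trip concrete: a minimal zlib sketch, ignoring the
"<type> <size>\0" header git keeps inside the stream; it only shows
where the inflate and the second deflate happen, and is not git's
actual packing code.)

```python
import zlib

def copy_loose_into_pack(loose_file_bytes: bytes, pack_level: int = 6) -> bytes:
    """Loose objects are stored zlib-deflated on disk, so streaming one
    into a pack means inflating it and then deflating it all over again."""
    raw = zlib.decompress(loose_file_bytes)   # inflate the loose object
    return zlib.compress(raw, pack_level)     # deflate once more for the pack
```
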
Sure, but that really is not that much of an issue. For people
with large systems connected by very fast networks, the current
situation is probably fine, and spending a lot of effort on
packing often makes sense.
However, for a random repository of Joe User, all the effort spent
on packing will probably never be gained back. Most people just
suck content from upstream and at most maintain a couple of local
hacks on top of that. Little or nothing is ever pushed to other
systems.
Even when pushing to other systems, it is often just a handful of
objects through a slow line, and compression/decompression speeds
just don't matter much.
> The new style loose object format was meant to fix this problem,
> and it did, but the code was difficult to manage so it was backed
> out of the tree.
One nice optimization we could do for those pesky binary large objects
(like PDF, JPEG and gzipped data) is to detect such files and fall back
to compression level 0. This should be especially beneficial, since
already-compressed data takes the most time to compress again while
gaining essentially nothing.
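
Something along these lines, say; the magic-byte signatures below are
the standard ones for those formats, but the helper itself is only a
sketch of the idea, not an existing git code path:

```python
import zlib

# Signatures of formats that are already compressed internally.
ALREADY_COMPRESSED = (
    b"\x1f\x8b",      # gzip
    b"%PDF",          # PDF (streams inside are usually deflated)
    b"\xff\xd8\xff",  # JPEG
    b"PK\x03\x04",    # ZIP
)

def deflate_blob(data: bytes) -> bytes:
    """Use zlib level 0 (stored, no compression) for content that already
    looks compressed, so we skip an expensive second compression pass
    that would gain next to nothing."""
    level = 0 if data.startswith(ALREADY_COMPRESSED) else zlib.Z_DEFAULT_COMPRESSION
    return zlib.compress(data, level)
```
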
-Geert