Re: Decompression speed: zip vs lzo

> This is really the big point here.  Git uses _lots_ of *small* objects,
> usually much smaller than 12KB.  For example, my copy of the gcc
> repository has an average of 270 _bytes_ per compressed object, and
> objects must be individually compressed.
>
> Performance with really small objects should be the basis for any
> Git compression algorithm comparison.

If it so happens that one algorithm does much better on small objects
while another does better on large objects, there really is nothing that
prevents using both in a repository.  It's a bit of code bloat, of course.
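
Such a per-size dispatch could look roughly like the sketch below (illustrative only: the one-byte format tag, the 12KB threshold taken from the figure quoted above, and the use of lzma as a stand-in for LZO are all assumptions, since Python's standard library has no LZO binding):

```python
import zlib
import lzma

# 12 KB cutoff, borrowed from the size mentioned in the quoted message.
SMALL_THRESHOLD = 12 * 1024

def compress_object(data: bytes) -> bytes:
    """Pick a codec by size and prefix a one-byte tag so the
    reader knows how to decompress."""
    if len(data) < SMALL_THRESHOLD:
        return b"Z" + zlib.compress(data)   # zlib for small objects
    return b"X" + lzma.compress(data)       # stand-in for the large-object codec

def decompress_object(blob: bytes) -> bytes:
    tag, payload = blob[:1], blob[1:]
    if tag == b"Z":
        return zlib.decompress(payload)
    if tag == b"X":
        return lzma.decompress(payload)
    raise ValueError("unknown compression tag")
```

The one extra tag byte is the storage cost of this approach; the "code bloat" is carrying two decompressors in the binary.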

Morten
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
