Re: Compression speed for large files

Elrond <elrond+kernel.org <at> samba-tng.org> writes:

> 
> Joachim B Haga <cjhaga <at> fys.uio.no> writes:
> [...]
> > In this case Z_BEST_COMPRESSION also compresses worse,
> [...]
> 
> I personally find that very interesting. Is this a known "issue" with zlib?
> It suggests that, with different options, it's possible to create smaller
> repositories, despite the 'advertised' (by zlib, not git) "best" compression.

There are also other tunables in zlib, such as the balance between Huffman
coding (good for data files) and string matching (good for text files). So with
more knowledge of the data it should be possible to compress even better. I'm
not advocating tuning this in git though ;)
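The effect is easy to demonstrate outside git. Below is a minimal sketch (not git code) using Python's zlib bindings to compress the same input under different real zlib strategies; Z_HUFFMAN_ONLY disables string matching entirely, so on repetitive text it does much worse than the default strategy even at level 9:

```python
import zlib

# Highly repetitive input, where LZ77 string matching shines.
data = b"some repetitive text " * 1000

sizes = {}
for name, strategy in [("default", zlib.Z_DEFAULT_STRATEGY),
                       ("filtered", zlib.Z_FILTERED),
                       ("huffman-only", zlib.Z_HUFFMAN_ONLY)]:
    # level 9, raw deflate (wbits=-15), max memLevel, chosen strategy
    c = zlib.compressobj(9, zlib.DEFLATED, -15, 9, strategy)
    sizes[name] = len(c.compress(data) + c.flush())
    print(name, sizes[name])
```

Which strategy wins depends entirely on the input, which is the point: knowing the data lets you pick better settings than a blanket "best".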

> 
> Alex Riesen <raa.lkml <at> gmail.com> writes:
> [...]
> > Probably yes, as a per-repo config option.
> 
> The option should probably be the size above which to start using
> "default" compression.

That is possible, too. I'm open to any decision or consensus, as long as I get
my commits in less than 10s :)
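A hypothetical sketch of that size-threshold idea (the constant name and threshold value are made up for illustration, not a proposed git option): small blobs get Z_BEST_COMPRESSION, anything larger falls back to the faster default level.

```python
import zlib

# Hypothetical cutoff; the actual value would be a per-repo config option.
THRESHOLD = 1 << 20  # 1 MiB

def choose_level(size):
    """Pick a zlib compression level based on object size."""
    if size <= THRESHOLD:
        return zlib.Z_BEST_COMPRESSION   # 9: slow but thorough
    return zlib.Z_DEFAULT_COMPRESSION    # -1: zlib's speed/ratio default
```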

-j.

-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
