Re: Decompression speed: zip vs lzo

On Fri, 11 Jan 2008, Sam Vilain wrote:
> 
> For reference, 20 years of Perl with very deep deltas:
> 
> wilber:~/src/perl-preview$ du -sk .git
> 73274   .git
> wilber:~/src/perl-preview$ git-repack -a
> Counting objects: 244360, done.
> Compressing objects: 100% (55493/55493), done.
> Writing objects: 100% (244360/244360), done.
> Total 244360 (delta 181061), reused 244360 (delta 181061)
> wilber:~/src/perl-preview$ du -sk .git/objects/pack/
> 75389   .git/objects/pack/

Hmm. I'm not sure I understand what this was supposed to show?

You reused all the old deltas, and you ran "du -sk" on two different paths
before and after (.git vs .git/objects/pack/), and you didn't repack with
"-a -d" to drop the old pack either. So does the result actually tell us
anything about the compression algorithm?

Use "-a -d -f" to repack a whole archive.
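Roughly, that would look like this (just a sketch; the modern spelling is
"git repack" rather than "git-repack", and the resulting sizes will
obviously vary by repository):

    # -a: pack everything reachable into one pack
    # -d: delete the now-redundant old packs
    # -f: recompute all deltas instead of reusing existing ones
    git repack -a -d -f
    du -sk .git/objects/pack/

With "-f" you actually exercise the delta/compression machinery on every
object, so the before/after sizes become a meaningful comparison.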

			Linus
