Decompression speed: zip vs lzo

I created a big tar from the Linux tree:

linux-2.6.tar      300.0 MB

Then I created two compressed files, one with the zip utility and one with
the lzop utility (the latter uses the LZO compression algorithm):

linux-2.6.zip      70.1 MB

linux-2.6.tar.lzo  108.0 MB
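
For reference, files like these can be produced with commands along the
following lines (default compression levels assumed; the exact options are
not shown above):

$ zip linux-2.6.zip linux-2.6.tar       # zip archive containing the tar
$ lzop linux-2.6.tar                    # produces linux-2.6.tar.lzo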

Then I tested the decompression speed:

$ time unzip -p linux-2.6.zip > /dev/null
3.95user 0.09system 0:04.05elapsed 99%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+189minor)pagefaults 0swaps

$ time lzop -d -c linux-2.6.tar.lzo > /dev/null
2.10user 0.07system 0:02.18elapsed 99%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+234minor)pagefaults 0swaps
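
Relative to the 300.0 MB uncompressed tar, that is roughly 74 MB/s for
unzip (300 / 4.05 s) versus roughly 138 MB/s for lzop (300 / 2.18 s).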


So the bottom line is that LZO decompression is almost twice as fast as zip.


Marco

P.S.: The compressed size is better for zip, but a more realistic test
would use a delta-packed repository instead of a plain tar of source
files. Since delta-packed data is already compressed in its own way, the
difference in final file sizes would probably be smaller.
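
For example, one way to run such a test would be to repack the repository
and compress the resulting packfile (just a sketch, assuming a standard
.git layout and a single resulting pack):

$ git repack -a -d                              # pack all objects into one packfile
$ zip pack.zip .git/objects/pack/pack-*.pack    # compress the pack with zip
$ lzop .git/objects/pack/pack-*.pack            # compress the pack with lzop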
