Re: [PATCH WIP] sha1-lookup: make selection of 'middle' less aggressive

On Jan 1, 2008 9:40 AM, Marco Costalba <mcostalba@xxxxxxxxx> wrote:
> On Jan 1, 2008 7:36 AM, Jeff King <peff@xxxxxxxx> wrote:
> >
> > The packfile is noticeably larger (55M versus 40M)
>
> Well, 55M versus 40M means compression is only saving about 27%
> ((55 - 40) / 55 ~= 27%). It suggests the compression algorithm is not
> so critical here, because the data is already, so to speak, well
> packaged.
>

I think zlib is a very good general-purpose algorithm, but its main
strength is producing small final file sizes; it is mainly intended for
files that are compressed once and seldom decompressed.
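
To make the trade-off concrete, here is a minimal sketch (not git
code; the input data and loop counts are made up for illustration)
that compresses a buffer with zlib at a few levels and times repeated
decompression, which is the cost git pays on every object read:

/*
 * A minimal sketch, not git code: compress a made-up buffer with
 * zlib at a few levels and time repeated decompression.
 * Build with:  cc sketch.c -lz
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <zlib.h>

int main(void)
{
	enum { N = 1 << 20 };	/* 1MB of mildly repetitive input */
	unsigned char *src = malloc(N);
	unsigned char *back = malloc(N);
	size_t i;
	int level;

	if (!src || !back)
		return 1;
	for (i = 0; i < N; i++)
		src[i] = "abcabcabd"[i % 9] + (i % 31);

	for (level = 1; level <= 9; level += 4) {
		uLongf clen = compressBound(N);
		unsigned char *dst = malloc(clen);
		clock_t t0;
		double secs;

		if (!dst || compress2(dst, &clen, src, N, level) != Z_OK)
			return 1;

		/* Inflate many times: the ratio changes with the
		 * level, but this per-read cost is paid every time
		 * an object is accessed. */
		t0 = clock();
		for (i = 0; i < 100; i++) {
			uLongf blen = N;
			if (uncompress(back, &blen, dst, clen) != Z_OK)
				return 1;
		}
		secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

		printf("level %d: %lu -> %lu bytes, 100 inflates in %.3fs\n",
		       level, (unsigned long)N, (unsigned long)clen, secs);
		free(dst);
	}
	free(src);
	free(back);
	return 0;
}

The point of the inner loop is that inflate runs on every read, so for
git's access pattern decompression speed matters at least as much as
the final ratio.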

For the way we use compression in git, IMHO it would seem appropriate
to look at algorithms used in the field of filesystem compression,
where keeping the decompression penalty low is an explicit design
goal. I know very little about this area, but among kernel people it
should not be difficult to find expert and competent hackers, given
that compressed filesystems have lived under the linux/fs/ directory
for many years.

Marco
