Re: pack operation is thrashing my server

Nicolas Pitre <nico@xxxxxxx> writes:

> On Tue, 12 Aug 2008, Geert Bosch wrote:
> 
> > One nice optimization we could do for those pesky binary large objects
> > (like PDF, JPG and gzip-ed data) is to detect such files and revert
> > to compression level 0. This should be especially beneficial,
> > since already-compressed data takes the longest to compress again.
> 
> That would be a good thing indeed.
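
A rough sketch of the detection part, just to show what it could look like:
check a few magic bytes and, when they match a format that is already
deflated, drop to zlib level 0.  The function name and the format list are
only illustrative, nothing from the git tree:

/*
 * Sketch only: recognize a few already-compressed formats by their
 * magic bytes so the packer could fall back to zlib level 0 for them.
 */
#include <string.h>
#include <stddef.h>

static int looks_already_compressed(const unsigned char *buf, size_t len)
{
	static const struct {
		const char *magic;
		size_t len;
	} magics[] = {
		{ "\x1f\x8b", 2 },	/* gzip */
		{ "\xff\xd8\xff", 3 },	/* JPEG */
		{ "PK\x03\x04", 4 },	/* zip (also odt, docx, jar, ...) */
		{ "\x89PNG", 4 },	/* PNG (zlib-compressed IDAT) */
		{ "%PDF", 4 },		/* PDF (usually deflated streams) */
	};
	size_t i;

	for (i = 0; i < sizeof(magics) / sizeof(magics[0]); i++)
		if (len >= magics[i].len &&
		    !memcmp(buf, magics[i].magic, magics[i].len))
			return 1;
	return 0;
}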

Perhaps take a sample of some given size and calculate its entropy?
Or simply add a gitattribute for a per-file compression level...
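
A minimal sketch of the entropy idea, assuming a 4 KiB sample and a cutoff
of 7.5 bits per byte (both numbers pulled out of thin air, and the function
names are mine, not anything in git):

/*
 * Sketch only: estimate Shannon entropy (bits per byte) of a sample
 * and skip zlib compression for data that already looks random.
 */
#include <math.h>
#include <stddef.h>

static double sample_entropy(const unsigned char *buf, size_t len)
{
	size_t counts[256] = { 0 };
	double entropy = 0.0;
	size_t i;

	if (!len)
		return 0.0;
	for (i = 0; i < len; i++)
		counts[buf[i]]++;
	for (i = 0; i < 256; i++) {
		if (counts[i]) {
			double p = (double)counts[i] / len;
			entropy -= p * log2(p);
		}
	}
	return entropy;	/* 0.0 .. 8.0 bits per byte */
}

/* Pick a zlib level: 0 for data that already looks compressed. */
static int choose_compression_level(const unsigned char *buf, size_t len)
{
	size_t sample = len < 4096 ? len : 4096;	/* arbitrary sample size */
	return sample_entropy(buf, sample) > 7.5 ? 0 : -1;	/* -1 = Z_DEFAULT_COMPRESSION */
}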

-- 
Jakub Narebski
Poland
ShadeHawk on #git
