Re: Compression speed for large files

On Mon, Jul 03, 2006 at 11:13:34AM +0000, Joachim B Haga wrote:

> often binary. In git, committing large files is very slow; I have
> tested with a 45MB file, which takes about 1 minute to check in (on an
> Intel Core Duo 2GHz).

I know this has already been somewhat solved, but I found your numbers
curiously high. I work quite a bit with git and large files, and I
haven't noticed this slowdown. Can you be more specific about your
workload? Are you sure it is zlib?
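
A quick way to check whether zlib itself is the culprit is to time it
in isolation on the file, outside of git entirely. A minimal sketch in
Python (illustrative only: "big.bin" is a placeholder for your 45MB
file, and the compression level is my assumption of a worst case, not
necessarily what git uses):

  # Time zlib alone on the file in question, outside of git.
  # "big.bin" is a placeholder; level 9 is zlib's slowest setting.
  import time
  import zlib

  with open("big.bin", "rb") as f:
      data = f.read()

  start = time.time()
  out = zlib.compress(data, zlib.Z_BEST_COMPRESSION)  # level 9
  print("compressed %d -> %d bytes in %.1fs"
        % (len(data), len(out), time.time() - start))

If that alone takes anywhere near a minute, zlib is your problem; if
not, the time is going somewhere else in the commit path.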

On my 1.8GHz Athlon, compressing 45MB of zeros into 20K takes about 2s.
Compressing 45MB of random data into a 45MB object takes 6.3s. In either
case, the commit takes only about 0.5s (since cogito stores the object
during the cg-add).
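
Something like the following Python sketch (illustrative only; the
figures above came from C, so absolute times will differ) should
reproduce the zeros-versus-random comparison:

  # Compare compression time for 45MB of zeros vs. 45MB of random
  # data, as in the test above. Uses zlib's default level.
  import os
  import time
  import zlib

  SIZE = 45 * 1024 * 1024  # 45MB

  for name, payload in (("zeros", b"\0" * SIZE),
                        ("random", os.urandom(SIZE))):
      start = time.time()
      out = zlib.compress(payload)
      print("%s: %d -> %d bytes in %.1fs"
            % (name, len(payload), len(out), time.time() - start))

The interesting part is the ratio between the two runs, not the
absolute numbers.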

Is there some specific file pattern that is slow to compress?

-Peff
