Re: [FYI] very large text files and their problems.

On Wed, Feb 22, 2012 at 10:49 PM, Ian Kumlien <pomac@xxxxxxxxx> wrote:
> Hi,
>
> We just saw an interesting issue: git compressed a ~3.4 GB project to ~57 MB.

How big are those files? How many of them? How often do they change?

> But when we tried to clone it on a big machine we got:
>
> fatal: Out of memory, malloc failed (tried to allocate
> 18446744072724798634 bytes)
>
> This is already fixed in the 1.7.10 mainline - but it also seems like

Does 1.7.9 have this problem?

> git needs to have at least the same amount of memory free as the largest
> file... Couldn't this be worked around?
>
> On a (32 bit) machine with 4GB memory - results in:
> fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
>
> (and I see how this could be a problem, but couldn't it be mitigated? Or
> is it by design and intended behaviour?)

I think that it's delta resolving that hogs all your memory. If your
files are smaller than 512M, try lowering core.bigFileThreshold. The
topic jc/split-blob, which stores a big file as several smaller
pieces, might solve your problem. Unfortunately the topic is not
complete yet.
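For reference, a minimal sketch of the core.bigFileThreshold suggestion above (the repository path here is just for illustration). Blobs larger than the threshold are stored whole rather than deltified, which reduces peak memory use during repack and clone at the cost of a larger pack:

```shell
# Create a throwaway repo for illustration.
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"

# Treat anything over 100 MiB as a "big file": such blobs are not
# delta-compressed, so git avoids holding large delta chains in memory.
git config core.bigFileThreshold 100m

# Verify the setting took effect.
git config core.bigFileThreshold   # prints: 100m
```

Pick a threshold just below the size of your largest frequently-changing files; setting it too low loses delta compression for everything above it.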
-- 
Duy
