Re: how to keep git-fetch from running out of memory?

On Sat, May 14, 2011 at 22:24, Kartik Agaram <ak@xxxxxxxxxxxx> wrote:
> I have a git repo with some large files that I'm no longer able to
> update. git fetch keeps running out of memory:
>
>  fatal: Out of memory, malloc failed
>  fatal: unpack-objects died with error code 128
>
> Anybody know how to keep it from compressing the refs into packfiles?
> I've experimented with core.compression, pack.compression,
> pack.windowMemory, pack.packSizeLimit, all without luck :(

Instead of playing with those settings, try setting transfer.unpackLimit
to 1. That forces the receiving side to use index-pack rather than
unpack-objects, which has a different memory profile.
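For example, something along these lines on the fetching side should do
it (add --global if you want the setting for every repository, not just
this one):

  $ git config transfer.unpackLimit 1
  $ git fetch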

However, that may still be insufficient. A big object must still be
held entirely in memory in order to compute its SHA-1. If the process
is hitting a resource limit rather than truly exhausting memory, raise
your ulimits; otherwise reconfigure your system (e.g. add swap) so more
virtual memory is available to the process.
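If it is a ulimit, something like the following before re-running the
fetch may help (the soft limit can be raised up to the hard limit; going
beyond that typically requires root):

  $ ulimit -v              # show the current virtual memory limit (KiB)
  $ ulimit -v unlimited    # lift the limit for this shell, if allowed
  $ git fetch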

-- 
Shawn.