Re: Git chokes on large file

On Wed, 28 May 2014, Dale R. Worley wrote:

> From: Duy Nguyen <pclouds@xxxxxxxxx>
>
>> I don't know how many commands are hit by this. If you have time and
>> gdb, please put a breakpoint in the die_builtin() function and send
>> backtraces for those that fail. You could speed up the process by
>> creating a smaller file and setting the environment variable
>> GIT_ALLOC_LIMIT (in kilobytes) to a number lower than that size. If
>> git attempts to allocate a block larger than that limit, it'll die.
>
> I don't use Git enough to exercise it well.  And there are dozens of
> commands with hundreds of options.
>
> As someone else has noted, if I run 'git commit -q --no-status', it
> doesn't crash.
>
> It seems that much of Git was coded under the assumption that any file
> could always be held entirely in RAM.  Who made that mistake?  Are
> people so out of touch with reality?
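
For what it's worth, a rough, untested sketch of the kind of session Duy
is asking for; the file size, the ~200MB limit and the commit message are
just placeholders:

    # create a test file somewhat larger than the limit we'll set below
    dd if=/dev/zero of=bigfile bs=1M count=300
    git add bigfile

    # cap git's allocations below the file size; GIT_ALLOC_LIMIT is in kilobytes
    export GIT_ALLOC_LIMIT=204800        # ~200MB

    # re-run the failing command under gdb and grab the backtrace where git dies
    gdb --args git commit -m 'large file test'
    (gdb) break die_builtin
    (gdb) run
    (gdb) backtrace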

Git was designed to track source code; there are warts in the implementation that show up when you use individual files >4GB.

Such files also tend not to diff well. git-annex and other offshoots have methods bolted on that handle such large files better than core git does.
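
Roughly, the git-annex route looks like this (the file name is just a
placeholder); the large content ends up under .git/annex and git itself
only tracks a small symlink:

    git annex init                    # one-time setup in an existing repo
    git annex add big-video.iso       # moves the content into .git/annex/objects,
                                      # leaves a symlink for git to track
    git commit -m 'add large file via git-annex'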

David Lang
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html



