Re: "malloc failed"

On Wed, Jan 28, 2009 at 07:06:28PM -0500, David Abrahams wrote:

> Well, moving the 2.6G .dar backup binary out of the fileset seems to
> have helped a little, not surprisingly :-P

Ok, that _is_ big. ;) I wouldn't be surprised if there is some corner of
the code that barfs on a single object that doesn't fit in a signed
32-bit integer; I don't think we have any test coverage for stuff that
big.
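
To be concrete about the overflow case, here's a toy program (my own
illustration, not git code) showing what happens when a ~2.6G size gets
stuffed into a signed 32-bit int:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
        /*
         * Hypothetical illustration, not git's actual code: a size of
         * roughly 2.6 GB is larger than INT32_MAX, so storing it in a
         * signed 32-bit integer typically wraps to a negative value
         * (strictly the out-of-range conversion is implementation-
         * defined, but this is what common platforms do).
         */
        uint64_t actual = 2791728742ULL;        /* ~2.6 GB */
        int32_t stored = (int32_t)actual;

        printf("actual size: %llu\n", (unsigned long long)actual);
        printf("as int32_t:  %d\n", stored);    /* negative here */
        return 0;
}

Any code path that trusts a value like that will do something bogus long
before it gets anywhere near your 8G of RAM.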

But it may also just be that we are going to try malloc'ing 2.6G, and
that's making some system limit unhappy.
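
On the system-limit theory, a standalone check like the sketch below
(again just my illustration, not git code; the size constant is a
stand-in for the .dar file) will show whether a plain malloc of that
size succeeds outside of git, and what the relevant rlimits are:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>

static void show(const char *name, int which)
{
        struct rlimit r;

        if (getrlimit(which, &r) == 0)
                printf("%s: soft=%llu hard=%llu\n", name,
                       (unsigned long long)r.rlim_cur,
                       (unsigned long long)r.rlim_max);
}

int main(void)
{
        /* Stand-in for the ~2.6 GB .dar file mentioned above. */
        size_t want = (size_t)2791728742ULL;
        void *p;

        /* Limits that can make a big malloc fail with RAM to spare. */
        show("RLIMIT_AS", RLIMIT_AS);
        show("RLIMIT_DATA", RLIMIT_DATA);

        p = malloc(want);
        if (!p)
                printf("malloc(%zu) failed: %s\n", want, strerror(errno));
        else
                printf("malloc(%zu) succeeded\n", want);
        free(p);
        return 0;
}

If the soft limits come back smaller than the allocation, that would
line up with malloc failing while most of your memory is still free;
"ulimit -a" in the shell that launches git shows the same numbers.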

> I don't know whether anyone on this list should care about that failure
> given the level of abuse I'm inflicting on Git, but keep in mind that
> the system *does* have 8G of memory.  Conclude what you will from that,
> I suppose!

Well, I think you said before that you were never getting close to using
up all your memory, which implies you are hitting some system limit
rather than running out of physical memory.

-Peff
