Re: Errors cloning large repo

On Sat, 10 Mar 2007, Martin Waitz wrote:
> 
> On Sat, Mar 10, 2007 at 01:01:44AM -0500, Shawn O. Pearce wrote:
> > It's very likely this did fit in just under 4 GiB of packed data,
> > but as you said, without O_LARGEFILE we can't work with it.
> 
> but newer git versions can cope with it:
> 
> -r--r--r-- 1 martin martin 3847536413 18. Feb 10:36 pack-ffe867679d673ea5fbfa598b28aca1e58528b8cd.pack

Are you sure you're not just running a 64-bit process?

64-bit processes don't need O_LARGEFILE to process files larger than 2GB, 
since for them, off_t is already 64-bit.

Grepping for O_LARGEFILE shows nothing.

Oh, except we have that 

	#define _FILE_OFFSET_BITS 64

which is just a horrible hack. That's nasty. We should just use 
O_LARGEFILE rather than depend on some internal glibc thing that works 
nowhere else.

		Linus

