Re: [PATCH] fast-import: Stream very large blobs directly to pack

On Thu, 28 Jan 2010, Shawn O. Pearce wrote:

> If a blob is larger than the configured big-file-threshold, instead
> of reading it into a single buffer obtained from malloc, stream it
> onto the end of the current pack file.  Streaming the larger objects
> into the pack avoids the 4+ GiB memory footprint that occurs when
> fast-import is processing 2+ GiB blobs.

Yeah.  I've had that item on my todo list for ages now.  This 
big-file-threshold principle has to be applied to 'git add' too so a big 
blob is stored in pack file form right away, and the same threshold can 
then be used to bypass delta searching in 'git pack-objects', to skip 
the diff machinery, and so on.


Nicolas
