Re: [PATCH v2] fast-import: Stream very large blobs directly to pack

Jakub Narebski <jnareb@xxxxxxxxx> wrote:
> "Shawn O. Pearce" <spearce@xxxxxxxxxxx> writes:
> 
> >  Documentation/git-fast-import.txt |    6 ++
> >  fast-import.c                     |  172 +++++++++++++++++++++++++++++++++----
> >  t/t5705-clone-2gb.sh              |    2 +-
> >  t/t9300-fast-import.sh            |   46 ++++++++++
> >  4 files changed, 208 insertions(+), 18 deletions(-)
> 
> > +--big-file-threshold=<n>::
> > +	Maximum size of a blob that fast-import will attempt to
> > +	create a delta for, expressed in MiB.  The default is 512.
> > +	Some importers may wish to lower this on systems with
> > +	constrained memory.
> > +
> 
> Shouldn't there be config variable corresponding to this command line
> option, so that it can be set once for [constrained] system?

Implemented as core.bigFileThreshold in this patch... but I didn't
document it...
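
For the archives, a minimal sketch of what setting that configuration key would look like, assuming the `core.bigFileThreshold` name from this patch and the MiB unit documented for `--big-file-threshold` (the repository path here is just an illustration):

```shell
# Create a throwaway repository and set the threshold once in its
# config, so fast-import picks it up without the command-line flag.
repo=$(mktemp -d)
git init -q "$repo"

# Lower the blob-delta threshold to 128 (MiB per the patch's docs)
# for a memory-constrained system.
git -C "$repo" config core.bigFileThreshold 128

# Read it back to confirm.
git -C "$repo" config core.bigFileThreshold   # prints 128
```

The equivalent one-shot form would be `git fast-import --big-file-threshold=128 <dump.fi`; the config key just makes the choice persistent per repository.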

-- 
Shawn.
