Re: Large repo and pack.packsizelimit

Nguyen Thai Ngoc Duy <pclouds <at> gmail.com> writes:

> 
> On Wed, May 9, 2012 at 4:36 PM, Thomas <th.acker66 <at> arcor.de> wrote:
> > To be exact, I did the clone locally on the same machine, so the clone
> > itself worked, but I got the OOM during the first fetch. I "fixed" this
> > by setting transfer.unpackLimit=100000, which caused only loose objects
> > to be transferred.
> > So in this case I think the OOM was on the remote side. But there is
> > another OOM if I try to repack locally.
> > It seems to me that neither pack-objects nor index-pack respects
> > pack.packSizeLimit: both always try to pack all objects to be
> > transferred (or all local loose objects) into one pack.
> > I could live with transfer.unpackLimit=100000, but the local OOM stops
> > me from using the cloned repo.
> 
> I have some patches to make index-pack work better with large blobs,
> but they're not ready yet. I think pack-objects works fine with large
> blobs as long as they are all in packs. Are there any loose objects in
> the source repo?
> 
> It's strange that you chose "256mb" as the upper limit for small
> objects in your first mail. Do you have a lot of >=10mb files? By
> default, files smaller than 512mb will be put in memory for delta
> compression. A lot of big (but smaller than 512mb) files can quickly
> consume all memory. If that's the case, maybe you can lower
> core.bigFileThreshold.
> 
> Also, maybe try removing the 1.2GB file from the source repo and see
> if it works better. That could give us some hints about where the
> problem is.
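Duy's two suggestions above (check for loose objects, and try lowering core.bigFileThreshold) can be sketched as the following commands; the repository path is hypothetical, and 100m is only an illustrative value below the 256MB already in use, not a recommendation from the thread:

```shell
# Assumed: run inside the source repository (path is hypothetical).
cd /path/to/source-repo

# Show loose objects ("count:" line) vs. packed objects ("in-pack:" line).
git count-objects -v

# Lower the threshold above which Git treats a file as "big" and skips
# in-core delta handling; 100m here is just an illustrative value.
git config core.bigFileThreshold 100m
```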

I am using core.bigFileThreshold=256MB already, so the large file(s) should
not be the problem (most of the files in the repo are "standard" source code
files; I tried even smaller values for bigFileThreshold and packSizeLimit,
but with no success).
As long as I worked with the original repo, which was updated regularly,
everything worked well once pack.packSizeLimit was set to 1024MB (even with
the 1.2GB file). Repack seems not to grow a pack further once packSizeLimit
is exceeded (so my packs are all slightly larger than 1024MB), BUT it also
seems to try to put everything into one pack, regardless of packSizeLimit,
in the following cases:
(1) all objects to be transferred to another repo
(2) all loose objects when starting a local repack
Case (1) can be fixed by transfer.unpackLimit, but there is no fix for (2).
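For reference, the two settings discussed in this thread, written as git
config commands; the values are the ones Thomas reports using, not general
recommendations:

```shell
# Values taken from this thread, not recommendations.
git config pack.packSizeLimit 1024m      # repack stops growing a pack once it exceeds ~1 GB
git config transfer.unpackLimit 100000   # incoming packs with fewer objects than this
                                         # are exploded into loose objects on fetch
```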
---
Thomas

