Re: 16 gig, 350,000 file repository

On Mon, 22 Feb 2010, Bill Lear wrote:

> On Thursday, February 18, 2010 at 15:58:42 (-0500) Nicolas Pitre writes:
> >On Thu, 18 Feb 2010, Bill Lear wrote:
> >
> >> I'm starting a new, large project and would like a quick bit of advice.
> >> 
> >> Bringing in a set of test cases and other files from a ClearCase
> >> repository resulted in a 350,000 file git repo of about 16 gigabytes.
> >> 
> >> The time to clone over a fast network was about 250 minutes.  I could
> >> not verify if the repo had been packed properly, etc.
> >
> >I'd start from there.  If you didn't do a 'git gc --aggressive' after 
> >the import then it is quite likely that your repo isn't well packed.
> >
> >Of course you'll need a big machine to repack this.  But that should be 
> >needed only once.
> 
> Ok, well they have a "big machine", but not big enough.  It's running
> out of memory on the gc.  I believe they have a fair amount of memory:
> 
> % free
>              total       used       free     shared    buffers     cached
> Mem:      16629680   16051444     578236          0      28332   14385948
> -/+ buffers/cache:    1637164   14992516
> Swap:      8289500       1704    8287796
> 
> and they are using git 1.6.6.

Hmmm. OK.

You might try:

	git repack -a -f -d --depth=200 --window=100 --window-memory=1g
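
If that one-shot repack still blows past available memory, a rough
alternative sketch (using the standard pack.* configuration keys; the
values below are only illustrative, not something tested on this repo)
is to persist the limits in the repository config so later repacks
inherit them.  The window memory limit is applied per thread, so also
capping pack.threads keeps the total bounded on a multi-core box, at
the cost of a slower repack:

	# persist the repack limits in the repository config
	git config pack.depth 200
	git config pack.window 100
	git config pack.windowMemory 1g
	# window memory is per thread; a single thread keeps the total bounded
	git config pack.threads 1

	# then repack as before
	git repack -a -f -d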


Nicolas