Re: low memory system to clone larger repo

On Thu, Jan 8, 2015 at 11:10 PM, matthew sporleder <msporleder@xxxxxxxxx> wrote:
> I am attempting to clone this repo: https://github.com/jsonn/src/

This repo has 3.4M objects. Basic bookkeeping alone would cost about
200MB (in practice it will be higher, because that estimate assumes no
deltas). On my 64-bit system, index-pack already uses 400+ MB at the
beginning of the delta resolving phase, and about 500MB during it.
32-bit systems cost less, but I doubt we could keep it within a 256MB
limit. I think you simply need a more powerful machine for a repo this
size.
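
(A rough sketch of where the 200MB figure comes from; the ~60 bytes of
per-object bookkeeping is my assumption, not a measured struct size:

  3.4M objects * ~60 bytes each = 3,400,000 * 60 =~ 204MB

and delta resolution needs working memory on top of that.)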

They also have some large files (udivmodti4_test.c at 16MB, MD5SUMS at
6MB, ...). These giant files can make index-pack use even more memory,
especially if they are deltified. If you repack the repo with
core.bigFileThreshold set to about 1-2MB and then clone it, you may see
lower memory consumption, but at the cost of bigger packs.
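
Something like this might work (an untested sketch; the 1m value and
the local path are just examples):

  # Repack on a machine with enough memory. -f recomputes deltas,
  # so blobs above the threshold end up stored undeltified.
  git -c core.bigFileThreshold=1m repack -a -d -f

  # Then clone from the repacked copy on the low-memory box.
  git clone file:///path/to/src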
-- 
Duy