Large repo and pack.packsizelimit

Hello,

I am using msysGit 1.7.9 on Windows XP 32-bit and have a very large repo (10GB in .git; 20GB in the source tree).
I had to set pack.packSizeLimit=1024MB to prevent "out of memory" errors during repacking in git-gc,
and everything seemed to work fine.
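
For reference, the limit was set via git config, roughly like this (1g being
the same 1024MB limit mentioned above):

    # cap the size of packs written by git-repack / git-gc
    git config pack.packSizeLimit 1g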

When I tried to clone this repo, an "out of memory" error occurred because the packs transferred
over the git protocol are not limited by pack.packSizeLimit. I "fixed" this by setting transfer.unpackLimit=100000,
so the received objects are exploded into loose objects instead of being stored as one huge pack. This is very slow, but it works.
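
The workaround was roughly the following, set before cloning (e.g. in the
global config, since the new repo's own config does not exist yet):

    # explode fetched objects into loose objects instead of keeping the incoming pack
    git config --global transfer.unpackLimit 100000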

In this cloned repo, git-gc again causes "out of memory" because it tries to pack all loose
objects in one go, seemingly not respecting pack.packSizeLimit.
(Setting --window-memory=512m for git-repack did not help here.)
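
What I tried in the clone was essentially this (it still ran out of memory
while packing):

    # repack everything, limiting memory used for the delta search window
    git repack -a -d --window-memory=512m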

Am I doing anything wrong here, or is this a bug/feature in git?

BTW1: The repo is very large but contains only one really large file (1.2GB); all other files are smaller than 256MB.
BTW2: I cannot use 1.7.10 due to the HTTP authorization bug.


---
Thomas