On Tue, Aug 6, 2013 at 9:38 AM, Ramkumar Ramachandra <artagnon@xxxxxxxxx> wrote:
> +	Garbage collect using a pseudo logarithmic packfile maintenance
> +	approach.  This approach attempts to minimize packfile churn
> +	by keeping several generations of varying sized packfiles around
> +	and only consolidating packfiles (or loose objects) which are
> +	either new packfiles, or packfiles close to the same size as
> +	another packfile.

I wonder if a simpler approach may be nearly as efficient as this one:
keep the largest pack out and repack the rest at fetch/push time so
there are at most 2 packs at a time. Or we could do the repack at
'gc --auto' time, but with a lower pack threshold (about 10 or so).
When the second pack is as big as, say, half the size of the first,
merge them into one at "gc --auto" time. This can be easily
implemented in git-repack.sh.
--
Duy
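
A rough sketch of the consolidation check described above, in shell.
The 50% ratio, the temporary .keep marker and the exact repack flags
are illustrative assumptions, not existing git-repack.sh behaviour:

  #!/bin/sh
  # Sketch: merge the two biggest packs once the second-largest pack
  # has grown to at least half the size of the largest; otherwise
  # repack everything except the largest pack.

  packdir=$(git rev-parse --git-dir)/objects/pack

  # List packs sorted by size, largest first (ls -S).
  packs=$(ls -S "$packdir"/pack-*.pack 2>/dev/null)
  largest=$(printf '%s\n' "$packs" | sed -n 1p)
  second=$(printf '%s\n' "$packs" | sed -n 2p)

  # Zero or one pack: nothing to consolidate.
  test -n "$second" || exit 0

  largest_size=$(wc -c <"$largest")
  second_size=$(wc -c <"$second")

  if test $((second_size * 2)) -ge $((largest_size))
  then
  	# The second pack caught up: merge everything into one pack.
  	git repack -a -d -q
  else
  	# Temporarily mark the largest pack with a .keep file so that
  	# "repack -A -d" leaves it alone (this sketch assumes no .keep
  	# file already exists for it), then consolidate the remaining
  	# packs and loose objects into a single pack.
  	keep=${largest%.pack}.keep
  	touch "$keep"
  	git repack -A -d -q
  	rm -f "$keep"
  fi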