On Sun, 2010-04-04 at 01:23 +0200, Frans Pop wrote:
> On Saturday 03 April 2010, Michael Witten wrote:
> > On Fri, Apr 2, 2010 at 16:05, Frans Pop <elendil@xxxxxxxxx> wrote:
> > > I haven't had the patience to let it finish
> >
> > There's your problem.
>
> Yes, I had seen that. But there's a difference between taking much more
> time and slowing down to such an extent that it never finishes.
>
> I've tried it today on my linux-2.6 repo as well and the same thing
> happened. At first the progress is not fast but reasonable. When it gets
> to about 45% it starts slowing down a lot: from ~1500 objects per
> update of the counters to ~300 objects per update. And who knows what the
> progress is going to be when it reaches 70% or 90%: 10 per update?
>
> With a total of over 2 million objects in the repository such a low speed
> is simply not going to work, ever. So I maintain that it is effectively
> unusable.

As a data point, when I do gc, I routinely use --aggressive. It takes a
while here, but not forever. (I'm a tad short of 2 million objects.)
Repo is mainline + next + tip + stable >= 2.6.22 + local branches.

git@marge:..git/linux-2.6> time git gc --aggressive
Counting objects: 1909894, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (1889774/1889774), done.
Writing objects: 100% (1909894/1909894), done.
Total 1909894 (delta 1674098), reused 0 (delta 0)

real    22m24.943s
user    55m33.756s
sys     0m8.149s

git is 1.7.0.3

	-Mike
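For anyone who wants to collect the same kind of numbers on their own
repository, a sketch of the measurement is below. It runs against a
throwaway repo created in a temp directory so it is self-contained;
for figures comparable to the ones above you would point it at a real
clone of linux-2.6 instead. All commands here (git count-objects,
git gc --aggressive) are standard git; the paths are examples only.

```shell
# Create a throwaway repo so the sketch runs anywhere.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo hello > file.txt
git add file.txt
git -c user.name=test -c user.email=test@example.com commit -q -m init

# How many objects are we dealing with? (loose and packed counts)
git count-objects -v

# Time the aggressive gc, as in the transcript above.
time git gc --aggressive --quiet

# Afterwards everything should sit in a single pack.
git count-objects -v
```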