Piotr Krukowiecki <piotr.krukowiecki@xxxxxxxxx> writes:

>> Packing time depends on a number of factors. One of them is the number
>> of unpacked objects to process. With 1.7 million objects, yes, it's
>> going to take some time.
>
> Any statistics how long it should take?

Packing time depends on the repository, your machine and how you pack,
so such statistics would be useful only in comparable contexts.

    linux-3.0/master$ time git repack -a -d
    Counting objects: 2138578, done.
    Delta compression using up to 4 threads.
    Compressing objects: 100% (327257/327257), done.
    Writing objects: 100% (2138578/2138578), done.
    Total 2138578 (delta 1791983), reused 2138009 (delta 1791434)

    real    1m40.528s
    user    1m22.805s
    sys     0m3.788s

    linux-3.0/master$ git count-objects -v
    count: 0
    size: 0
    in-pack: 2138578
    packs: 1
    size-pack: 487957
    prune-packable: 0
    garbage: 0

This is on my box [*1*], which is idle (other than running the repack).

The above starts from an already reasonably well packed state and
reuses existing deltas; with "-f" to recompute everything from scratch
it takes significantly longer:

    linux-3.0/master$ time git repack -a -d -f
    Counting objects: 2138578, done.
    Delta compression using up to 4 threads.
    Compressing objects: 100% (2118691/2118691), done.
    Writing objects: 100% (2138578/2138578), done.
    Total 2138578 (delta 1749156), reused 344219 (delta 0)

    real    3m26.750s
    user    8m41.857s
    sys     0m6.716s

A larger "window" tends to make the process take longer (I think it
grows roughly quadratically) but may reduce both the resulting pack
size and the runtime access overhead. A larger "depth" does not affect
the time to pack and helps reduce the resulting pack size, at the cost
of increased runtime access overhead (i.e. not really recommended).

[Footnote]

*1* http://gitster.livejournal.com/34818.html
    Intel(R) Core(TM)2 Quad CPU Q9450 @ 2.66GHz with 8GB memory.
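For anyone who wants to experiment with these knobs, here is a minimal
sketch of trying them on a throwaway repository. The window/depth values
are purely illustrative, not recommendations, and the demo repository is
of course far too small to show the timing effects discussed above:

```shell
# Create a throwaway repository with one commit to repack.
tmp=$(mktemp -d)
git init -q "$tmp"
cd "$tmp"
git config user.email you@example.com
git config user.name "You"
echo hello > file
git add file
git commit -qm init

# Repack from scratch ("-f"), overriding the delta search parameters
# for this one invocation. A wider window tries more delta candidates
# per object (slower pack, possibly smaller); a deeper chain shrinks
# the pack further but costs more on access. Values are illustrative.
git -c pack.window=250 -c pack.depth=50 repack -a -d -f -q

# All objects now live in a single pack file.
ls .git/objects/pack/*.pack | wc -l
```

The same settings can be made permanent with "git config pack.window"
and "git config pack.depth" instead of the one-shot "-c" overrides.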