Nicolas Pitre wrote:
>> Well that depends how many loose objects there are :)  I heard of a
>> case on Windows where packing 30k loose objects took over an hour.
>
> And your patch cannot change anything to that, right?
> You shouldn't wait until 30k loose objects accumulate before repacking.

Absolutely.  But the generational approach would make it practical to
repack very often, in fact automatically, and keep the time of each
repack reasonably bounded.  As I said in the original patch, the
intention is that this style of repacking works best when applied
automatically, perhaps on commit (a rough sketch of what that might
look like follows at the end of this message).

>> If you repack every 100 objects without -a, sure it will be fast, but
>> you'll end up with too many packs.
>
> You just need to adjust this threshold of 100 objects.
> And latest GIT behaves _much_ better with lots of packs, almost as if
> there were only one pack.  See the test results I posted to the list.

The test run with 66 packs?  What about a repository with 300000
objects that was repacked every 100 new objects?  Would it work well
with 3000 packs?  Surely there is some value in consolidating older
packs, even though your LRU patch might make the overhead of many packs
very small (which I do find impressive).  Do you really need me to
prove this with a benchmark?

You have already said to "adjust the threshold", but the problem is:
adjust it to what?  I want the automatic repack to be *fast*, so making
the threshold larger makes each repack slower; but making it too small
leaves an excessive number of packs.

Sam.
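P.S.  To make the "repack on commit" idea a little more concrete, here
is a rough sketch of the kind of hook I have in mind.  It is
illustrative only: the threshold of 100 is arbitrary, and it assumes
nothing beyond the stock git-count-objects and git-repack commands.

    #!/bin/sh
    # .git/hooks/post-commit (illustrative sketch only)
    #
    # Incrementally repack once more than 100 loose objects have piled
    # up.  "git repack -d" without -a packs only loose objects, so the
    # cost is bounded by the number of new objects since the last
    # repack, not by the size of the whole repository.
    loose=$(git count-objects | sed 's/ .*//')
    if [ "$loose" -gt 100 ]; then
        git repack -d
    fi

Existing packs are left alone by this hook; consolidating them into
larger "generations" is the part I want handled separately, and far
less often.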