On Mon, 5 Jun 2006, Nicolas Pitre wrote:
>
> In other words, the pack shrunk to less than half the size of the
> previous one!

Ok, that's a bit more extreme than expected. It's obviously great news, and
says that the approach of sorting by "reversed name" is a great heuristic,
but at the same time it makes me worry a bit that this thing that is
supposed to be a heuristic ends up being _so_ important from a pack size
standpoint. I was happier when it was more about saving a couple of
percent.

Now, your repo may be a strange case, and it just happens to fit the
suggested hash, but on the other hand it's nice to see three totally
different repositories that all improve, albeit with wildly different
numbers.

I'm wondering if we could have some "incremental optimizer" thing that
would take a potentially badly packed archive, and just start looking for
better delta chain possibilities? That way we would still try to get a
good initial pack with some heuristic, but we could have people run the
incremental improver every once in a while, looking for good deltas that
it missed due to the project not fitting the heuristics..

The fact that we normally do incremental repacking (and "-f" is unusual)
is obviously one thing that makes us less susceptible to bad patterns (and
is also what allows us to run the incremental optimizer - any good delta
choice will automatically percolate into subsequent versions, including
packs that have been cloned).

So the packing strategy itself seems to be very stable (and partly _due_
to the "optimization" to re-use earlier pack choices), but we currently
lack the thing that fixes up any initial bad assumptions in case they
happen.

		Linus
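[For readers outside the thread: the "reversed name" sort being discussed orders objects by their path read back-to-front, so every revision of a same-named file (say, a Makefile in any directory) ends up adjacent and gets tried as a delta candidate. A minimal sketch of the idea, not git's actual pack-objects code:]

```python
# Sketch of the "reversed name" packing heuristic discussed above.
# Sorting paths by the string read backwards clusters objects with the
# same basename (e.g. every "Makefile") next to each other, which is
# exactly what makes them good delta candidates for each other.
# This is an illustration only, not git's real implementation.

def reversed_name_key(path: str) -> str:
    """Sort key: the path string reversed, so the basename compares first."""
    return path[::-1]

paths = [
    "Documentation/Makefile",
    "t/Makefile",
    "Makefile",
    "Documentation/git.txt",
    "builtin/pack-objects.c",
]

ordered = sorted(paths, key=reversed_name_key)
# The three Makefiles land in one contiguous run regardless of which
# directory they live in.
```

With a plain lexical sort the Makefiles would be scattered across their directories; the reversed key is what pulls same-named files together.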