Shawn O. Pearce wrote:
Some workflows encourage users to create loose objects in their local repository and then upload them to a central location by way of git-push. During the git-push operation the end user expects network latency to be the dominant cost, and we are very likely to be packing mostly loose objects for transport, since the user is typically pushing recent work that is still stored only as loose objects.

By saving the packfile we transfer over the network to a local file, we can remove the corresponding loose objects from the objects directory and immediately benefit from the packing work that was already done for the network transport. This is a form of `git gc --auto` that happens automatically any time the user performs a push.
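For illustration, here is a rough shell sketch of the bookkeeping this would automate; the remote and branch names are placeholders, and today the cleanup step still has to be run by hand:

    git count-objects -v      # "count" = loose objects, "in-pack" = packed objects
    git push origin master    # with the patch, the pack built for transport would also be kept locally
    git gc --auto             # without it, an (auto) gc is needed to get a similar cleanup
    git count-objects -v      # loose count drops, in-pack count grows
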
I'm sure this *could* be a good idea, but in my typical workflow I push about twice a day, and usually not more than 5-8 commits at a time. Since I'm a great believer in isolated changes, this usually means changes to one or two files at a time. Is it worth the trouble of saving 15-25 loose objects and creating two new packfiles per day? OTOH, I also rebase every once in a while, moving 50+ commits to some other branch, and then it would most definitely be worth it.
+		save_pack = 0;
+	else if (progress)
+		fprintf(stderr, "Also keeping saving packfile...\n");
keeping or saving? Pick one :)

--
Andreas Ericsson                   andreas.ericsson@xxxxxx
OP5 AB                             www.op5.se
Tel: +46 8-230225                  Fax: +46 8-230231