Drew Northup <drew.northup@xxxxxxxxx> writes:

>> That's why. See gc.autopacklimit in "git help config" -- by default,
>> git will gc if there are more than 50 pack files.
>
> Do we want to consider ignoring (or automatically doubling, or something
> like that) gc.autopacklimit if that number of packs meets or exceeds
> gc.packSizeLimit? I have no idea what the patch for this might look
> like, but it seems to make more sense than this situation.

This is unrelated to the auto-gc, but it would also be fruitful to
question whether it is a sane setting to limit packfiles to 30M when the
repository needs 100 of them (totalling around 3G??).

Just like having too many loose object files degrades performance (and
that is one of the reasons we pack them in the first place), having many
packs degrades performance unnecessarily, and to a worse degree: the
"check which pack has this particular object" code has to examine all
packs, unlike the loose object case, where we let the .git/objects/??
fan-out give us some hashing and the filesystem do the heavy lifting
for us.
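One way out of the situation above, assuming the 30M cap comes from
pack.packSizeLimit (the pack.* spelling of the limit) in the
repository's configuration, is to drop the cap and repack everything
into as few packs as possible. A sketch, demonstrated in a throwaway
repository rather than a real one:

```shell
# Hedged sketch: the 30m value and the throwaway repo are illustrative.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" config pack.packSizeLimit 30m      # the kind of small cap under discussion
git -C "$repo" config --unset pack.packSizeLimit  # drop the cap...
git -C "$repo" repack -a -d                       # ...and repack into as few packs as possible
git -C "$repo" count-objects -v | grep '^packs:'  # reports the resulting pack count
```

In a real repository you would run the last three commands inside it
directly; with the cap gone, "git repack -a -d" consolidates the ~100
small packs into one.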
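The cost asymmetry described above can be sketched in a few lines.
This is not git's actual code, just a hypothetical model: each pack's
sorted .idx is modeled as a sorted list we binary-search, so with P
packs a lookup may touch all P of them, whereas a loose object's path
is computed directly from the first two hex digits of its id and the
filesystem does the rest.

```python
import bisect

def find_in_packs(oid, pack_indexes):
    """Model of looking up an object spread across many packs: each
    pack index is binary-searched in turn, so the worst case is
    O(P log n) over P packs -- the cost grows with the pack count."""
    for idx in pack_indexes:
        pos = bisect.bisect_left(idx, oid)
        if pos < len(idx) and idx[pos] == oid:
            return idx
    return None

def loose_path(oid):
    """Model of the loose-object case: the first two hex digits are a
    directory fan-out, e.g. .git/objects/ab/cdef..., so no scan over
    packs is needed at all."""
    return f".git/objects/{oid[:2]}/{oid[2:]}"

# 50 packs of 100 fake object ids each; the sought id sits in the last
# pack, so all 50 indexes get searched before it is found.
packs = [sorted(f"{i:040x}" for i in range(p * 100, (p + 1) * 100))
         for p in range(50)]
oid = f"{4999:040x}"
print(find_in_packs(oid, packs) is packs[-1])
print(loose_path("ab" + "0" * 38))
```

The loop over pack_indexes is exactly the part that a single
consolidated pack (or fewer packs) shrinks.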