Joshua Redstone <joshua.redstone@xxxxxx> writes:

> I've managed to speed up git-commit on large repos by 4x by removing
> some safeguards that caused git to stat every file in the repo on
> commits that touch a small number of files.  The diff, for
> illustrative purposes only, is at:
>
> https://gist.github.com/1499621
>
> On a repo with 1 million files (but few commits), the diff drops the
> commit time from 7.3 seconds to 1.8 seconds, a 75% decrease.  The
> optimizations are:

I do not know if these kinds of changes are called "optimizations" or
merely making the command record a random tree object that may have
some resemblance to what you wanted to commit but is subtly incorrect.
I didn't check your safety removal closely, though.

Wouldn't you get a similar speed-up without being unsafe if you simply
ran "git commit" without any parameter (i.e. write out the current
index as a tree and make a commit), combined with "--no-status" and
perhaps "-q" to avoid running the comparison between the resulting
commit and the working tree state after the commit?
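
For illustration, a minimal sketch of the invocation being suggested;
the path and commit message are hypothetical, and "--no-status" only
affects the status listing in the commit-message template, so it
matters mainly when an editor is launched for the message:

    # Stage only the files that actually changed, then commit the
    # index as-is.  "-q" suppresses the post-commit summary output;
    # "--no-status" drops the working-tree status from the template.
    $ git add path/to/changed-file
    $ time git commit -q --no-status -m "tweak one file"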