Linus Torvalds, Thu, Jan 11, 2007 23:28:21 +0100:
> > It must have been a large leak, as I have really seen the memory usage
> > dropping down significantly.
>
> I really think it was about 6MB (or whatever your index file size was) per
> every single resolved file. I think merge-recursive used to flush the
> index file every time it resolved something, and every flush would
> basically leak the whole buffer used to write the index.

Looks like it. The resulting merge has only about 10 files changed; all the
other files are the same, they just arrived by different routes on the
branches being merged.

> Anyway, 40-50 sec on a fairly weak laptop for a 44k-file merge sounds like
> git doesn't have to be totally embarrassed. I'm not saying we shouldn't be
> able to do it faster, but it's at least _possible_ that a lot of the time
> spent is now spent doing real work (ie maybe you actually have a fair
> amount of file-level merging? Maybe it's 40-50 sec because there's some
> amount of real IO going on, and a fair amount of real work done too?)

Some work, definitely:

    git diff-tree branch1 branch2

lists 9042 differences, among them some on the same files.