On Thu, Feb 19, 2015 at 3:06 PM, Stephen Morton <stephen.c.morton@xxxxxxxxx> wrote:
>
> I think I addressed most of this in my original post with the paragraph
>
> "Assume ridiculous numbers. Let me exaggerate: say 1 million commits,
> 15 GB repo, 50k tags, 1,000 branches. (Due to historical code fixups,
> another 5,000 "fix-up branches" which are just one little dangling
> commit required to change the code a little bit between a commit and a
> tag that was not quite made from it.)"
>
> To that I'd add 25k files, no major rewrites, and no huge binary
> files, but lots of few-MB binary files with many revisions.
>
> But even without details of my specific concerns, I thought that
> perhaps the git developers know what limits git's performance even if
> large projects like the kernel are not hitting those limits.
>
> Steve

I did not realize you gave numbers below, as I started answering after
reading only the first paragraphs. Sorry about that.

I think lots of files in the small-MB range, organized in a
hierarchical fashion, are not a huge deal. History is also a
non-issue.

The problem arises when you have lots of branches.

    "640 git branches ought to be enough for everybody" -- Linus

(just kidding)

Git doesn't really scale efficiently with lots of branches. (That is
second-hand information, except for fetch/pull, where I recently did
some patches on another topic.)

Thanks,
Stefan
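
P.S.: If you want a rough feel for the ref-scaling cost yourself, here
is a quick, unscientific sketch (the 50k branch count and the repo
name are arbitrary choices for illustration; the loose-vs-packed refs
distinction is the main variable to watch):

    # scratch repo with one commit and 50k branches pointing at it
    git init refs-test && cd refs-test
    git commit --allow-empty -m "seed"
    for i in $(seq 1 50000); do
        git update-ref "refs/heads/b-$i" HEAD
    done
    time git for-each-ref | wc -l    # enumerate 50k loose refs
    time git ls-remote . >/dev/null  # roughly what a fetch has to advertise
    git pack-refs --all --prune      # collapse loose refs into packed-refs
    time git for-each-ref | wc -l    # same enumeration, refs now packed

Comparing the timings before and after pack-refs gives a first
approximation of how much of the cost is per-ref file access rather
than ref handling itself.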