I'm not subscribed, please Cc me on replies.

My company is considering switching from CVS to a more branch-friendly version-control tool, so of course we've been playing with git. We imported our CVS repository into git with git-cvsimport, which worked well enough and resulted in a tree about the same size as the official kernel repository: 454121 objects, 334977 deltas.

However, operations like 'git-fetch' take much, much longer in our repository than in the kernel repository: a git-fetch that pulls no updates takes 1.7s in the kernel repository, while the same no-op fetch in ours (fetching from one repository to a clone on the same local disk) takes about 20 seconds.

After some experimentation, we discovered that deleting all 5557 imported CVS tags made things fast again. (Interestingly, while the tags were still around, "git-fetch --no-tags" was not appreciably quicker.)

I searched the mailing list archives for similar problems, and the closest thread I could find was this one:

http://thread.gmane.org/gmane.comp.version-control.git/20682/

...however, that thread seems to have concluded that large numbers of binary files were the problem, which is not the case in our repository.

Does git have known scalability problems with large numbers of tags? Is there anything we can do to mitigate this slowdown, apart from not using git's tag feature at all? Are there any details I've overlooked or misunderstood?

Tim Allen
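
In case it helps anyone reproduce the workaround: a sketch of how the tag deletion can be done in bulk (run inside the clone; the exact commands here are an approximation of what we did, not the import itself). Piping into 'git update-ref --stdin' deletes all refs in one batch instead of forking 'git tag -d' once per tag, which matters with 5557 of them:

```shell
# List every tag ref as a "delete <refname>" instruction and feed the
# whole batch to update-ref, which removes them in a single process.
git for-each-ref --format='delete %(refname)' refs/tags |
    git update-ref --stdin

# Then re-time the no-op fetch ("origin" is whatever your remote is named):
time git fetch origin
```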