On Fri, Feb 20, 2015 at 4:26 AM, Stephen Morton <stephen.c.morton@xxxxxxxxx> wrote:
> By 'performance', I guess I mean speed of day to day operations for devs.
>
> * (Obviously, trivially, a (non-local) clone will be slow with a large repo.)
> * Will a few simultaneous clones from the central server also slow down
>   other concurrent operations for other users?

There are no locks on the server side when cloning, so in theory
cloning does not affect other operations. Cloning can use a lot of
memory though (and a lot of CPU unless you turn on the reachability
bitmap feature, which you should).

> * Will 'git pull' be slow?

If we exclude the server side, the size of your tree is the main
factor, but your 25k files should be fine (Linux has 48k files).

> * 'git push'?

This one is not affected by how deep your repo's history is, or how
wide your tree is, so it should be quick.

Ah, the number of refs may affect both git-push and git-pull, though.
I think Stefan knows this area better than I do.

> * 'git commit'? (It is listed as slow in reference [3].)
> * 'git status'? (Slow again in reference 3 though I don't see it.)

(also git-add)

Again, the size of your tree. I'm trying to address the problems in
[3], but at your repo's size I don't think you need to worry about it.

> * Some operations might not seem to be day-to-day but if they are called
>   frequently by the web front-end to GitLab/Stash/GitHub etc then
>   they can become bottlenecks. (e.g. 'git branch --contains' seems terribly
>   adversely affected by large numbers of branches.)
> * Others?

git-blame could be slow when a file is modified a lot.
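
In case it helps, here is a minimal sketch of turning on the bitmap
feature on the central (bare) repository, plus a quick way to see how
many refs you advertise. This assumes a reasonably recent git on the
server; the repack.writeBitmaps config key replaced the older
pack.writeBitmaps name:

    # run inside the bare repository on the server
    git config repack.writeBitmaps true     # keep writing bitmaps on future repacks
    git repack -a -d --write-bitmap-index   # repack everything now and write the bitmap

    # rough count of the refs sent during push/pull negotiation
    git for-each-ref | wc -l

With the bitmap in place the server can answer the "Counting objects"
phase of a clone or fetch from the on-disk bitmap instead of walking
history, which is where most of the CPU goes.

--
Duy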