I have recently added a new Git speed benchmark, from Bryan Murdock's blog. The repository is a bit atypical:

<quote>
By performance, I mean that I used the UNIX time command to see how long various basic operations took. Performing the various basic operations gave me some insight into the usability of each as well. For this test I used a directory with 266 MB of files, 258 KB of which were text files, with the rest being image files. I know, kind of weird to version all those binary files, but that was the project I was interested in testing this out on. Your mileage may vary and all that. Here's a table summarizing the real times reported by time(1):
</quote>

If I remember correctly, there were some patches to git which tried to deal better with large blobs. In this simple benchmark git was slightly outperformed by Mercurial and even by Bazaar-NG.

http://git.or.cz/gitwiki/GitBenchmarks#head-5657b8361895b5a02c0de39337c410e4d8dcdbce
http://bryan-murdock.blogspot.com/2007/03/cutting-edge-revision-control.html

-- 
Jakub Narebski
Poland