On 5/5/08, Eric Wong <normalperson@xxxxxxxx> wrote:
> Interesting. By "These commits seemed all to have thousands of files",
> you mean the first 35K that took up most of the time? If so, yes,
> that's definitely a problem...
>
> git-svn requests a log from SVN containing a list of all paths modified
> in each revision. By default, git-svn only requests log entries for up
> to 100 revisions at a time to reduce memory usage. However, having
> thousands of files modified for each revision would still be
> problematic, as would having insanely long commit messages.

On my system, any branch that was created using "svn cp" of a toplevel
directory seems to cause git-svn to (rather slowly) download every
single file in the entire branch for the first commit on that branch,
giving a symptom that sounds a lot like the above "commits with
thousands of files".

I assumed this was just an intentional design decision in git-svn, to be
slow and safe instead of fast and loose. Is it actually supposed to do
something smarter than that?

Thanks,

Avery
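
P.S. For concreteness, here is roughly the kind of branch creation that
triggers the slow fetch for me. This is just a minimal sketch: the
repository URL and branch name are made up for illustration, and it
assumes the standard trunk/branches/tags layout.

    # Create a branch as a cheap server-side copy of the whole trunk.
    svn copy http://svn.example.com/repo/trunk \
             http://svn.example.com/repo/branches/mybranch \
             -m "Create mybranch from trunk"

    # Import with git-svn using the standard layout (-s). The first
    # commit it processes on branches/mybranch is where I see every
    # file in the branch being downloaded one by one.
    git svn clone -s http://svn.example.com/repo repo.git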