On 5/23/06, Linus Torvalds <torvalds@xxxxxxxx> wrote:
> I don't think the remote usability is valid, except for some really small repositories. The fact that it takes hours even when the CVS server is local doesn't bode well for doing it remotely for any but the most trivial things.
I really don't think that using the local cvs binary is a problem at all. In my experience it is fairly fast and well optimized when you ask it file-oriented questions, and that's all we do, really. If you want to try it, you'll see that local checkouts of large trees (like this Gentoo one) are fairly fast. Not as fast as GIT itself, but good enough. I think Donnie has hit a bug in a bad version of cvs, but other than that my experience is that it is fairly well behaved -- even if the tool is bad, ubiquity has led to resiliency over the years.
> I really think it would be better to have local use be the optimized case, with remote being the "it's _possible_" case.
Agreed, but I don't think we'd see much benefit from parsing the ,v files directly, and we'd have to take the hit of a double implementation. In any case, we already have it: parsecvs does it quite well (modulo memory leaks!) and I've used it several times in conjunction with cvsimport. Just perform the initial import with parsecvs and then 'track' the remote project with cvsimport. The problem is that the two lead to slightly different trees, so their output is not consistent, and I don't think that will be easy to fix.

cheers,

martin
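P.S. For concreteness, the two-step flow described above would look roughly like this. This is only a sketch: the repository paths, module name, pserver URL, and the exact parsecvs invocation are my assumptions, not something from this thread, and will need adjusting for a real setup.

    # 1) one-off initial import: run parsecvs against a local copy of the
    #    CVS repository (it reads the ,v files directly; invocation is
    #    hypothetical and may differ between parsecvs versions)
    cd /path/to/cvsroot/module
    find . -name '*,v' -print | parsecvs

    # 2) afterwards, keep tracking the remote project incrementally with
    #    git cvsimport, run from inside the resulting git tree
    cd /path/to/git/tree
    git cvsimport -v -d :pserver:anoncvs@cvs.example.org:/cvsroot module

As noted above, the trees the two tools produce are not consistent with each other, so expect the history to diverge slightly at the point where cvsimport takes over.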