On Tue, 23 May 2006, Martin Langhoff wrote:
>
> I really don't think that using the local cvs binary is a problem at
> all. In my experience, the thing is fairly fast and optimized when you
> ask it to perform file-oriented questions, and that's all we do,
> really.

Fair enough. My worry was mainly that the cvs server was doing something
stupid, but I suspect most of the fork/exec's are probably from the
cvsimport perl script itself.

> In any case, we have it already -- parsecvs does it quite well (modulo
> memory leaks!) and I've used it several times in conjunction with
> cvsimport. Just perform the initial import with parsecvs and then
> 'track' the remote project with cvsimport.

I didn't get parsecvs working when I tried it a long time ago, and Donnie
reported that it ran out of memory, so I didn't even really consider it.

I'd love for it to work well, and it may be reasonable to do really big
imports on multi-gigabyte 64-bit machines (after all, they aren't _hard_
to find any more, and you only need to do it once). That said, it still
seems pretty stupid to require that much memory just to import from CVS.

		Linus
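[For readers following along, a rough sketch of the two-step workflow
Martin describes above: a one-time bulk import with parsecvs, then
incremental tracking with git cvsimport. The server address, repository
path, and module name here are placeholders, and the exact parsecvs
invocation may vary between versions, so treat this as an outline rather
than a recipe:

	# One-time bulk import: parsecvs reads the raw ,v files directly,
	# so first fetch a local copy of the CVS repository, e.g. via rsync
	# (rsync URL is a placeholder).
	rsync -az rsync://cvs.example.org/cvsroot/project/ project/
	cd project
	find . -name '*,v' -print | parsecvs

	# Afterwards, track new commits on the remote CVS server
	# incrementally:
	#   -v  verbose output
	#   -d  the CVSROOT to connect to
	#   -C  the target git repository (the one parsecvs created)
	git cvsimport -v -d :pserver:anonymous@cvs.example.org:/cvsroot \
		-C project project

This keeps the memory-hungry parsecvs pass to a single run, while the
cheaper per-file cvs queries that cvsimport makes handle the ongoing
updates.]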