Re: Incremental cvsimports

On 5/24/06, Geoff Russell <geoffrey.russell@xxxxxxxxx> wrote:
Dear Git,

Dear Geoff,

if you look at the list archive for the last couple of days, you'll
see there's been quite a bit of activity in tuning cvsimport so that
it scales better with large imports like yours. We have been playing
with a gentoo cvs repo with 300K commits / 1.6GB uncompressed.

Don't split up the tree... that'll lead to something rather awkward.
Instead, fetch and build git from Junio's 'master' branch which seems
to have collected most (all?) of the patches posted, including one
from Linus that will repack the repo every 1K commits -- keeping the
import size down.
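In case it helps, here's roughly what building from Junio's tree looks like. The clone URL and install prefix are just what I'd use; adjust for your setup:

```shell
# Fetch git's own source and build the 'master' branch.
# URL and prefix are assumptions -- substitute your own.
git clone git://git.kernel.org/pub/scm/git/git.git
cd git
git checkout master

# Build and install into your home directory so you don't
# clobber a distro-packaged git.
make prefix=$HOME/local all
make prefix=$HOME/local install

# Make sure the freshly built git is the one you run.
export PATH=$HOME/local/bin:$PATH
git --version
```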

You _will_ need a lot of memory though, as cvsps grows large (working
on a workaround now) and cvsimport grows a bit over time (where is
that last leak?!). And a fast machine -- especially fast I/O. I've just
switched from an old test machine to an AMD64 with fast disks, and
it's importing around 10K commits per hour.

You will probably want to run cvsps by hand, save its output, and
later feed that file to cvsimport with the -P flag.
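Something along these lines -- the CVS root and module name here are made up, and the cvsps flags are the ones cvsimport itself passes, so check them against your versions:

```shell
# Run cvsps separately and keep its output around; for a big repo
# this step is slow, and saving the file means you can rerun the
# import without paying for it again.
# ':pserver:...' root and 'mymodule' are placeholders.
cvsps --cvs-direct -u -A \
    --root :pserver:anoncvs@cvs.example.org:/cvsroot \
    mymodule > mymodule.cvsps

# Then point cvsimport at the saved cvsps output with -P
# instead of letting it invoke cvsps itself.
git-cvsimport -v \
    -d :pserver:anoncvs@cvs.example.org:/cvsroot \
    -P mymodule.cvsps \
    -C mymodule.git \
    mymodule
```

For incremental updates you can regenerate the cvsps file and rerun the same command against the existing target directory.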

cheers,


martin
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
