Re: Performance issue: initial git clone causes massive repack

On Tue, Apr 14, 2009 at 04:17:55PM -0400, Nicolas Pitre wrote:
> WRT the HTTP protocol, I was questioning git's ability to resume the 
> transfer of a pack in the middle if such transfer is interrupted without 
> redownloading it all. And Mike Hommey says this is actually the case.

With rsync:// it was helpful to split the pack, and resuming there worked
reasonably well (see my other mail about the segfault that turns up
sometimes).

More recent discussions raised the possibility of using git-bundle to
provide an initial download that users CAN resume easily, and then move
on to regular fetches from there.

So, from the Gentoo side right now, we're looking at this:
1. Set up git-bundle for initial downloads (see the sketch below).
2. Disallow initial clones over git:// (allow updates ONLY).
3. Disallow git-over-http and git-over-rsync.
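
For step 1, something along these lines could run periodically on the
server side; the bundle name, path, and refs here are placeholders, not
a decided layout:

  # Create a bundle holding the full history, including HEAD so that
  # a clone made from the bundle can check out a working tree directly.
  git bundle create /var/www/gentoo-x86.bundle HEAD master

  # Sanity-check the result before publishing it over http/rsync.
  git bundle verify /var/www/gentoo-x86.bundle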

This also avoids the wait time with the initial clone. Just grab the
bundle with your choice of rsync or http, check its integrity, throw it
into your repo, and update to the latest tree.
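
On the client side that boils down to roughly the following (URLs and
repo names are made up for illustration):

  # Grab the bundle; wget -c (or rsync --partial) can resume an
  # interrupted download without starting over.
  wget -c http://example.org/gentoo-x86.bundle

  # Check the bundle's integrity before using it.
  git bundle verify gentoo-x86.bundle

  # Clone from the bundle; "origin" initially points at the bundle file.
  git clone gentoo-x86.bundle gentoo-x86
  cd gentoo-x86

  # Switch origin to the live repository and catch up over git://.
  git config remote.origin.url git://example.org/gentoo-x86.git
  git pull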

-- 
Robin Hugh Johnson
Gentoo Linux Developer & Infra Guy
E-Mail     : robbat2@xxxxxxxxxx
GnuPG FP   : 11AC BA4F 4778 E3F6 E4ED  F38E B27B 944E 3488 4E85


