Re: Performance issue: initial git clone causes massive repack

On Tue, 2009-04-14 at 13:27 -0700, Robin H. Johnson wrote:
> On Tue, Apr 14, 2009 at 04:17:55PM -0400, Nicolas Pitre wrote:
> > WRT the HTTP protocol, I was questioning git's ability to resume the 
> > transfer of a pack in the middle if such transfer is interrupted without 
> > redownloading it all. And Mike Hommey says this is actually the case.
> With rsync:// it was helpful to split the pack, and resume there worked
> reasonably (see my other mail about the segfault that turns up
> sometimes).
> 
> More recent discussions raised the possibility of using git-bundle to
> provide an initial download that users CAN resume easily, and then
> continue fetching from as usual afterwards.

Hey Robin,

Now that the GSoC projects have been announced, I can give you the good
news that one of our two projects is to optimise this stage in
git-daemon; I'm hoping we can make it almost as cheap as the workaround
you described in your post.  I'll certainly be using your repository as
a test case :-)
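
For anyone following the thread, the bundle trick Robin mentions would
look roughly like this (the URLs and file names below are only
placeholders, not a real setup):

    # on the server: pack all refs into a single file that any dumb
    # transport (HTTP, rsync, ...) can serve
    $ git bundle create repo.bundle --all

    # on the client: download it with something resumable
    $ wget -c http://example.org/repo.bundle

    # clone from the bundle, then point origin at the live repository
    # and catch up with anything newer than the bundle
    $ git clone repo.bundle myrepo
    $ cd myrepo
    $ git config remote.origin.url git://example.org/repo.git
    $ git fetch origin

The nice property is that the big one-time transfer happens over a
protocol that can resume, and everything after that is a normal
incremental fetch.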

So stay tuned!
Sam.
