Re: git clone out of memory. alternatives?

On Thu, 27 Mar 2008, Jeffrey Chang wrote:
> 
> Is there any way to get around this problem?  For example:
> - Can I run git clone in a way that uses less memory, such as cloning
> a piece of the repository at a time?

Cloning really is pretty memory-intensive, because generating the pack 
involves going through every single object in the repository.

What you *can* do is to limit cloning to the rsync protocol, which is 
admittedly a pretty horrible protocol (it has none of the inherent 
sanity checks of the native protocol), but it avoids the server-side 
cost of generating a pack.
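
Something along these lines, with a made-up host and path, assuming the 
repository is also exported over plain rsync:

	git clone rsync://git.example.com/pub/scm/repo.git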

However, you'll eventually hit other problems, like the fact that you also 
won't be able to do a full repack on the server side (because a full 
repack walks every object in exactly the same way).

> - Can I export the entire repository as a file that can be loaded on
> my target machine, like "svnadmin dump" for subversion?

That's essentially what a repack does. See above about the problem.
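
For reference, the full repack that hits the same memory wall is roughly:

	git repack -a -d

which walks every object to build one big pack under objects/pack/, and 
that pack plus the refs is essentially your dump.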

> - Can I just rsync the repository from another computer that already has a copy?

Yes. You can. And you can also repack on another host and then rsync the 
results back to the server. It's not pretty, but it should work.
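
One way that might look, using hypothetical hostnames and paths (bighost 
being a machine with enough memory, server the one that keeps running out):

	# on bighost: copy the repository over, repack with plenty of memory
	rsync -a server:/srv/git/repo.git/ repo.git/
	cd repo.git
	git repack -a -d

	# push the resulting pack(s) back; assumes nothing new was
	# pushed to the server in between
	rsync -a objects/pack/ server:/srv/git/repo.git/objects/pack/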

		Linus
