Re: git-clone and unreliable links?

On Wed, Nov 7, 2012 at 7:35 AM, Josef Wolf <jw@xxxxxxxxxxxxx> wrote:
> When using git-clone over an unreliable link (say, UMTS) and the network goes
> down, git-clone deletes everything that was downloaded. When the network comes
> back up and you restart git-clone, it has to start over from the beginning.
> Then, eventually, the network goes down again, and everything is deleted again.
>
> Is there a way to skip the deletion step, so that the second invocation could
> resume where the first one was interrupted?

No, because a clone is not resumable.

The best way to obtain a repository over an unstable link is to ask
the repository owner to create a bundle file with `git bundle create
<file> --branches --tags` and serve that file over plain HTTP or
rsync, both of which support resuming interrupted transfers. After you
download the file, you can clone or fetch from the bundle to
initialize your local repository, and then run `git fetch` against the
live repository to incrementally pick up anything newer than the
bundle.
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
