Re: Make `git fetch --all` parallel?

Stefan Beller <sbeller@xxxxxxxxxx> writes:

> So I do think it would be much faster, but I also think patches for this would
> require some thought and a lot of refactoring of the fetch code.
> ...
> During the negotiation phase a client would have to be able to change its
> mind (add more "haves", or in the case of parallel fetching these become
> "will-have-soons", even though the remote had earlier figured out that the
> client did not have them.)

Even though a fancy optimization like the one you outlined might be
ideal, I suspect that users would be happier if the network bandwidth
were used to talk to multiple remotes at the same time, even if they
end up receiving the same recent objects from more than one place in
the end.

Is the order in which "git fetch --all" iterates over "all remotes"
predictable and documented?  If so, listing the remotes from the more
powerful and better-connected ones down to the slower ones and then
doing the equivalent of a stupid

	for remote in $list_of_remotes_ordered_in_such_a_way
	do
		# start the fetch in the background, staggering the next one
		git fetch "$remote" &
		sleep 2
	done
	# wait for all of the background fetches to finish
	wait

might be a fairly easy way to bring happiness.
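
For illustration only, a minimal sketch of the same idea that asks
"git remote" for the list of configured remotes instead of keeping a
hand-maintained ordered list; whether the order it reports is the one
you would want is exactly the open question above, so treat the
enumeration step as an assumption:

	#!/bin/sh
	# fetch from every configured remote in parallel; "git remote"
	# prints one remote name per line
	for remote in $(git remote)
	do
		git fetch "$remote" &
		sleep 2
	done
	# block until every background fetch has completed
	wait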


