On 11/06/2019 11:29, Xeno Amess wrote:
> Not all people can afford a stable network connection...
> I'm currently suffering from lost connections and having to restart the
> clone/push from scratch.
> It's OK when dealing with a project of several MB in size, but when I
> try to clone a 2GB repo I never succeed.
> So I wonder, should there be some way for git to resume from a
> break-point?
> For example, what if we supported something like Content-Range in HTTP,
> split the download into smaller files, and added some way to
> auto-resume if a file block fails its hash check?
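That sort of resumable transfer can be approximated today by serving the
repository as a single bundle file over plain HTTP, since a bundle is an
ordinary file that range-capable downloaders can resume. A sketch - the
URLs and file names below are only placeholders:

    # Server side: pack the whole repository into one file.
    git bundle create repo.bundle HEAD --all

    # Client side: download with a client that supports HTTP Range
    # requests; 'curl -C -' resumes a partial file after a dropped
    # connection instead of starting over.
    curl -C - -O https://example.com/repo.bundle

    # Check the bundle, clone from it, then repoint the remote at the
    # live server for future (much smaller) incremental fetches.
    git bundle verify repo.bundle
    git clone repo.bundle big-repo
    cd big-repo
    git remote set-url origin https://example.com/repo.git
    git fetch

That said, some details of your setup matter.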
Is this on a Windows machine using Git for Windows, or a Linux machine,
or even 'Windows Subsystem for Linux' (WSL) on Windows?
If it is Windows then you may have hit the "sizeof(long)=32bit" limit:
on Windows a 'long' is 32 bits even in 64-bit builds, so some internal
size calculations overflow around 4GB.
If it is Linux then maybe adjust your refspec to limit the size of each
fetch or push - for example, fetch a single branch at a time, or start
from a shallow clone and deepen it in steps.
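A minimal sketch of that incremental approach (the URL and step size are
only examples):

    # Start with only the tip commit so the initial transfer is small.
    git clone --depth 1 https://example.com/big-repo.git
    cd big-repo

    # Deepen the history a chunk at a time; each step is a separate,
    # smaller transfer, so a dropped connection loses only that step.
    git fetch --deepen=1000
    git fetch --deepen=1000   # ...repeat as needed...
    git fetch --unshallow     # finally fetch whatever remains

    # Fetching one ref at a time also keeps individual transfers small:
    git fetch origin refs/heads/master:refs/remotes/origin/master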
There has also been a recent change discussed on the list that added a
config variable for some sort of rate limit, but I haven't looked back
at the mail archive to remember the details - try searching
https://public-inbox.org/git/
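Two existing config variables in the same area - probably not the change
I'm thinking of - are http.lowSpeedLimit and http.lowSpeedTime, which
make git abort a transfer whose throughput stays too low for too long
(the values below are only examples):

    # Abort (with an error) if throughput stays below 1000 bytes/s for
    # a full 60 seconds; by default git waits indefinitely.
    git config http.lowSpeedLimit 1000
    git config http.lowSpeedTime 60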
Philip