My apologies for bringing this up again, but for what it's worth, here
is a git repository that I can't even clone at depth 1:
$ git clone --depth 1 https://github.com/alf632/terrain3dglitch
Cloning into 'terrain3dglitch'...
remote: Enumerating objects: 697, done.
remote: Counting objects: 100% (697/697), done.
remote: Compressing objects: 100% (439/439), done.
error: RPC failed; curl 92 HTTP/2 stream 5 was not closed cleanly:
CANCEL (err 8)
error: 1754 bytes of body are still expected
fetch-pack: unexpected disconnect while reading sideband packet
fatal: early EOF
fatal: fetch-pack: invalid index-pack output
The problem seems to be amplified by a timeout configuration issue on
GitHub's side, but it is also made worse by even a depth-1 clone already
being 100MB+. Downloading that amount without resume support simply
isn't feasible for everyone. Am I right in assuming that, if I need all
files and subdirectories, there's no real workaround here?
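(The closest thing to a workaround I've found so far is a blobless
partial clone, which at least splits the download into two smaller
transfers - commits/trees first, then the checkout's blobs - so a hiccup
costs less. A rough sketch, assuming the hoster supports the filter,
which GitHub appears to:

$ # fetch only commits and trees first; the blobs needed for the
$ # checkout are fetched afterwards in a separate, smaller transfer
$ git clone --filter=blob:none https://github.com/alf632/terrain3dglitch

But as far as I can tell, each individual transfer is still
all-or-nothing.)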
I don't want to waste anybody's time; I'm just hoping to provide some
further data points showing that in some edge cases, this can be quite
impactful.
(And sorry if I did something silly while cloning and didn't realize.)
Regards,
Ellie
On 6/8/24 1:28 AM, ellie wrote:
Dear git team,
I'm terribly sorry if this is the wrong place, but I'd like to report a
potential issue with "git clone".
The problem is that any sort of interruption or connection issue, no
matter how brief, causes the clone to stop and leave nothing behind:
$ git clone https://github.com/Nheko-Reborn/nheko
Cloning into 'nheko'...
remote: Enumerating objects: 43991, done.
remote: Counting objects: 100% (6535/6535), done.
remote: Compressing objects: 100% (1449/1449), done.
error: RPC failed; curl 92 HTTP/2 stream 5 was not closed cleanly:
CANCEL (err 8)
error: 2771 bytes of body are still expected
fetch-pack: unexpected disconnect while reading sideband packet
fatal: early EOF
fatal: fetch-pack: invalid index-pack output
$ cd nheko
bash: cd: nheko: No such file or directory
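As far as I can tell, the only recourse is to retry the whole clone from
zero, e.g. with a crude wrapper like:

$ until git clone https://github.com/Nheko-Reborn/nheko; do
>     sleep 10   # every retry re-downloads the entire pack from scratch
> done

which on a sufficiently flaky connection may simply never get through.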
In my experience, this can be really impactful with 1. big repositories
and 2. unreliable internet - which I would argue isn't unheard of! E.g.
a developer may work over a mobile connection on a business trip. The
result can even be that a repository is effectively uncloneable for some
users!
This has left me in the absurd situation where I could download a
tarball of the same project via HTTPS from the git hoster just fine,
and even far larger binary release assets, thanks to the browser's
HTTPS resume - and yet a simple git clone kept failing repeatedly.
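(For comparison, even plain curl can resume where the server supports
range requests, which release downloads appear to - the URL here is
purely illustrative:

$ # "-C -" means: continue from the size of the partial file on disk
$ curl -L -C - -O https://github.com/Nheko-Reborn/nheko/releases/download/<tag>/<asset>

Nothing comparable seems to exist for the clone transport itself.)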
My deepest apologies if I missed an option to fix or address this. But
to sum up: please consider making git clone able to recover from brief
connection hiccups.
Regards,
Ellie
PS: I've seen git hosters have apparent proxy bugs, like timing out
slower git clone connections from the server side even if the transfer
is ongoing. A git auto-resume would reduce the impact of that, too.
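(In case it helps anyone hitting the same proxies: git does expose
client-side low-speed thresholds, e.g.

$ git config http.lowSpeedLimit 1000   # abort below 1000 bytes/sec...
$ git config http.lowSpeedTime 60      # ...sustained for 60 seconds

but if I understand correctly, these only decide when the client gives
up - they can't stop a server-side proxy from cutting the connection,
and the aborted clone still leaves nothing behind.)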