Hi!

I'm trying to upload a large amount of data over a connection that often drops. If the connection fails after part of the data has been uploaded, it seems I have to start over from zero. Is there a way to resume the upload instead? Here is an example:

$ git push
Counting objects: 504, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (475/475), done.
Write failed: Broken pipe1/476), 533.78 MiB | 60 KiB/s
fatal: The remote end hung up unexpectedly
error: pack-objects died of signal 13
error: failed to push some refs to 'myhost.org:/git/myrepo/'

$ git push
Counting objects: 504, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (475/475), done.
Writing objects:  12% (61/476), 89.34 MiB | 23 KiB/s

Thanks,
piem
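P.S. In the meantime, one partial workaround I could imagine is pushing the history in smaller chunks, so that each attempt transfers less and an interrupted push loses less work. An untested sketch (assuming the remote is called origin, the remote branch is master, the history is linear, and the chunk sizes HEAD~200 / HEAD~100 are just examples):

$ # push an older commit first, so the remote catches up in small steps
$ git push origin HEAD~200:refs/heads/master
$ # then work forward in further chunks
$ git push origin HEAD~100:refs/heads/master
$ # finally push the tip
$ git push origin master

Each push is a fast-forward of the previous one, so a failure only costs the current chunk. But that still isn't a real resume, hence my question.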