Hi,

I'm currently having an issue when using Git with a remote server that allows
only a limited number of SSH connections. The server sometimes closes
connections because too many are open concurrently. When that happens during a
submodule update of a submodule that has not yet been cloned or checked out, I
get the following output from Git:

stdout: Cloning into 'src/SHARED'...
stderr: Total 10288 (delta 7577), reused 10190 (delta 7577)
Received disconnect from 10.96.8.71: 7: Too many concurrent connections
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
Unable to fetch in submodule path 'src/SHARED'

The submodule is not cloned successfully; the failure occurs somewhere in the
middle of the process. If I run the command a second time,
git submodule update --remote src/SHARED, it reports success, but the files
are not actually checked out. I believe this is because the failed clone still
managed to get the repository into a state where all the files appear
"removed", so a further submodule update does nothing, since the submodule is
"already" checked out at the correct commit.

Am I right in my understanding? Is this a bug? I believe I can work around it
using --force. Note that I don't yet have a reliable reproduction of this, for
various reasons, not least of which is that simulating a network error is
difficult.

Any thoughts on this? Should I just have the script that runs my continuous
integration builds add a check to ensure the files are actually checked out?
Is --force enough to get the submodule re-checked out even if Git already
considers it checked out at that location?

Thanks,
Jake
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html
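The CI check discussed above could be sketched roughly as follows. This is a
minimal POSIX-shell sketch, not a definitive fix: the helper name
check_checked_out and the demo/ paths are hypothetical, used only to
illustrate the idea of treating a working tree that contains nothing besides
its .git entry as "not actually checked out".

```shell
# Sketch of a CI guard: consider a submodule working tree "checked out"
# only if it contains at least one entry other than .git. The function
# name and demo/ paths below are illustrative, not from the original report.
check_checked_out() {
    dir="$1"
    # List all entries (including hidden ones), drop the .git file/dir,
    # and succeed only if something remains.
    [ -d "$dir" ] && [ -n "$(ls -A "$dir" 2>/dev/null | grep -v '^\.git$')" ]
}

# Demonstrate the check against two stand-in directories.
mkdir -p demo/empty demo/full
: > demo/full/file.c

if check_checked_out demo/full; then
    echo "demo/full: ok"
fi
if ! check_checked_out demo/empty; then
    echo "demo/empty: needs re-checkout"
fi
```

In a real CI script, the directory passed in would be the submodule path
(src/SHARED here), and when the check fails the script could run
git submodule update --force on that path to force a fresh checkout.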