Hello,

I am working with a large-ish git repository set up as a shared
repository. I am the only user of this repository, and I access it over
ssh. I am running into a problem where I can no longer clone the
repository onto a new computer. I get a message that seems to indicate
that the machine hosting the repository is running out of memory:

xigua:~/remotecvs jchang$ git clone [...]
Initialized empty Git repository in [...]
remote: Generating pack...
remote: Done counting 9122 objects.
remote: Deltifying 9122 objects...
error: git-upload-pack: git-pack-objects died with error.
fatal: git-upload-pack: aborting due to possible repository corruption on the remote side.
remote: fatal: out of memoryremote:
remote: aborting due to possible repository corruption on the remote side.
fatal: early EOF
fatal: index-pack failed
fetch-pack from [...] failed.
xigua:~/remotecvs jchang$

Moving the repository to another machine is not an option, nor is
adding more RAM to that machine. Is there any way to get around this
problem? For example:

- Can I run git clone in a way that uses less memory, such as cloning a
  piece of the repository at a time?
- Can I export the entire repository as a file that can be loaded on my
  target machine, like "svnadmin dump" for Subversion?
- Can I just rsync the repository from another computer that already
  has a copy?
- Are there any other workarounds to set up a copy of the repository
  on my local machine?

Thanks,
Jeff
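
P.S. To clarify what I mean by "using less memory": I noticed some
pack.* settings in the git-config man page and wondered whether setting
them in the repository on the server would keep git-pack-objects within
the available RAM. I have not tried this yet, and the values below are
only guesses for a low-memory machine:

    # run inside the repository on the low-memory server (values are guesses)
    git config pack.windowMemory 32m
    git config pack.deltaCacheSize 32m
    git config pack.threads 1

    # optionally repack once with a small window/depth so future clones
    # can reuse the existing pack instead of re-deltifying everything
    git repack -a -d --window=10 --depth=10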
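
And for the "export as a file" and rsync ideas, this is roughly what I
was picturing (untested; the host name and paths are made up):

    # on a machine that already has a working clone, write everything to one file
    git bundle create repo.bundle --all

    # copy repo.bundle to the new computer (scp, USB disk, etc.), then clone from it
    git clone repo.bundle myrepo

    # or copy the bare repository directly from a machine that already has it
    rsync -a user@otherhost:/path/to/repo.git/ repo.git/

Would either of those be a reasonable way to avoid running
git-pack-objects on the memory-starved server?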