"Jon Smirl" <jonsmirl@xxxxxxxxx> wrote:

> On 6/10/06, Junio C Hamano <junkio@xxxxxxx> wrote:
> > "Jon Smirl" <jonsmirl@xxxxxxxxx> writes:
> >
> > > Here's a new transport problem. When using git-clone to fetch Martin's
> > > tree, it kept failing for me at Dreamhost. I had a parallel fetch
> > > running on my local machine, which has a much slower net connection. It
> > > finally finished, and I am watching the end phase where it prints all
> > > of the 'walk' messages. The git-http-fetch process has jumped up to
> > > 800MB in size after being 2MB during the download. Dreamhost has a
> > > 500MB process size limit, so that is why my fetches kept failing there.
> >
> > The http-fetch process mmaps the downloaded pack, and
> > if I recall correctly we are talking about a 600MB pack, so a 500MB
> > limit sounds impossible, perhaps?
>
> The fetch on my local machine failed too. It left nothing behind; now
> I have to download the 680MB again.

That's sad. Could git-clone be changed to not remove the .git directory
if fetching objects fails (after the other files in the .git directory
have been fetched)? You could then hopefully continue with git-pull.

--
http://onion.dynserv.net/~timo/
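The proposed behaviour can already be approximated by hand: instead of git-clone, create the repository first and fetch into it, so a failed transfer leaves the .git directory in place and a later fetch picks up from whatever objects arrived. A minimal sketch, using a throwaway local repository as the "remote" (the paths, commit, and modern git command syntax are illustration only, not from the thread):

```shell
# Sketch of the workaround: replace "git clone" with init + fetch, so a
# failed object transfer never removes the .git directory and the fetch
# can simply be re-run.  All paths here are placeholders.
set -e
src=$(mktemp -d) dst=$(mktemp -d)

# Build a tiny repository to act as the remote.
git -C "$src" init -q
git -C "$src" -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "initial commit"

cd "$dst"
git init -q                        # .git exists before any transfer starts
git remote add origin "$src"
git fetch -q origin || true        # on failure, .git survives: just retry
git fetch -q origin                # a repeated fetch completes the clone
test -f .git/FETCH_HEAD && echo "clone state preserved"
```

A dedicated resume feature would be nicer, but this shows that nothing in git-fetch itself requires throwing away the partial state; only git-clone's cleanup does.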