Re: git-fetching from a big repository is slow

Andy Parkins wrote:
> Hello,
>
> I've got a big repository and two computers. One has the repository
> up-to-date (164M after repack); one is behind (30M ish).
>
> I used git-fetch to try and update, and the sync took HOURS. Then I
> zipped the .git directory and transferred that, which took about 15
> minutes.
>
> Am I doing something wrong? The git-fetch was done with a git+ssh://
> URL; the zip transfer was done with scp (so ssh shouldn't be a factor).


This seems to happen when your repository contains many large binary files, especially several versions of large binary files that do not deltify well against each other. Perhaps it's worth adding gzip compression detection to git? I imagine I'm not the only one tracking gzipped/bzip2'ed content, which pretty much never deltifies well against anything else.
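To illustrate the point (this is my own sketch, not anything git does): two nearly identical raw payloads share almost every byte, so a delta between them is tiny, but once each payload is run through deflate, a one-line change near the start perturbs the Huffman coding and bit alignment of everything after it, so the compressed streams have almost nothing in common and the delta machinery finds little to reuse.

```python
import zlib

# Two near-identical payloads: same length, first line differs in case only.
lines = [b"revision %05d: routine changelog content\n" % i for i in range(4000)]
a = b"".join(lines)
lines[0] = b"revision 00000: ROUTINE CHANGELOG CONTENT\n"  # same length
b = b"".join(lines)
assert len(a) == len(b)

ca, cb = zlib.compress(a), zlib.compress(b)

# Fraction of byte positions at which the two streams agree.
raw_same = sum(x == y for x, y in zip(a, b)) / len(a)
comp_same = sum(x == y for x, y in zip(ca, cb)) / min(len(ca), len(cb))

print(f"raw  streams agree at {raw_same:.1%} of positions")
print(f"zlib streams agree at {comp_same:.1%} of positions")
```

On my reasoning above, raw_same should be very close to 100% while comp_same drops sharply, which is exactly why versions of already-compressed files pack so poorly against each other.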

--
Andreas Ericsson                   andreas.ericsson@xxxxxx
OP5 AB                             www.op5.se
Tel: +46 8-230225                  Fax: +46 8-230231
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
