Re: git-fetching from a big repository is slow

On Thu, 14 Dec 2006, Andreas Ericsson wrote:

> Andy Parkins wrote:
> > Hello,
> > 
> > I've got a big repository.  I've got two computers.  One has the repository
> > up-to-date (164M after repack); one is behind (30M ish).
> > 
> > I used git-fetch to try and update; and the sync took HOURS.  I zipped the
> > .git directory and transferred that and it took about 15 minutes to
> > transfer.
> > 
> > Am I doing something wrong?  The git-fetch was done with a git+ssh:// URL.
> > The zip transfer with scp (so ssh shouldn't be a factor).
> > 
> 
> This seems to happen if your repository consists of many large binary files,
> especially many large binary files of several versions that do not deltify
> well against each other. Perhaps it's worth adding gzip compression detection
> to git? I imagine more people than me are tracking gzipped/bzip2'ed content
> that pretty much never deltifies well against anything else.
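
For paths like that, one workaround (just a sketch, assuming a git new
enough to honour .gitattributes) is to switch off delta compression for
the affected files entirely, so pack creation stops wasting time on
them:

    # .gitattributes -- the "-delta" attribute tells pack-objects to
    # store matching blobs whole instead of searching for delta bases
    *.gz   -delta
    *.bz2  -delta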

If your remote repository is fully packed into a single pack, that should
not have any impact on the transfer latency, since by default git makes
no attempt to re-deltify objects against each other when those objects
are already in the same pack.
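
For example, collapsing everything on the remote side into one pack
(a sketch; plain git-repack, nothing exotic) should let upload-pack
stream the existing deltas as-is during a fetch:

    # on the remote repository: repack all objects into a single pack
    # and delete the now-redundant old packs (-d also prunes loose
    # objects that ended up in the new pack)
    git repack -a -d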


Nicolas
