Re: git clone: very long "resolving deltas" phase

On Fri, 9 Apr 2010, Matthieu Moy wrote:

> Vitaly Berov <vitaly.berov@xxxxxxxxx> writes:
> 
> > Number of objects: 3997548.
> > Size of the repository: ~57 GB.
> [...]
> > By the way, we have a large number of binary files in our repo.
> 
> This is clearly not the kind of repository Git is good at. I
> encourage you to continue this discussion, and try to find a way to
> get it working, but the standard approach (probably just "my 2
> cents" kind of advice, but ...) would be:
> 
> * Split your repo into smaller ones (submodules ...)
> 
> * Avoid versioning binary files
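A minimal sketch of the submodule split suggested above (all paths and names here are illustrative, not taken from the original report): the large binaries live in their own repository, and the main project records only a pointer to a commit in it.

```shell
# Sketch of the suggested split (paths are illustrative): keep large
# binaries in a separate repository and reference it as a submodule.
set -e
tmp=$(mktemp -d)

git init -q "$tmp/assets"
git -C "$tmp/assets" -c user.name=demo -c user.email=demo@example.com \
    commit --allow-empty -q -m "seed assets repo"

git init -q "$tmp/main"
git -C "$tmp/main" -c user.name=demo -c user.email=demo@example.com \
    commit --allow-empty -q -m "seed main repo"

cd "$tmp/main"
# protocol.file.allow is only needed on recent Git versions, which
# restrict file:// submodule URLs by default.
git -c protocol.file.allow=always submodule add -q "$tmp/assets" assets
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Track binary assets as a submodule"

# The superproject stores only a gitlink (mode 160000), not the blobs,
# so cloning it never touches the binary history at all.
git ls-files -s assets
```

The point of the split is that a clone of the main repository no longer has to transfer or delta-resolve the binary objects; they are fetched only when the submodule is initialized.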

I still think that Git ought to "just work" with such a repository.
There are things that should be done for that, like applying the 
core.bigFileThreshold configuration variable in more places, such as 
delta compression, object creation, diff generation, etc.
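Where the variable already applies, it can be set per-repository; a minimal illustration (the "1m" value is only for demonstration, the documented default is 512m):

```shell
# Minimal illustration of setting core.bigFileThreshold: blobs larger
# than the threshold are stored whole, skipping the delta-compression
# search. "1m" is only for demonstration; the documented default is 512m.
set -e
repo=$(mktemp -d)
git init -q "$repo"
git -C "$repo" config core.bigFileThreshold 1m
git -C "$repo" config core.bigFileThreshold    # prints: 1m
```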

Of course Git won't be as good at saving disk space in that case, but 
when your repo is 57GB you probably don't care much if it grows to 80GB 
while cloning it becomes twice as fast.

That said, I still don't think the current issue, with the receiving end 
of a clone spending 6 hours in "Resolving deltas", is normal, 
independently of core.bigFileThreshold.


Nicolas
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
