Re: Large number of object files

On Wed, Oct 24, 2012 at 01:59:16PM +0700, Nguyen Thai Ngoc Duy wrote:

> On Wed, Oct 24, 2012 at 12:21 PM, Uri Moszkowicz <uri@xxxxxxxxx> wrote:
> > While continuing to work on improving clone times, I found that "git
> > gc --aggressive" consolidated a large number of tags into a single
> > file, but now I have a large number of files in the objects
> > directory: 131k for a ~2.7GB repository.
> 
> Can you paste the output of "git count-objects -v"? I'm curious why
> gc keeps so many loose objects around.

Presumably they were ejected from the pack because they are now
unreachable. That's a rather large number, but if there has been
recent ref maintenance (e.g., deleting branches or tags), it is not
implausible.
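
For reference, "count" in that output is the number of loose objects
and "in-pack" the number of packed ones, so it will show directly how
many of those 131k files are loose. A quick way to check whether the
loose ones are really unreachable (fsck may take a while on a
repository this size):

    # Summarize loose vs. packed objects:
    git count-objects -v

    # List objects that nothing reachable points to; if most of the
    # loose objects show up here, they are just gc leftovers:
    git fsck --unreachable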

> > Any way to reduce the number of these files to speed up clones?
> 
> An easy way to get rid of them is to clone the non-local way:
> everything is sent as a pack, so the result is a single pack in the
> new repo. Try "git clone file:///path/to/source/repo new-repo".
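
That works because a clone from a plain path just hardlinks (or
copies) the object files, loose objects included, while a file:// URL
goes through the normal pack transport. Roughly:

    # Local optimization: reuses objects/ as-is, loose objects and all:
    git clone /path/to/source/repo new-repo

    # URL form: runs the usual pack protocol and writes one fresh pack:
    git clone file:///path/to/source/repo new-repo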

If you have git v1.7.12 or greater, you can also use the "--no-local"
option to clone. But as you mentioned, pruning is probably the most
sensible thing (and for a non-local clone, those objects should not
impact performance at all, as we will never even look at unreferenced
objects).
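
If you do want to slim down the source repository itself, the usual
sequence is something like the following (destructive, so only run it
once you are sure nothing you care about is unreachable):

    # Expire reflog entries that can keep otherwise-dead objects alive:
    git reflog expire --expire=now --all

    # Repack and delete unreachable objects now, instead of after the
    # default two-week grace period:
    git gc --prune=now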

-Peff

