Re: Leaving large binaries out of the packfile

----- Original Message -----
From: Shawn O. Pearce
Date: 6/10/2010 12:04 PM

> Joshua Jensen <jjensen@xxxxxxxxxxxxxxxxx> wrote:
>> Sometimes, 'git gc' runs out of memory.  I have to discover which file
>> is causing the problem, so I can add it to .gitattributes with a
>> '-delta' flag.  Mostly, though, the repacking takes forever, and I dread
>> running the operation.
>
> If you have the list of big objects, you can put them into their
> own pack file manually.  Feed their SHA-1 names on stdin to git
> pack-objects, and save the resulting pack under .git/objects/pack.
>
> Assuming the pack was called pack-DEADC0FFEE.pack, create a file
> called pack-DEADC0FFEE.keep in the same directory.  This will stop
> Git from trying to repack the contents of that pack file.
>
> Now run `git gc` to remove those huge objects from the pack file
> that contains all of the other stuff.

Pardon the late response.

This method can work, but it is a manual process (roughly the sketch below). I am more interested in a method where Git makes the determination for me, based on a wildcard and a flag in .gitattributes.
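
For concreteness, the manual sequence is something like this; big-objects.txt
is a placeholder for a hand-gathered list of SHA-1 names, and pack-DEADC0FFEE
reuses Shawn's example name for whatever pack pack-objects actually produces:

    # Build a pack holding only the listed objects, directly under
    # .git/objects/pack.  pack-objects prints the new pack's hash.
    git pack-objects .git/objects/pack/pack < big-objects.txt

    # Mark the new pack with a .keep file so future repacks leave it alone.
    touch .git/objects/pack/pack-DEADC0FFEE.keep

    # Repack everything else; the kept objects stay out of the main pack.
    git gc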

I am still experimenting with the feature in a multi-gigabyte repository with lots of large binaries, and I'll post more about it once some additional changes have been made.
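
For reference, the '-delta' entries I mentioned use the ordinary
.gitattributes syntax; the *.bin pattern below is only an example:

    # Never attempt delta compression for matching files.
    *.bin -delta

A flag for keeping matching blobs out of the main packfile could ride along
the same way.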

Thanks!

Josh