Re: Removing old data without disturbing tree?

On Tue, Nov 27, 2007 at 03:06:45PM -0500, Nicolas Pitre wrote:
> On Tue, 27 Nov 2007, David Brown wrote:
>
> > An upstream tree I'm mirroring with git-p4 has decided to start
> > checking in large tarballs (150MB) periodically.  It's basically a
> > prebuilt version of some firmware needed to run the rest of the
> > software.
> >
> > Git doesn't seem to have any problem with these tarballs (and is
> > using a lot less space than P4), but I have a feeling we might start
> > running into problems when things get really big.  Does anyone have
> > experience with packs growing beyond several GB?
>
> It should just work.  It was tested with artificial data sets, but
> that's about it.
>
> Now, if those tarballs are actually multiple revisions of the same
> package, you might consider storing them uncompressed and letting Git
> delta-compress them against each other, which will produce an even
> more significant space saving.

I did manage to talk them into leaving them uncompressed.  But they are
large, and don't seem to delta compress all that well.  Maybe the
compression will get better as more revisions accumulate.
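
One thing I may try is a full repack with a larger delta window, on the
theory that the tarball blobs just aren't being considered against each
other.  The window/depth numbers below are guesses, nothing tuned:

    # Force a full repack and search a wider window of candidate
    # objects when looking for deltas.
    git repack -a -d -f --window=100 --depth=50

    # Check whether the big blobs actually became deltas: verify-pack -v
    # prints the size in the pack, and delta objects get a depth column.
    git verify-pack -v .git/objects/pack/pack-*.idx | sort -k3 -n | tail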

I guess this will be a good test case...  It will probably take months or
even a year or so for the repo to get up to several GB.
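
If a single pack ever gets unwieldy, I understand repack can also be
told to split its output across several packfiles (untested here, and
the 1g limit below is an arbitrary choice):

    # Cap each output packfile at roughly 1 GB; git splits the data
    # across multiple packs instead of growing one huge file.
    git repack -a -d --max-pack-size=1g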

David