Re: Git performance results on a large repository

On Fri, 3 Feb 2012, Joshua Redstone wrote:

> The test repo has 4 million commits, linear history and about 1.3
> million files.  The size of the .git directory is about 15GB, and has
> been repacked with 'git repack -a -d -f --max-pack-size=10g --depth=100
> --window=250'.  This repack took about 2 days on a beefy machine (i.e.,
> lots of RAM and flash).  The size of the index file is 191 MB.

This may be a silly thought, but what if, instead of one pack file covering
your entire history (4 million commits), you created multiple packs (say,
one per half million commits) and marked all but the most recent pack with
a .keep file, so that they won't be modified by a repack?
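
Something along these lines, as a rough sketch (the HEAD~3500000 boundary
is only illustrative, and the paths assume a standard .git layout):

    # Pack everything reachable from an old boundary commit into its
    # own pack; pack-objects prints the new pack's SHA-1 on stdout.
    sha=$(git rev-list --objects HEAD~3500000 |
          git pack-objects .git/objects/pack/pack)

    # Mark that pack .keep so later repacks leave it alone.
    touch .git/objects/pack/pack-$sha.keep

    # A full repack now only rewrites objects outside the kept pack.
    git repack -a -d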

That way, operations that only need to look at recent history (blame,
etc.) will probably never have to go past the most recent pack file or
two.

I may be wrong, but I think that when git looks for 'similar files' for
delta compression, it limits its search to the current pack, so this would
also keep you from searching the entire project history.
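
If that's right, then once the old packs are marked .keep, the expensive
part of a repack should scale with the recent objects only. A quick way to
test the theory (window/depth values copied from your original repack):

    # -a skips packs marked with .keep, so the --window delta search
    # only runs over objects outside the kept packs.
    time git repack -a -d -f --window=250 --depth=100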

David Lang