Nicolas Pitre <nico@xxxxxxx> wrote:
> I would have concatenated all packs provided on the command line into a
> single one, simply by reading data from existing packs and writing it
> back without any processing at all.  The offset for OBJ_OFS_DELTA is
> relative so a simple concatenation will just work.
>
> Then the index for that pack can be created just as easily by reading
> existing pack index files and storing the data into an array of struct
> pack_idx_entry, adding the appropriate offset to object offsets, then
> call write_idx_file().
>
> All data is read once and written once, making it no more costly than a
> simple file copy.  On the flip side it wouldn't get rid of duplicated
> objects (I don't know if that matters, i.e. if something might break
> with the same object twice in a pack).

Yea, that's a really quick repack.  :-)  Plus it's actually something
that can be easily halted in the middle and resumed later.  You just
need to save the list of packfiles you are concatenating so you can
pick up later when you get more time.

There shouldn't be a problem with having duplicates in the packfile.
You can do one of two things:

a) Omit the duplicates from the .idx when you merge the .idx tables
   together to produce the new one.  Just take the object with the
   earliest offset.

b) Leave the duplicates in the final .idx.  In this case the binary
   search may pick any of them, but it wouldn't matter which it finds.

About the only process that might care about duplicates would be
index-pack.  I don't think it makes sense to run index-pack on a
packfile you already have a .idx for.  I don't think it would have a
problem with the duplicate SHA-1s either, but it wouldn't be hard to
make it do something reasonable when it finds them.
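[Editor's note: a toy sketch of the .idx merge described above, in Python rather than git's actual C code. Entries are modeled as (sha1, offset) pairs, the second pack's offsets are rebased by the email's "appropriate offset" (passed in here as a constant, since its exact computation from the pack layout isn't spelled out above), and duplicate SHA-1s keep the earliest offset, i.e. option (a). All names are illustrative.]

```python
# Toy model, not git's implementation: merge two pack .idx tables into
# one.  entries_a come from the first pack (offsets unchanged);
# entries_b come from the second pack and are rebased by `rebase`,
# the amount of pack data that now precedes them after concatenation.

def merge_idx(entries_a, entries_b, rebase):
    """Return a merged, SHA-1-sorted entry list with duplicates dropped."""
    merged = {}
    for sha, off in entries_a:
        # Keep the first (earliest-offset) occurrence of each SHA-1.
        if sha not in merged:
            merged[sha] = off
    for sha, off in entries_b:
        candidate = off + rebase
        # A duplicate already present from pack A is always earlier,
        # so it wins -- this is option (a) from the email.
        if sha not in merged or candidate < merged[sha]:
            merged[sha] = candidate
    # .idx tables are sorted by SHA-1 so lookups can binary-search.
    return sorted(merged.items())
```

Option (b) would simply skip the duplicate check and emit every entry; the binary search then lands on an arbitrary copy, which, as noted above, is harmless.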
> > To consolidate all packs that are smaller than a megabyte into
> > one, you would use it in its current form like this:
> >
> > $ old=$(find .git/objects/pack -type f -name '*.pack' -size -1M)
> > $ new=$(echo "$old" | git merge-pack | git pack-objects pack)
> > $ for p in $old; do rm -f $p ${p%.pack}.idx; done
> > $ for s in pack idx; do mv pack-$new.$s .git/objects/pack/; done
>
> You might want to move the new pack before removing the old ones
> though.

Not might, *must*.  If you delete the old ones before the new ones
are ready then readers can run into problems trying to access the
objects.  We've spent some effort trying to make these sorts of
operations safe.  No sense in destroying that by getting the order
wrong here.  :)

-- 
Shawn.
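[Editor's note: a toy sketch of the ordering Shawn insists on, again in Python rather than shell or git's actual code. The new files are installed before the old ones are removed, so a concurrent reader never hits a window where the objects exist in neither pack. The function name and path handling are illustrative, not git's real API.]

```python
# Toy model of the safe install order: new pack in first, old packs
# deleted last.  Assumes everything lives on the same filesystem so
# os.replace() is an atomic rename.
import os

def install_pack(pack_dir, new_pack, new_idx, old_packs):
    # Move the .pack before the .idx: readers discover packs through
    # the .idx, so the data should already be in place by the time the
    # index becomes visible.
    os.replace(new_pack, os.path.join(pack_dir, os.path.basename(new_pack)))
    os.replace(new_idx, os.path.join(pack_dir, os.path.basename(new_idx)))
    # Only now is it safe to delete the packs that were consolidated.
    for p in old_packs:
        idx = p[:-len(".pack")] + ".idx"
        for f in (p, idx):
            if os.path.exists(f):
                os.remove(f)
```

The quoted shell pipeline above gets the same property by swapping its last two loops: run the `mv` loop before the `rm` loop.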