Re: Repacking many disconnected blobs

On Wed, 14 Jun 2006, Linus Torvalds wrote:
> 
> You don't _need_ to shuffle. As mentioned, it will only affect the 
> location of the data in the pack-file, which in turn will mostly matter 
> as an IO pattern thing, not anything really fundamental.  If the pack-file 
> ends up caching well, the IO patterns obviously will never matter.

Actually, thinking about it more, the way you do things, shuffling 
probably won't even help.

Why? Because you'll obviously have multiple files, and even if each file 
were to be sorted "correctly", the access patterns from any global 
standpoint won't really matter, because you'd probably bounce back and 
forth in the pack-file anyway.

So if anything, I would say

 - just dump them into the packfile in whatever order is most convenient

 - if you know that later phases will go through the objects and actually 
   use them (as opposed to just building trees out of their SHA1 values) 
   in some particular order, _that_ might be the ordering to use.

 - in many ways, getting good delta chains is _much_ more important, since 
   "git repack -a -d" will re-use good deltas from a previous pack, but 
   will _not_ care about any ordering in the old pack. Good deltas also 
   obviously shrink the temporary pack-files anyway.
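To see the "git repack -a -d" behavior the last point describes, here is a minimal sketch (the repo name, file names, and commit messages are just illustrative): it builds a tiny repository from loose objects, then repacks everything into a single pack, deleting the now-redundant loose objects.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email you@example.com
git config user.name "Example User"

# Two commits; at this point all objects are loose under .git/objects/.
echo one  > a.txt && git add a.txt && git commit -qm first
echo two >> a.txt && git commit -qam second

# -a: pack everything reachable into one pack;
# -d: delete redundant loose objects and old packs afterwards.
git repack -a -d

# A single consolidated pack-file remains.
ls .git/objects/pack/*.pack | wc -l   # expect: 1
```

On a later repack of the same repository, git will re-use the deltas already computed in this pack rather than recomputing them, which is why delta quality carries over even though object ordering does not.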

I'll pontificate more if I can think of any other cases that might matter.

		Linus
