Phillip Susi <psusi@xxxxxxxxxx> writes:

> On 12/16/2010 4:19 PM, Nicolas Pitre wrote:
>
> > What makes you think that unpacking them will actually make the access
> > to them faster?  Instead, you should consider _repacking_ them,
> > ultimately using the --aggressive parameter with the gc command, if you
> > want faster accesses.
>
> Because decompressing and undeltifying the objects in the pack file
> takes a fair amount of cpu time.  It seems a waste to do this for the
> same set of objects repeatedly rather than just keeping them loose.

Loose objects are compressed too.  Besides, git keeps a delta base
cache, so when you access the same few objects repeatedly (e.g. when
doing 'git log -p', i.e. log + diff) it does not have to undeltify
and uncompress them over and over.

Also, in practice it is I/O that is the bottleneck, not the CPU, and
having many loose files is bad for the filesystem cache.  Packfiles
were originally meant for network transfer, but it turned out that
they also make a better on-disk format.

--
Jakub Narebski
Poland
ShadeHawk on #git
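
P.S. In case it helps, a minimal sketch of the repacking Nicolas
suggested; the --window/--depth values below are only illustrative
assumptions, not tuned recommendations:

    # let gc repack everything with a more thorough delta search
    $ git gc --aggressive

    # or repack by hand for finer control:
    #   -a  put all reachable objects into one pack
    #   -d  delete the old, now-redundant packs
    #   -f  recompute deltas instead of reusing existing ones
    $ git repack -a -d -f --window=250 --depth=50

After that, accesses go to a single well-deltified pack instead of a
pile of loose files, which is usually the faster layout in practice.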