Re: Revisiting large binary files issue.

On Mon, 10 Jul 2006, Carl Baldwin wrote:
> 
> When I set the window to 0 I hit one more issue.  Even though the blobs
> are already compressed on disk, I still seem to pay the penalty of
> inflating them into memory and then deflating them into the pack.  When
> the window size is 0 this is just wasted cycles.  With large binary files
> these wasted cycles slow down the push/fetch operation considerably.
> Couldn't the compressed blobs be copied into the pack without first
> inflating them in this 0-window case?

The problem is that the on-disk format of an individual (loose) object 
isn't actually the same as the pack-file format for a single object. The 
header is different: a pack-file uses a very dense bit packing, while the 
loose object format is a bit less dense.

Sad, really, but it means that right now you can only re-use data that was 
already packed (when the format matches).
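To make the mismatch concrete, here is a minimal sketch of the two header encodings (the payload bytes and type codes follow git's formats; the helper names are my own for illustration). A loose object deflates an ASCII header *together with* the payload, whereas a pack entry keeps a bit-packed binary header *outside* the deflated payload, so the zlib streams are not interchangeable:

```python
import zlib

OBJ_BLOB = 3  # pack-format type code for a blob


def loose_header(obj_type: str, size: int) -> bytes:
    # Loose objects prepend an ASCII header, then zlib-deflate the whole
    # thing: deflate("<type> <size>\0" + payload)
    return f"{obj_type} {size}\x00".encode()


def pack_header(type_code: int, size: int) -> bytes:
    # Pack entries use a dense variable-length header: the first byte holds
    # the 3-bit type and the low 4 bits of the size; the MSB of each byte
    # says whether another size byte (7 more bits) follows.
    byte = (type_code << 4) | (size & 0x0F)
    size >>= 4
    out = []
    while size:
        out.append(byte | 0x80)
        byte = size & 0x7F
        size >>= 7
    out.append(byte)
    return bytes(out)


data = b"hello world\n"

# Loose: header and payload share one zlib stream.
loose = zlib.compress(loose_header("blob", len(data)) + data)

# Pack: binary header first, then a zlib stream of the payload alone.
packed = pack_header(OBJ_BLOB, len(data)) + zlib.compress(data)
```

Because the loose stream deflates header-plus-payload as one unit, there is no byte range inside it you could splice into a pack as-is, which is why the deflated loose data can't simply be copied over even when no deltification happens.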

		Linus
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
