Re: Errors cloning large repo

On Sat, 10 Mar 2007, Geert Bosch wrote:
> 
> Larger packs might still be sent over the network, but they
> wouldn't have an index and could be constructed on the fly,
> without ever writing any multi-gigabyte files to disk.

I have to say, I think that's a good idea. Rather than supporting a 64-bit 
index, simply not generating big pack-files in the first place is probably 
the better approach.

For the streaming formats, we'd obviously generate arbitrarily large 
pack-files, but as you say, they never have an index at all, and the 
receiver always re-writes them *anyway* (ie we now always run 
"git-index-pack --fix-thin" on them), so we could just modify that 
"--fix-thin" logic to also split the pack when it reaches some arbitrary 
limit.
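
Just to show roughly what I mean (totally untested, and all the names 
here are made up for illustration - the real thing would live in the 
index-pack stream-rewriting path, and would also have to write out the 
trailer checksum and the .idx for each completed pack):

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define SPLIT_LIMIT ((uint64_t)1 << 31)	/* eg 2GB: offsets stay 32-bit */

struct pack_writer {
	FILE *fp;		/* current output pack */
	uint64_t written;	/* bytes written into it so far */
	int seq;		/* sequence number for split pack names */
};

static void start_new_pack(struct pack_writer *w)
{
	char name[64];

	snprintf(name, sizeof(name), "split-%04d.pack", w->seq++);
	w->fp = fopen(name, "wb");
	if (!w->fp) {
		perror(name);
		exit(1);
	}
	w->written = 0;
}

static void finish_pack(struct pack_writer *w)
{
	/* real code would append the trailer and build the .idx here */
	fclose(w->fp);
	w->fp = NULL;
}

static void write_object(struct pack_writer *w, const void *buf, size_t len)
{
	/*
	 * Split *before* the object that would cross the limit, so
	 * no single object ever straddles two packs.
	 */
	if (w->written && w->written + len > SPLIT_LIMIT) {
		finish_pack(w);
		start_new_pack(w);
	}
	fwrite(buf, 1, len, w->fp);
	w->written += len;
}

int main(void)
{
	struct pack_writer w = { NULL, 0, 0 };
	char blob[4096];
	int i;

	memset(blob, 0x42, sizeof(blob));
	start_new_pack(&w);
	for (i = 0; i < 8; i++)
		write_object(&w, blob, sizeof(blob));
	finish_pack(&w);
	return 0;
}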

Some similar logic in git-pack-objects would mean that we'd never generate 
bigger packs in the first place..
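
(The natural limit is presumably something safely under 4GB, since the 
whole point is that the current index format only has 32-bit offsets. 
And splitting *before* the object that would cross the limit means no 
single object ever straddles two packs - one object bigger than the 
limit itself would still need special handling, but that's a 
pathological case anyway.)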

It's not that 64-bit index file support is "wrong", but it does seem like 
it's not really necessary.

		Linus
