On Fri, 10 Apr 2009, Robin H. Johnson wrote:

> On Wed, Apr 08, 2009 at 12:52:54AM -0400, Nicolas Pitre wrote:
> > > http://git.overlays.gentoo.org/gitweb/?p=exp/gentoo-x86.git;a=summary
> > > At least that's what I cloned ;-) I hope it's the right one, but
> > > it fits the description...
> >
> > OK. FWIW, I repacked it with --window=250 --depth=250 and obtained
> > a 725MB pack file. So that's about half the originally reported
> > size.
>
> The one problem with having the single large packfile is that Git
> doesn't have a trivial way to resume downloading it when the git://
> protocol is used.

Having multiple packs won't help the git:// protocol at all in that
regard. In fact, it just makes things a bit harder on the server in
all cases: the server still has to generate a single pack for
streaming, only now from multiple source packs, and it has to perform
extra work attempting delta compression across pack boundaries.

> For our developers cursed with bad internet connections (a fair
> number of firewalls that don't seem to respect keepalive properly),
> I suppose I can probably just maintain a separate repo for their
> initial clones, which leaves a large overall download, but more
> chances to resume.

I don't know much about git's HTTP protocol implementation, but I
would guess it should be able to resume the transfer of a pack file
whose download was interrupted in the middle? If not, then this
should be considered.

> PS #1: B.Steinbrink's memory improvement patch seems to work nicely
> too, but more memory improvements in that realm are still needed.

Good.

> PS #2: We finally got some newer hardware to run the large repo, I'm
> working on the install now, but until the memory issue is better
> resolved, I'm still worried we might run short if there are too many
> concurrent clones.

Right.

Nicolas
(who wishes he could dedicate more time to git hacking)
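
For reference, a repack like the one described above would presumably
be invoked roughly as follows. Only --window=250 --depth=250 is
stated in the message; the -a -d -f flags are assumptions for
collapsing everything into a single, fully recomputed pack:

    # Repack all objects into one pack (-a), deleting the old
    # packs afterwards (-d); -f forces delta recomputation instead
    # of reusing existing deltas, which is what lets the larger
    # window/depth actually find better delta chains.
    git repack -a -d -f --window=250 --depth=250

Without -f, pack-objects would mostly reuse the deltas it already
has, and the larger --window/--depth settings would have little
effect on the resulting pack size.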
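
As for resuming over HTTP: with the dumb HTTP transport, packs are
served as plain files under objects/pack/, so a standard byte-range
request is enough to continue an interrupted download. A minimal
sketch of what that looks like from the client side; the host and
pack name below are hypothetical placeholders:

    # -C - makes curl inspect the partially downloaded file and send
    # an HTTP Range header asking only for the remaining bytes.
    curl -C - -O \
      http://git.example.org/gentoo-x86.git/objects/pack/pack-0123456789abcdef0123456789abcdef01234567.pack

Whether git's own HTTP fetch code issues such range requests after an
interrupted transfer is exactly the question raised above; the point
is that the dumb transport serves static files, so nothing on the
server side prevents it.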