Gary V. Vaughan wrote:
> On Wed, Apr 28, 2010 at 04:51:59AM -0500, Jonathan Nieder wrote:
>
>> The most general way: [git init and git add . and git commit]
>
> D'oh.  Of course... I was too fixated on git clone to notice.
>
>> You can automate some of those steps by
>>
>>    wget http://address/of/tarball.tar.gz
>>    git init project
>>    cd project
>>    perl /usr/share/doc/git/contrib/fast-import/import-tars.perl tarball.tar.gz
>>    git checkout import-tars
>>    ... use git as usual
>
> What's happening here?  Is this sharing a single repository for all
> locally hosted git projects, or is this more or less the same as the
> above?

It's the same as the above.  The only advantages I can think of are
that it might be slightly faster (though I haven't tested that) and
that this way I don't have to remember the tar --strip-components
option (a rough sketch of that manual route is at the end of this
message).

You could share a single repository for all locally hosted git
projects, but that would kill the incremental behavior of make:
switching between projects would touch files and force rebuilds.

>> If upstream uses git, there is also the shallow-clone facility:
>>
>>    git clone -b master --depth=1 git://repo.or.cz/git.git/
>>    cd git
>>    ... use git as usual, except history is cauterized
>
> This is probably the flavour that would be of the most use to us.
> Thanks for educating me :)
>
>> It has one rough edge you may run into: push is not supported.  If
>> that is a problem for you, let me know and maybe I can try to help
>> fix it.
>
> No, I think the main benefit of using git locally would be to provide
> a pull source for upstream.

Oh, that's another rough edge (fundamentally the same one).  Sorry.

Another benefit of using git to fetch from upstream is that it makes it
easy to pick up the latest version.

Hope that helps,
Jonathan
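
For reference, the manual route mentioned above (unpack the tarball
yourself, then git init / git add / git commit) would look roughly like
this.  The URL and the "project" directory are only placeholders from
the earlier example, and --strip-components=1 assumes the tarball has a
single top-level directory to peel off:

    wget http://address/of/tarball.tar.gz
    mkdir project
    cd project
    tar -xzf ../tarball.tar.gz --strip-components=1
    git init
    git add .
    git commit -m "Import tarball"
    ... use git as usual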