Benjamin LaHaise <bcrl@xxxxxxxxx> wrote:
> Hi folks,
>
> Doing a fresh git clone git://some.git.url/ foo seems to download the
> entire remote repository even if all the objects are already stored in
> GIT_OBJECT_DIRECTORY=/home/bcrl/.git/object .  Is this a known bug?
> At 100MB for a kernel, this takes a *long* time.

I believe it is a known missing feature. :-)

git-clone doesn't prime HEAD with any sort of starting point, so the
pull it uses to download everything literally downloads everything, as
nothing is in common with the remote.

One could work around it by running git-init-db to create the new clone
locally, using git-update-ref to point HEAD at some commit you have in
common with the remote, creating an origin file, then performing a
git-pull.  This would only download the objects between the commit you
put into HEAD and the current master of the remote...  but that is
actually some work.

I think Cogito's clone is capable of restarting a failed clone; I wonder
if that logic would benefit you here?

Is using a common GIT_OBJECT_DIRECTORY across many clones actually
pretty common?  Maybe it's time that git-clone got some more smarts with
regard to what it yanks from the origin.

-- 
Shawn.
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
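
For what it's worth, the workaround described above might look roughly
like this.  This is only a sketch using the command names from this
mail: <common-commit> is a placeholder for a commit you already share
with the remote, and writing the URL into .git/branches/origin is one
way such an "origin file" was recorded in repositories of that era.

```shell
# Hypothetical sketch of the workaround; not a supported git-clone mode.
mkdir foo && cd foo
git-init-db

# Point HEAD at a commit both sides have in common, so the pull only
# fetches objects newer than it (placeholder commit id):
git-update-ref HEAD <common-commit>

# Record the remote URL as the "origin" shorthand for git-pull:
echo 'git://some.git.url/' > .git/branches/origin

# Pull only the objects between <common-commit> and the remote's master:
git-pull origin
```

The saving comes entirely from the git-update-ref step: with HEAD set to
a shared commit, the object negotiation has a common ancestor to stop
at, instead of walking all the way back to the root commit.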