On 2009.01.27 19:33:11 -0800, Junio C Hamano wrote:
> It can be argued that at least in the "real ref" case you are in control
> of both ends, and if you have a disconnected chain in your local repository
> that you do not have a ref for, you are screwing yourself, and it is your
> problem. But when you forked your repository from somebody else on a
> hosting site like github, you do not have much control over the other end
> (because it is a closed site you cannot ssh in to diagnose what is really
> going on), and if you do not know exactly from whom your hosted repository
> is borrowing, it is more likely that you will get into a situation where
> you have objects near the tip without having the full chain after an
> aborted transfer, and the insufficient check of doing only has_sha1_file()
> may become a larger issue in such a setting.

Uhm, it might be obvious, but what exactly could go wrong? Do we need to
fetch from multiple repos when alternates are involved? Or how else would
we end up with a broken chain? I mean, it starts to make some sense to me
why we would need the connectivity check, but how do we end up with a
"partial" fetch at all?

> I'd prefer a small helper function to consolidate the duplicated code,
> like the attached patch, though. How about doing it like this?

Yeah, that looks a lot nicer :-)

Björn