Re: Cloning from sites with 404 overridden

"Marco Costalba" <mcostalba@xxxxxxxxx> writes:

> http://digilander.libero.it/mcostalba/scm/qgit.git/objects/8d/ea03519e75f47d
>
> Git does not understand that the object is missing; it thinks that what
> the site sends _is_ the requested object, and then finds that it is
> (of course) corrupt.

To be fair, the site is _not_ missing anything from an HTTP
protocol perspective, because when git asks for the 8d/ea0351...
file, the server responds with a regular "HTTP/1.0 200 OK"
response.  So it is _your_ repository that ends up corrupt --
instead of correctly _lacking_ the file you should have removed
with prune-packed, it has a garbage file.
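
(Illustration only -- this is not the check git actually performs;
git verifies a downloaded object by inflating it and recomputing its
SHA-1.  But even a much cruder test shows why the two cases can be
told apart: a loose object is a zlib stream whose inflated contents
begin with "<type> <size>\0", while an overridden 404 page is
typically plain HTML, so a single inflate of the first few bytes is
already enough.  The function name below is made up.)

#include <string.h>
#include <zlib.h>

static int looks_like_loose_object(const unsigned char *buf, size_t len)
{
	z_stream s;
	unsigned char out[64];
	size_t got;
	int ret;

	memset(&s, 0, sizeof(s));
	if (inflateInit(&s) != Z_OK)
		return 0;

	s.next_in = (unsigned char *)buf;
	s.avail_in = (uInt)len;
	s.next_out = out;
	s.avail_out = sizeof(out);

	/* One inflate call over the first bytes rejects an HTML page. */
	ret = inflate(&s, Z_NO_FLUSH);
	got = sizeof(out) - s.avail_out;
	inflateEnd(&s);

	if ((ret != Z_OK && ret != Z_STREAM_END) || got < 7)
		return 0;

	/* A loose object header is "<type> <size>" followed by a NUL. */
	return !memcmp(out, "blob ", 5) || !memcmp(out, "tree ", 5) ||
	       !memcmp(out, "commit ", 7) || !memcmp(out, "tag ", 4);
}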

Having said that, I agree that it would be nicer if we supported
such a site, in the same spirit that we already bend over backwards
to support really dumb hosted HTTP servers that do not give a
directory index, by using objects/info/packs and info/refs.

I think it wouldn't be too much of a hassle to add logic to
http-fetch.c (perhaps behind an additional "--no-404" option or
some such) to fall back on pack transfer upon seeing a corrupt
loose object.  We already do that fallback when a request for a
loose object gets a 404 error, so the new code would essentially
do the same thing, and you might be OK.
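
Roughly, the fallback could take the shape sketched below.  This is
not code from http-fetch.c -- the three helpers are stand-in stubs
and the assume_broken_404 flag is only a placeholder for whatever a
"--no-404" option would set.  The point is just the control flow in
fetch_object(): a download that returns "200 OK" but fails
verification is handled exactly like a 404.

#include <stdio.h>

/* Stand-in for the HTTP GET of objects/xx/...; pretend it "succeeds". */
static int fetch_loose_object(const char *hex) { (void)hex; return 0; }

/* Stand-in for inflating the download and comparing its SHA-1 against
 * hex; pretend verification fails because an error page was sent. */
static int loose_object_is_valid(const char *hex) { (void)hex; return 0; }

/* Stand-in for the existing fallback that walks objects/info/packs. */
static int fetch_from_packs(const char *hex) { (void)hex; return 0; }

static int fetch_object(const char *hex, int assume_broken_404)
{
	if (!fetch_loose_object(hex)) {
		if (!assume_broken_404 || loose_object_is_valid(hex))
			return 0;
		/*
		 * The server said "200 OK" but the file does not verify
		 * (for example it is a custom HTML error page), so behave
		 * as if we had received a 404 in the first place.
		 */
	}
	return fetch_from_packs(hex);
}

int main(void)
{
	/* With the option enabled, the bogus download above falls
	 * through to the pack fetch instead of being kept. */
	return fetch_object("8dea0351", 1) ? 1 : 0;
}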


