Jan Hudec wrote:
Hello,
Has anyone already thought about making fetching over HTTP work similarly to
the native git protocol?
That is, rather than reading the raw content of the repository, the client
would talk to a CGI script (which could be integrated into gitweb) that
negotiates what the client needs and then generates and sends a single pack
containing it.
Mercurial and bzr both have this option. It would IMO have three benefits:
- Fast access for people behind paranoid firewalls that only let http and
https through (you can tunnel anything, but only to port 443).
- Can be run on a shared machine. If you have web space on a machine shared
by many people, you can set up your own gitweb, but you cannot/are not allowed
to start your own network server for the git native protocol.
- Fewer things to set up. If you are setting up gitweb anyway, you would not
need to set up anything additional to provide fetch access.
Then the question is how to implement it. The current protocol is stateful on
both sides, but the stateless nature of HTTP more or less requires the
protocol to be stateless on the server.
I think it would be possible to use basically the same protocol as now, but
make it stateless for the server. That is, the server first sends its heads,
and then the client repeatedly sends all its wants and some haves until the
server acks enough of them and sends the pack.
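Very roughly, the client side could look like the sketch below. The URL, the
plain-text want/have payload, and the application/x-git-pack response type are
all made up for illustration; the point is only that every request carries the
complete state, so the server needs to remember nothing between requests.

    import urllib.request

    def fetch_pack(url, wants, local_commits, batch=64):
        """Re-send all wants plus a growing list of haves until the
        server answers with a pack instead of asking for more."""
        haves, remaining = [], list(local_commits)
        while True:
            haves += remaining[:batch]
            remaining = remaining[batch:]
            lines = [f"want {w}" for w in wants] + [f"have {h}" for h in haves]
            if not remaining:
                lines.append("done")        # nothing more to advertise
            data = ("\n".join(lines) + "\n").encode()
            req = urllib.request.Request(url, data=data)  # POST, self-contained
            with urllib.request.urlopen(req) as resp:
                # made-up convention: a pack content type means we are done
                if resp.headers.get("Content-Type") == "application/x-git-pack":
                    return resp.read()
            if not remaining:
                raise RuntimeError("server did not send a pack after 'done'")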
Alternatively, I am thinking about using Bloom filters (somebody came up with
such an idea on the bzr list when I still followed it). They might be useful,
as over HTTP we need to send as many haves as possible in one go.
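The idea would be something like the toy filter below: the client inserts
every commit SHA-1 it has, serializes the bit array, and POSTs it in a single
request; the server then tests its own commits against the filter and treats
a hit as "the client probably has this" (accepting some false positives). The
sizes and hashing scheme here are arbitrary, just to show the shape of it.

    import hashlib

    class BloomFilter:
        def __init__(self, size_bits=8192, hashes=4):
            self.size = size_bits
            self.hashes = hashes
            self.bits = bytearray(size_bits // 8)

        def _positions(self, item):
            # carve several roughly independent indexes out of one SHA-1
            digest = hashlib.sha1(item.encode()).digest()
            for i in range(self.hashes):
                chunk = digest[i * 4:(i + 1) * 4]
                yield int.from_bytes(chunk, "big") % self.size

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos // 8] |= 1 << (pos % 8)

        def __contains__(self, item):
            return all(self.bits[pos // 8] & (1 << (pos % 8))
                       for pos in self._positions(item))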
Bundles?
Client POSTs its ref set; server uses the ref set to generate and
return the bundle.
Push over http(s) could work the same...
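A minimal sketch of such a bundle-generating CGI, assuming the client POSTs
the SHA-1s it already has, one per line. The repository path, the body format
and the content type are assumptions; the only real command invoked is
"git bundle create".

    #!/usr/bin/env python3
    import os, subprocess, sys, tempfile

    REPO = "/srv/git/project.git"      # hypothetical repository location

    def main():
        length = int(os.environ.get("CONTENT_LENGTH", 0))
        client_refs = sys.stdin.read(length).split()  # SHA-1s the client has

        with tempfile.TemporaryDirectory() as tmpdir:
            bundle = os.path.join(tmpdir, "fetch.bundle")
            # bundle everything reachable from our refs but not from theirs
            revs = ["--all"] + ["^" + sha for sha in client_refs]
            subprocess.run(["git", "--git-dir", REPO, "bundle", "create",
                            bundle] + revs, check=True)
            sys.stdout.write("Content-Type: application/octet-stream\r\n\r\n")
            sys.stdout.flush()
            with open(bundle, "rb") as f:
                sys.stdout.buffer.write(f.read())

    if __name__ == "__main__":
        main()

(If the client already has everything, git refuses to create an empty bundle,
so a real script would have to turn that error into a "nothing to fetch"
response.)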