On Tue, Nov 17, 2020 at 12:25:07AM -0500, Keith Moore wrote:
> What I'm seeing are a lot of handwaving arguments and very little detail,
> and a lot of arguments of the form "it's quite straightforward for all
> clients to rewrite their code so we don't have to run a single additional
> server".

So I don't really care *all* that much, but to be fair, there have been
some hand-waving arguments on both sides.

The most common way that *I* would write scripts to download files over
ftp, going back at least ten years, is to use either curl or wget --- and
both of those scriptable, command-line tools support both http and ftp.
So transitioning such scripts from ftp to http is not hard, and given the
problems with NAT boxes, I would have transitioned my scripts using wget
or curl over to http a long time ago.

And while it is true that the http *protocol* has been evolving, it is
*not* a "user interface"; it is a protocol which supports user interface
programs, and as a protocol it has had quite good backwards
compatibility.  Since people (especially in government) have insisted on
running Windows 95 long past when it was safe and sane to do so (speaking
as someone whose SF-86 was compromised as a result), web servers will
support ancient http clients without any issues that I've been made aware
of.

So the argument that http is "unstable", and that we should therefore
stick with supporting a file transfer protocol invented in the 1970s
because it is somehow more "stable", is also, in my opinion, full of
hand-waving.

Cheers,

					- Ted
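
As a rough sketch of the kind of script change described above (the host
name and path here are placeholders for illustration, not a real mirror),
the transition is typically just a change of URL scheme in the download
line:

    # fetch over ftp (old)
    wget ftp://ftp.example.org/pub/tools/file.tar.gz

    # same fetch over https (new); only the URL changes
    wget https://www.example.org/pub/tools/file.tar.gz

    # the equivalent download with curl
    curl -O https://www.example.org/pub/tools/file.tar.gz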