How strange, I have always found FTP to be an absolute dog of a protocol precisely because it mandates idiot defaults and implementations are required to perform heuristic hacks. I always used to have to transmit files twice because the first time the transfer would be corrupted by the default being 'trash my data in case I am using an IBM machine'.
The separation of command and data is a good idea, but carrying them over separate TCP/IP streams is an utter disaster for reliability. None of the FTPS clients I have found for Windows can reliably transmit a Web site update.
There are much better options: rsync and SFTP (the SSH File Transfer Protocol) both work well, but unfortunately my hosting provider does not support them, which is why I am leaving my current provider when I get round to it.
FTP to HISTORIC. The time has come.
On Thu, Sep 28, 2017 at 6:45 PM, John C Klensin <john-ietf@xxxxxxx> wrote:
--On Thursday, September 28, 2017 09:57 +0100 "tom p."
<daedulus@xxxxxxxxxxxxx> wrote:
> The obvious one, which disrupts my work, is the Date Created.
> FTP gives me date which is, or is close to, the creation date
> of the RFC by the RFC Editor.
>
> Internet Explorer makes the Date Created the date on which I
> perform the download, which may be years later and so
> thoroughly misleading (to me).
>
> So standalone FTP every time.
Because FTP, by design, has a command in the protocol to
transmit a data type (primitive version of content type) and a
canonical form for text on the wire, competent implementations
are also capable of delivering files with EOL conventions, and
even character encodings, appropriate to the receiving
environment. We learned lessons a _very_ long time ago from
EBCDIC and two (or, depending on how you count, at least four)
different encoding forms for ASCII that lead to that feature and
the TYPE command.
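[A minimal sketch of what the TYPE A canonical form implies, purely for illustration: text crosses the wire as ASCII with CRLF line endings, and each end maps between that canonical form and its own local convention. The function names and the local-EOL examples are mine, not anything from RFC 959.]

```python
def to_wire(text: str, local_eol: str) -> bytes:
    """Sender side of a TYPE A transfer: convert the local EOL
    convention to the canonical CRLF, strictly ASCII on the wire."""
    return text.replace(local_eol, "\r\n").encode("ascii")

def from_wire(data: bytes, local_eol: str) -> str:
    """Receiver side: convert canonical CRLF back to the local EOL."""
    return data.decode("ascii").replace("\r\n", local_eol)

# A Unix sender (LF line endings) talking to a classic-Mac
# receiver (CR line endings): both get sensible text files,
# even though their local conventions differ.
wire = to_wire("line one\nline two\n", "\n")
assert wire == b"line one\r\nline two\r\n"
assert from_wire(wire, "\r") == "line one\rline two\r"
```

A TYPE I transfer, by contrast, is just an exact octet-for-octet copy with no conversion at either end, which is exactly why it is the wrong default for text.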
Perhaps unfortunately from where we stand today, the community
effectively decided to discard that feature, with a number of
client implementations deciding that binary transfers (in
FTP-speak, TYPE I) were enough, and that receiving systems
should just get exact copies of whatever the sending system had
and sort it out themselves, in the process ignoring the FTP
requirement [RFC959, Section 4.1.2, "Representation Type"] that
the default when no TYPE is specified is ASCII Non-print. While
TFTP [RFC1350] has a similar feature ("netascii" mode), AFAIK,
other FTP alternatives for transferring data under different
conditions, including SFTP over SSH and Rsync, simply assume
image copies are fine.
Trying to transfer files containing non-ASCII characters makes
the problem worse because, while the spec isn't explicit about
it, an FTP implementation should presumably fail if TYPE A is
used and the contents of the data file cannot be interpreted as
ASCII. Attempts to add a "TYPE U" (for "Unicode" or "UTF-8") to
FTP to solve that problem for another canonical text
representation have gotten absolutely no traction, leading me to
presume that the community has completely lost interest in these
issues.
Someone who believed in the existence of character coding and
transmission deities and divine retribution from them might
conclude that the community deserves this BOM mess, along with
UTF-16 on the wire, as a result of not dealing with the issue
effectively in FTP, TFTP, and a variety of other transfer
mechanisms. I couldn't possibly comment on that.
john