Re: Call for Community Feedback: Retiring IETF FTP Service


On 11/30/20 12:07 PM, Theodore Y. Ts'o wrote:

> Changed once or twice over years or decades is not "constant flux".

The aggregate of changes to many different interfaces or services or tools, each changing once or twice over a few years, may look a lot like constant flux.

(what's the saying?  "No raindrop thinks it is to blame for the flood.")

> Stepping back a bit, it seems to me that one of the hidden
> assumptions which has turned this into a long, drawn-out discussion,
> is whether or not change should be tolerated, and over what time
> scales.  It may very well be that for some people a change once a
> decade is "constant flux".  In other cases, people may be willing to
> accept some change every 2-4 years, so long as reasonable methods of
> getting their work done are available, even if it does mean that they
> have to adjust their workflow every few years.

Something I've consistently found throughout my career is that people don't like to change from tools and interfaces that work well for them to new tools that require different working habits.   For example, I knew professors who clung to VMS MAIL for years, even though it was never a great tool for managing Internet mail and even Gmail's web interface (as much as I personally loathe it) was arguably much better.  I first got James Gosling's emacs to run on VMS in 1980, and I've been using the same key bindings ever since; I've adapted more editors than I can count to use them.   And I vastly prefer aircraft instrument panels with "steam gauges" to modern instruments that try to cram all their information into one tiny display and are also harder to read (the old instruments were very carefully designed to maximize readability even in low light, part of which meant they were relatively uncluttered).   And so on.

IMO, that reluctance to change interfaces is entirely understandable.   People literally invest years or decades in learning to work effectively with a certain set of tools, and switching to new tools can drastically impair their ability to do their work.   (And sometimes the people get blamed for this.)

Basically, I think there need to be very good reasons to force people to abandon tools and interfaces that work well for them.  And it seems that there's a very unfortunate and widely held belief that "newer is better" that simply is not true in the general case (i.e., it does not withstand rigorous measurement).

> Ultimately, if we have to support all workflows forever, then the job
> of the people maintaining the tools and services is going to either (a)
> stagnate, by not adding new features, since that will increase their
> maintenance load, or (b) grow without bound, as new features to ease
> IETF participants' work are added, with no means of transitioning off
> of older technologies.

There is indeed a problem here, but part of the problem is a constant demand for "new features", and it's not clear to me where that demand is coming from.   It seems to me that it's often better to design stable interfaces that won't need much change over time (precisely because people don't like user interfaces to change) than to keep changing things.   Every time Windows changes versions and user interfaces, I hear my Windows-using colleagues gripe about how inconvenient the changes are.

The other day I needed to format a USB stick exactly as Windows would do it, so I asked a friend if I could use her Windows machine.  It took the two of us about two hours to figure out how, because none of the old user interfaces still worked, and the user interfaces that did exist had changed so much of the terminology that we couldn't tell for sure what the PC was actually going to do.   And all I needed was to initialize a USB stick with a single partition and a single Windows file system on that partition.
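(For what it's worth, the scriptable interface underneath has stayed remarkably stable even as the GUIs churned.  Here's a rough sketch of the operation we wanted, driving diskpart's scripted mode from Python - with the assumption, purely for illustration, that the stick shows up as disk 1:)

    import os
    import subprocess
    import tempfile

    DISK_NUMBER = 1  # assumption for illustration: "list disk" shows the stick as disk 1

    script = "\n".join([
        f"select disk {DISK_NUMBER}",
        "clean",                     # wipe the existing partition table
        "create partition primary",  # one partition spanning the whole stick
        "format fs=fat32 quick",     # a single Windows-native file system
        "assign",                    # give it a drive letter
    ]) + "\n"

    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(script)
        script_path = f.name

    try:
        # diskpart /s runs a scripted session; it needs an elevated prompt
        subprocess.run(["diskpart", "/s", script_path], check=True)
    finally:
        os.remove(script_path)

Five lines of script, once you know they exist; the trouble was that none of the visible interfaces would tell us.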

If I look at old automobiles, there were lots of different ways of operating them.   But fairly early in the development of the automobile, a common user interface emerged.   I don't know of any modern car that is steered with a lever.   Even though newer automobiles have new features, the basic driving interface has stayed mostly compatible since the 1930s or so.   Maybe we're just still in the early phase of Internet user interfaces, and things haven't settled out yet.

> I suspect the people who are so concerned about FTP overheads are
> doing so for philosophical reasons, more than anything else.

I don't think it's "philosophical" to want to have stable and effective interfaces.  I think it's human nature.

But the fundamental nature of engineering is taking components with known and predictable characteristics (because they are designed, tested, and/or selected to have those characteristics), and assembling reliable systems out of those predictable components.  Without predictable components, the whole discipline of engineering falls apart and becomes basically guesswork.   So a lot of us understand at a very deep level that when you start trying to use components that don't have predictable behavior, things break. 

From that point of view, valuing predictable and stable services is not philosophy; it's reality.

And this kind of breakage seems to be happening with increasing frequency.   For example, it used to be that applications could count on the network making a best effort to deliver packets intact from source to destination.  Since the network was making a best effort, there was no need to second-guess the network.   Nowadays, applications cannot depend on that.   There are middleboxes in the network that try to second-guess the applications (whether to "improve" performance, to enforce restrictions, or whatever), and then the applications have to second-guess the network and try to work around the damage.   That's a classic tussle which we're all familiar with, and no matter how it's resolved (if it ever is), it does not promote the development of reliable, predictable systems.
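To make the second-guessing concrete, here's a minimal, generic sketch (not any particular protocol's mechanism) of an application adding its own end-to-end integrity check because it can no longer assume the network delivers bytes intact:

    import hashlib

    def wrap(payload: bytes) -> bytes:
        # prepend a SHA-256 digest so the receiver can verify the bytes end to end
        return hashlib.sha256(payload).hexdigest().encode() + b"\n" + payload

    def unwrap(message: bytes) -> bytes:
        digest, payload = message.split(b"\n", 1)
        if hashlib.sha256(payload).hexdigest().encode() != digest:
            raise ValueError("payload was altered in transit")
        return payload

    assert unwrap(wrap(b"hello")) == b"hello"

That's effort spent re-creating a property applications used to get for free.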


> For a while, when my preferred access method was over AFS, I was
> mirroring FTP and I-D's to a local archive on an AFS cell at MIT.  It
> was *not* a big deal, and if I needed to change the URL used to keep
> my local mirror in sync (as I recall I needed to do once or twice
> over the 10 or so years), it really wasn't a big deal.  And disk
> space has gotten cheaper over the years, so it *really* isn't that
> hard for people to keep their own local mirrors if they really
> wanted to.

Unless, perhaps, you're using one of those new very thin notebooks with a small amount of flash memory, or you're trying to work from your phone or tablet.   Just because my last N laptops have had at least 1TB drives in them doesn't mean the next one will.   (Apple wants a LOT of money for an M1 laptop with a 1TB drive.)
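(That said, for anyone who does have the space, keeping a mirror is still not much work.  A sketch, assuming the IETF's public rsync service still exposes an internet-drafts module as it has historically:)

    import subprocess

    # assumption: rsync.ietf.org still serves an "internet-drafts" module
    subprocess.run(
        ["rsync", "-avz", "--delete",
         "rsync.ietf.org::internet-drafts", "./id-mirror/"],
        check=True,
    )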

But I think this is missing the point.   It's not just that people may need to change sources or protocols - though in the aggregate that is a problem - but also that the new protocols and interfaces being proposed are less functional than the old ones in important ways, and their behavior is less predictable.

If IETF said, for example, that it was going to change from FTP to WebDAV as the means to provide remote file access to our documents, some of us would adapt.   We'd write new tools or modify existing tools if we had to.   The same would be true if IETF decided to use anonymous NFS, or anonymous CIFS, or sshfs, or FTPS, or whatever.   But what we were essentially told is that we'd have to do without such an interface entirely, that (at least in the initial message) the decision had already been made (it was in the past tense), and that the powers that be had decided this based on extremely flawed analysis and basically a disregard for any use of the existing FTP server other than mirroring.
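And adapting really wouldn't be hard.  As a sketch, here's what a minimal WebDAV directory listing looks like from a client's point of view - the URL is hypothetical, since no such service has been offered:

    import urllib.request

    # hypothetical endpoint; IETF does not currently offer WebDAV access
    url = "https://www.ietf.org/dav/internet-drafts/"

    # PROPFIND with Depth: 1 asks for a collection's immediate members (RFC 4918)
    req = urllib.request.Request(url, method="PROPFIND", headers={"Depth": "1"})
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode()[:2000])  # 207 Multi-Status XML listing the directory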

Note also that any change at all will probably deter some users to the point that they stop participating in IETF or significantly reduce their participation.   That's part of the cost of changing interfaces, and it needs to be counted as such.   And unlike some, I think that people who have become habituated to familiar workflows and interfaces should not be disregarded out-of-hand as if they were irrelevant.

More broadly, anyone who thinks they can predict how other people will behave under changing conditions is likely to be surprised, even more so when they think they have a right to demand that people adapt as expected.

> So I wonder if this whole, long, debate, is really more about people
> who don't want to deal with any kind of change, because while *this*
> change might be relatively easy to work around, the *next* one might
> require a bit more work.  But the long-term question about how access
> portals and other technologies should get retired is still going to
> remain.

Well, every change is an unknown.   You don't know in advance how much trouble a change will be, and any change can turn out to be a lot of trouble and very disruptive to the other things you need to do.   For that reason alone there's a tendency to avoid such changes.

As for the long-term question:   IMO the thing to do is to very consciously pick interfaces (machine and user) that are standardized, functional, and stable over long periods of time, so that such questions need to be considered less often.   And sure, once in a great while there's an unavoidable need for change; when that happens, the transition needs to be managed, generally with plenty of advance notice and overlap.
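There's even standardized machinery for the advance-notice part: RFC 8594 defines a Sunset HTTP header by which a service can announce its own retirement date, so clients and scripts can detect the deadline instead of discovering it as breakage.  A sketch, against a made-up URL:

    import urllib.request

    # "example.org/legacy-service/" is a made-up URL, purely for illustration
    req = urllib.request.Request("https://example.org/legacy-service/", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        sunset = resp.headers.get("Sunset")  # e.g. "Sat, 31 Dec 2022 23:59:59 GMT"
        if sunset:
            print("retirement announced for", sunset)
        else:
            print("no retirement announced")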

> Perhaps if there was a deprecation window of, say, a year?  Maybe two
> years?  This gives people *plenty* of time to investigate alternate
> workflows and methods, and still allows for the secretariat and tool
> teams to be able to continue to innovate without having to maintain
> older mechanisms forever.

Certainly I think a transition window is often better than a hard cutoff.

Keith

(I still suspect it might make more sense overall to fix FTP than to abandon it.   Part of that is based on a realization that FTP is still the tool of choice for some circumstances and usage scenarios, and that support for those needs to be maintained; part is a belief that IETF should eat its own dog food whenever that makes sense.   But it might also be the case that, say, WebDAV is better for IETF's document access purposes even if FTP continues to be useful elsewhere, and dog food consumption is not the most important thing in this case.   I just don't think the dog food consumption should be disregarded out-of-hand - it does affect both our reputation and the quality of our work.)


