Re: WG Review: Recharter of Hypertext Transfer Protocol Bis (httpbis)

On 02/26/2012 12:44 AM, Mark Nottingham wrote:

On 26/02/2012, at 11:40 AM, Stephen Farrell wrote:


Mark,

I was going to respond blow-by-blow but there's not much
point in that, other than to say that your mail seems to
me a tad over the top.

Sorry if you think so. I'm VERY sensitive to the risks that we're undertaking here,

Fair 'nuff. I consider myself fairly insensitive generally,
which of course means I don't care what others think about
that :-)

> and I want to be crystal-clear about what we're going to do.
> As I saw it, your path forward was a risky one.

I don't get that. I'd be interested in why it appears that way
to you, and whether there are ways to reduce risk but still try
to make progress on security stuff.

(Maybe you misinterpreted me describing what might happen
as some kind of threat to try to slow people down or something,
I don't know. I do know that I don't do that kind of thing,
and if I tried I probably wouldn't be very good at it anyway ;-)

It may not have been your intent; my concern was that the effect would be the same.

I don't get that either. I think I just pointed out the obvious
based on the proposed charter - at the point of the next proposed
re-charter, there'll be IETF/IESG review.

Anyway, I think we should focus on a way forward that attempts
progress on what we appear to agree is the desirable outcome:

They are VERY VERY VERY interested in security, as are many other
people in the broader community. If the IETF were to come up with a
viable proposal that solves their problems, they would beat down this
organisation's doors.

I proposed a plan that I think might allow us to make progress
on that. I believe we could.

OK, great.

Could you please explain why you think tying this effort to HTTP/2.0 is necessary to achieve that? To me that's the critical bit, and I still haven't seen the reasoning (perhaps I missed it).

That's a fair question that doesn't have a good, quick
answer, and some of the argument applies to the httpbis WG
and not to HTTP/2.0 per se.

Caveats: this is probably something that needs more bandwidth
than mail; most of the points below were already raised by
others on this thread (though I haven't gone back through all
the mail yet); and they're not in any particular order.

- We've not really improved HTTP security in over a decade. It's time.
- The community's appreciation of better security has
  changed in that time as well, so maybe it's more tractable
  now, and we have more experience of real attacks.
- Improving security after the fact is not a good plan.
- To some, a separate security WG seems less likely to
  produce an answer that'd be adopted. (Admittedly, it does
  seem more likely to work for some others.)
- A backwards-incompatible change (if needed) could be
  done much more easily when changing HTTP. It's at least
  time to explore the area with that possibility in mind.
- A scheme less susceptible to phishing that got deployed
  could be very valuable. It's not ridiculous to think that
  might require breaking backwards compatibility somehow.

So, a bunch of things, maybe none individually compelling,
but arguably, taken together, convincing enough that not
attempting again to do something here would really be
inexcusable.

And yes, I do recognise that attempting to solve this does
add some risk. Most good things do.

S.


Thanks,



S



On 02/25/2012 11:25 PM, Mark Nottingham wrote:

On 26/02/2012, at 1:13 AM, Stephen Farrell wrote:

If we just need a new authentication scheme, nothing stops people from
working on that right now.

I don't agree with you there - the perceived low probability that
anything will be deployed is a real disincentive here. We have had
people who wanted to work on this and were told there's no point
because it won't get adopted.

Let's speak plainly here.

It's not hard for new things to be deployed -- there are lots of new things being deployed on the Web right now.

What IS hard is forcing browser vendors to deploy something that they don't think is going to work, or that they're not terribly interested in.

They are VERY VERY VERY interested in security, as are many other people in the broader community. If the IETF were to come up with a viable proposal that solves their problems, they would beat down this organisation's doors.

The IETF may have some smart people, and they may have ideas about Web security, but that doesn't necessarily make them into workable solutions for the various stakeholders. The discussion about Web security is much broader, involving not only the browser vendors, but also organisations like the W3C, IIW, etc.

Using HTTP/2.0 as a mechanism to shortcut all of this and force what people in the IETF think is a good solution down implementers' throats is going to lead to a very predictable and messy fail.


With this plan, if httpbis in fact selects zero new proposals,
that would represent a failure for all concerned. The "zero
or more" term is absolutely not intended to provide a way to
just punt on the question.

This makes me very uncomfortable. How hard and long do we have to try before we convince you that we're not punting?


Such a failure, at the point where httpbis was re-chartering
to work on an HTTP/2.0 selection with no better security than
we now have, is probably better evaluated as a whole - I
guess the question for the IETF/IESG at that point would
be whether the Internet would be better with or without
such a beast, or better off waiting a while until the security
thing did get fixed.


If we "wait a while until the security thing [does] get fixed", I'll gladly bet any amount of money you care to name that SPDY will gain market traction so quickly as to make any HTTP/2.0 effort in IETF completely backwards-looking. Refer to the debacle with Cookies for one such example.

So, again, I'm fine with allowing new authentication schemes to be proposed, on the off chance that some genius will suddenly propose something that meets all of the myriad requirements (which still aren't well-defined) and gets consensus and buy-in from implementers.

However, requiring us to output such a beast and gating HTTP/2.0 on it is effectively asking us to spin our wheels for n years. And the folks working on SPDY will - quite rightly - walk away and do their work somewhere more sane.

And I still don't see why it's necessary to do this all in one effort. Julian's point was that the only technical reason to combine authentication work with re-specifying HTTP is to make the new authentication scheme MTI. Otherwise, the work can be done separately. Roy has expressed interest in coming up with a new scheme that isn't necessarily targeted at browsers -- which is great, but I don't see why it has to be part of this WG.
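
To make the "can be done separately" point concrete: HTTP's existing authentication framework already lets a server advertise additional schemes in a WWW-Authenticate challenge, and clients simply ignore scheme names they don't recognise. A very rough sketch (Python; the scheme name "NewScheme" is a hypothetical placeholder, not anything proposed in this thread):

    # Rough sketch only: shows how a new HTTP authentication scheme slots
    # into the existing WWW-Authenticate challenge framework without any
    # change to the protocol version. "NewScheme" is hypothetical.

    def offered_schemes(www_authenticate):
        """Return the scheme names offered in a WWW-Authenticate value.

        Very rough parsing for illustration; real parsers must handle
        commas inside quoted parameter values and other edge cases.
        """
        schemes = []
        for part in www_authenticate.split(","):
            token = part.strip().split(" ", 1)[0]
            # Auth parameters look like name=value; scheme names don't
            # contain '='.
            if token and "=" not in token:
                schemes.append(token)
        return schemes

    # A server advertising a hypothetical new scheme alongside Digest:
    header = 'Digest realm="example", qop="auth", NewScheme realm="example"'
    print(offered_schemes(header))  # ['Digest', 'NewScheme']

That extensibility is why the scheme work itself doesn't need a new protocol revision; tying it to HTTP/2.0 only buys you the option of making the new scheme mandatory to implement.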

We potentially have a LOT of work on our plate, and that work is channeling the energy behind a new serialisation of HTTP into something worthy of being called HTTP. Doing research and development on Web security is a very different kind of work, and the two don't mix well, IME.



--
Mark Nottingham   http://www.mnot.net/





