Re: OAuth blog post

Whose library? (rhetorical question).

In my experience, the issue is pretty straightforward, and it's what this OAuth fellow exemplified - technology leaders taking control of a standard for their strategic benefit. This is not a new phenomenon; it's par for the course, and it's a principal reason why I have mandated since the 80s that we (SSI) do not get locked in on any one standard protocol, including PPP, RADIUS, TELNET, FTP, SMTP, POP3, HTTP and the others our products are strategically based on. This is not odd, and product/project managers who don't recognize that protocols are applied to the benefit of their company and product lines are, well, not doing their company any great service. The IETF is supposed to be the watchdog against protocol standards being changed on the community, yet more and more often you see the mantra of "Who's Who", or basically the old GM theory - "What's good for GM is good for ..." - behind changed standards or pseudo-developed standard protocols.

Overall, we have a conflict of interest here. If someone has no particular interest in a protocol, it makes you wonder why they would be involved or why they are even carrying a corporate badge. I, for one, would not appreciate an employee doing work that has a direct or indirect conflict of interest, and I would be pissed if he didn't recognize it. That's not odd or evil. That's par for the course.

Which brings us to the "Whose Library?" question. If one is to follow the market and technology leaders, does that mean you are using their library as well? I'm sure that does not always fit with every corporate IP policy, and it fit even less in the past, when source code was developed in-house under tighter control, than it does today with all the globalization and participation going on. Look at the mess that (open source) created! It's all fun when you're young, but one day you have to bring home the bacon.

I think it is a mistake to assume that using a library is the answer, or rather that it is something that defines a protocol's direction and the people who define it. One question might be: what language is the library in? Would it be C/C++? Some Perl scripting language? Something else? What if the organization has its own strategic "added value" language for its customers? You don't have to go further than .NET for an example. What if the OAuth library were offered only in .NET? I'm sure the *nix world would not be jumping for joy there.

All this, and I've hardly touched on the many more reasons to be careful of these protocol definition/direction issues - why one has to be very careful and also, if one can, be active, if only vocal and watchful when need be, of people, in particular those very active in the IETF process, taking control and advocating a method and direction that can have a major, negative impact on you. It's a tough issue, because decisions do need to be made, but it's getting harder to swallow the "Rough Consensus" stick, especially when a highly subjective issue is involved.

In any case, I believe the IETF should take heed of what has occurred with the OAuth incident. I don't believe it is an isolated issue, and I am sure many here agree. I am not sure what can be learned, other than possibly that the IETF needs to be more watchful of standards moving away from network-wide, community open standards toward ones that feasibly benefit only a much smaller set of, albeit larger, entities.

At the end of the day, it's all about cost, and while a library may help address that, it is not always the best solution - unless we do want specific market and technology leaders in control of a particular protocol, library or no library. In my view, the community MUST have an IETF around to watch for standards being strategically changed on people. Anything less assumes that everyone has equal capability to stay on par or get out of the way, and that presents conflicts of interest and possible anti-trust issues. The GM theory does not always apply.

--
HLS

Hannes Tschofenig wrote:
In the identity management case we are not necessarily talking about solutions that are "good" or "bad". The issue is that certain people care about one use case and other people care about other use cases. I use the term "use case" in a generic sense to also include certain deployment assumptions (e.g., has to work with existing programming languages, deployment environments) or design themes (e.g., XML vs. JSON). So, for example, in the OAuth case there are people who care a lot about different Websites sharing data with each other (the photo sharing / photo printing use case). Again, others think that the smart phone use case is more important. The solutions for these two cases are slightly different (because they can rely on different assumptions).

Initially, people start with a single use case they care about. The work gets attention, other people start to use the protocol as well, and they notice that it does not meet their use cases. So, they add functionality. Over time the set of specifications becomes more complex and a beginner does not see through the specification jungle anymore. Then these newcomers start from scratch to fix all these "complex protocols". Typically, these persons like to reject any idea that was done in the past (such as learning from the experience the previous generation made). The cycle starts from the beginning. We went through these cycles several times already in the identity management world.

I believe that application developers shouldn't even worry about the details of the protocol suite. They should be using a library instead. We use libraries all the time, particularly with security protocols. Take TLS as an example. No application developer would come up with the idea of writing their own TLS stack either. They let security professionals write those libraries.
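
(To make that point concrete, here is a minimal sketch, assuming Python and its standard-library ssl module; the host name is just a placeholder, not a real endpoint. The application developer leans on a vetted TLS library rather than implementing the handshake and record layer themselves.)

    # Minimal sketch: the app developer calls a vetted TLS library
    # instead of writing their own TLS stack.
    import socket
    import ssl

    hostname = "example.com"   # placeholder host, for illustration only

    # create_default_context() supplies the hard parts: protocol version
    # negotiation, certificate verification against the system trust
    # store, and host name checking - none of it written by the app author.
    context = ssl.create_default_context()

    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            print(tls.version())   # e.g. "TLSv1.3", negotiated by the library

The application code stays a handful of lines; everything security-critical lives in the library, which is exactly the division of labor being argued for.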





