On 13 June 2017 at 18:50, Christian Huitema <huitema@xxxxxxxxxxx> wrote:
> Then there is grease, or greasing, which is a somewhat recent
> development. The idea is to have some implementations forcefully
> exercise the extension points in the protocol, which will trigger a
> failure if the peer did not properly implement the negotiation of these
> extensions, "grease the joints" in a way. That's kind of cool, but it
> can only be implemented by important players. If an implementation has
> 0.5% of the market, it can try greasing all it wants, but the big
> players might just as well ignore it, and the virtuous greasers will
> find themselves cut off. On the other hand, if a big player does it, the
> new implementations had better conform. Which means that greasing is
> hard to distinguish from old fashioned "conformity with the big
> implementations", which might have some unwanted consequences. Should it
> be discussed?

I think the pressure of large deployments on smaller ones works in both good and, more often, bad ways. Within the XMPP community, we saw this multiple times. Facebook and Google both deployed XMPP services over the years, and each was larger than the rest of the community put together. You've described such cases as "important players", but I'd prefer to simply describe them as large.

Facebook's involvement was restricted to client-to-server (C2S) links, and in general it worked reasonably well, though various custom extensions were used that performed similar functions to standardized ones.

Google's service did not support SRV lookups, and mandated the (at the time) legacy immediate-mode TLS instead of XMPP's standard StartTLS, for example. This was by design, apparently for security (though more likely for deployment considerations), but it meant that client developers often had to hardcode server discovery for the service.
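To make the greasing idea quoted above concrete, here is a toy sketch (all extension IDs and function names are illustrative, not from any real protocol): a client offers a deliberately unknown extension ID alongside real ones, a compliant peer ignores what it doesn't recognize, and a brittle peer fails the negotiation.

```python
import random

# Extensions this toy server actually implements.
KNOWN_EXTENSIONS = {0x0001: "compression", 0x0002: "resumption"}

def client_hello():
    """Offer real extensions plus one "grease" value the server won't know."""
    grease = random.choice([0x0A0A, 0x1A1A, 0x2A2A])  # reserved-looking IDs
    return [0x0001, grease, 0x0002]

def compliant_server(offered):
    """A correct server skips unknown IDs and negotiates the rest."""
    return [ext for ext in offered if ext in KNOWN_EXTENSIONS]

def brittle_server(offered):
    """A broken server aborts on anything it doesn't recognize."""
    for ext in offered:
        if ext not in KNOWN_EXTENSIONS:
            raise ValueError(f"unknown extension {ext:#06x}")
    return offered

offer = client_hello()
compliant_server(offer)  # negotiation survives the grease
```

Against `compliant_server` the grease is harmless; against `brittle_server` it surfaces the bug immediately - which is exactly the failure a small implementation can detect but, as you say, cannot force anyone to fix.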
Google did provide S2S peering, but - paradoxically, given the above - did not operate TLS at all, preventing any other service from mandating TLS for several years. When Google withdrew support for the service (about four years ago, with the introduction of Hangouts), the community almost universally switched to mandatory TLS within a few months.

At the same time, Google's S2S peering made heavy use of multiplexing (known as piggybacking within XMPP), which was unusual at the time and could easily cause problems with servers - those servers were simply forced to update. One might consider this case "virtuous greasing".

One can easily look at DMARC as another example of large players versus small: DMARC obviously fails to work in the mailing-list case (amongst others), yet the large deployments simply don't care, since it solves their problems, and to hell with everyone else.

Perhaps standards simply work best in a balanced community?
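For anyone who hasn't hit the DMARC mailing-list problem directly, a heavily simplified sketch of the alignment check (real DMARC, per RFC 7489, also covers SPF, relaxed vs. strict modes, and policy records; the domains below are illustrative):

```python
# Simplified DMARC identifier alignment: the domain that authenticated
# the message (e.g. the DKIM signing domain) must match the From: domain.
def dmarc_aligned(from_domain, dkim_domain):
    """Strict alignment check between From: and the authenticated domain."""
    return from_domain.lower() == dkim_domain.lower()

# Direct mail: the author's own server signs the message - aligned.
dmarc_aligned("example.org", "example.org")        # True

# Via a mailing list: From: still names the author, but the list resigns
# (and often modifies) the message under its own domain - not aligned,
# so the author's reject/quarantine policy bounces legitimate posts.
dmarc_aligned("example.org", "lists.example.net")  # False
```

The list did nothing wrong by pre-DMARC standards; the failure lands on everyone except the large deployments that published the policy.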