Comments below.

> On Dec 3, 2021, at 09:12, Brian Rosen <br@xxxxxxxxxxxxxx> wrote:
>
> There is a user who is sitting in front of some client.
> There is a provider who has implemented the RUE server interface.
>
> We want to allow the user to be able to use a WebRTC client. When they do, there is a third piece: a gateway that has a WebRTC server on one side and a RUE client interface on the other. Unlike a real WebRTC situation, the other end isn't necessarily a WebRTC client and, in particular, may be just a SIP client.
>
> A provider is free to implement a direct WebRTC interface to their service. That would not conform to the RUE interface, and our document is not intended to cover that case.

[BA] This is helpful. It might make sense to add a section to the document explaining the model for WebRTC support. I also think you might want to clarify at various points that the requirements are for the "RUE" side, not for the WebRTC interface.

> We want to allow ICE to be implemented, but not necessarily used. Would it be acceptable to note that in the exceptions for RFC 8835 conformance?

[BA] This is where explaining the model and separating the requirements might be helpful. ICE is both mandatory to implement and mandatory to use for WebRTC, so the WebRTC side of the gateway would need to support it. But for the "RUE" side, that is something that can be negotiated in SIP.

> I don't see anything in 8835 that requires dual stack.

[BA] RFC 8835 does require being able to handle IPv6 addresses and candidates. But again, those requirements would apply to the WebRTC side of the gateway, not necessarily to the "RUE" side.

>> As above, there would need to be a gateway between a WebRTC client (which would be using the data channel to send RTT) and the RUE client interface. That gateway would need to unwrap the RTT from the data channel, wrap it in RTP per RFC 4103, and vice versa. We recognize the ugliness of this solution.
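[BA] For concreteness, the RTP leg of that interworking could be sketched roughly as below. This is only an illustration, not anything from the draft: payload type 98 and the function names are hypothetical, and a real gateway would also have to handle RFC 2198 redundancy and the buffering/idle rules of RFC 4103.

```python
import struct

def t140_to_rtp(text, seq, timestamp, ssrc, pt=98, marker=False):
    """Wrap a chunk of T.140 text (received over a WebRTC data channel)
    in a minimal RTP packet per RFC 4103 (text/t140 payload format).
    pt=98 is an illustrative dynamic payload type that would be
    negotiated in SDP (a=rtpmap:98 t140/1000); the marker bit is set
    on the first packet after an idle period."""
    first_byte = 0x80                          # V=2, P=0, X=0, CC=0
    second_byte = (0x80 if marker else 0) | pt # M bit + payload type
    header = struct.pack("!BBHII", first_byte, second_byte,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + text.encode("utf-8")       # payload is UTF-8 T.140 text

def rtp_to_t140(packet):
    """Reverse direction: strip the RTP header and recover the T.140
    text to forward over the data channel (ignores RFC 2198 redundancy,
    which a real gateway would also need to process)."""
    if len(packet) < 12:
        raise ValueError("truncated RTP packet")
    cc = packet[0] & 0x0F                      # CSRC count extends the header
    return packet[12 + 4 * cc:].decode("utf-8")

pkt = t140_to_rtp("hello", seq=1, timestamp=1000, ssrc=0x1234, marker=True)
assert rtp_to_t140(pkt) == "hello"
```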
> I was not involved in the development of WebRTC and I don't understand why it uses the data channel, but I have been dealing with the consequences of that in several venues, and it's a big problem. There is increasing deployment and use of RTT, and it all uses RFC 4103. But it is what it is, and the gateway will have to interwork. It has been one of the largest challenges in the experimental implementation that is underway for RUE. Is there some text change we need?

[BA] The text change would be to explain the interworking model.

> If we left this alone, it would mean that ICE-lite couldn't be used with a non-WebRTC client. We probably have to allow a RUE client to use ICE-lite, but the server can't assume it. I will address that. Both sides have to support full ICE, but we don't require that it be used. Is there another problem I'm not seeing?

[BA] The requirement for full ICE is on a WebRTC client. The WebRTC gateway could support ICE-lite. The "RUE" side could negotiate ICE in SIP.

>> [BA] Is RUE really a WebRTC "non-browser" endpoint? If the goal is to allow
>> RUE to be easily built on top of native WebRTC libraries such as libwebrtc or
>> pion, then it should inherit WebRTC requirements such as ICE, dual stack
>> support, etc.

> There really isn't an exact fit with non-browser endpoint, but it was the closest we could get.
> So we need to highlight exceptions. The server should be able to handle a client that offers full ICE.

[BA] I would remove the statement that RUE is a WebRTC "non-browser" endpoint. It isn't. The gateway model you've described allows full WebRTC browser and non-browser endpoints to be supported, as well as "RUE" endpoints. They have different capabilities and requirements.

>> [BA] Since MediaStreamTracks are how audio/video is obtained from devices (or
>> rendered), I don't understand how a WebRTC application (browser or non-browser)
>> can function without them.
>> Elsewhere, the specification states that the data
>> channel isn't used; now it seems to say that audio/video isn't used either.

> I may have been too broad in what we were attempting to do. We definitely are not using JSEP or the Media Capture and Streams API, so there isn't any real manifestation of a MediaStreamTrack. There is just the RTP and associated SDP. Could you suggest how we might describe that better?

[BA] If you explain the gateway model, then it will be clear that you can fully support WebRTC browser and non-browser endpoints via the gateway. With respect to the RUE side, you don't really deal with MediaStreamTracks at all, because the RUE client isn't a WebRTC endpoint and so doesn't need to support the WebRTC APIs, JSEP, MediaStreamTracks, etc.

> There clearly is an error here. I tried to track back to see where/how I created it. The reference should be to Section 5, SDP Identity. We can't use most of the rest of RFC 8827 because it's about WebRTC signaling and the consequences of that. All we can pull in is the Identity SDP exchange. We have a lot of the meat of what is in RFC 8827 in our text (we require DTLS-SRTP, we require TLS on SIP, ICE, etc.).

[BA] The gateway model explanation can help here too, because it would make clear that you have WebRTC clients that need to conform to RFC 8827 and "RUE" clients that aren't based on WebRTC, for which RFC 8827 isn't relevant.

> As I said, I wasn't there, and I don't know whether the practical problems of sending T.140 in RTP were discussed. I know we've implemented it in mobile networks, in emergency service networks, and in Video Relay Services. It's required (regulatory) in some jurisdictions. We can't make that go away. It's deployed all over and is being used to replace TTY.

[BA] Understood. Is there any practical experience with WebRTC/RTT gateways yet? I'm particularly curious about multiparty RTT implementations.

--
last-call mailing list
last-call@xxxxxxxx
https://www.ietf.org/mailman/listinfo/last-call