Re: Playing media into bridge via HTTP

Matthew Jordan wrote:

Kia ora,



There are actually two different use cases in here, both of which are
worth pursuing/discussing.

The first is being able to specify a remote resource to play to a
channel/bridge via a URI, i.e.,

POST /channels/1234/play?media=http:%2F%2Fmyawesomeserver%2Fmonkeys.wav

You would cURL that down to the local filesystem and play it back. As
Ben mentioned, ideally you would also cache the result so that a
subsequent request to that URL simply plays the media from the local
file system. This use case, by itself, would be an excellent addition to
ARI/Asterisk.

Indeed. I think using the existing URI scheme implementations we have, but falling back to bucket with a bucket http/https scheme implementation, would allow this to be plugged in easily. That way, further down the road, we could add more bucket implementations without having to touch ARI. We could also migrate the current implementations over in time.
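A rough sketch of that fallback idea, in Python rather than Asterisk's C internals - the handler names and return values here are purely hypothetical stand-ins, not real Asterisk identifiers:

```python
from urllib.parse import urlparse

# Stand-ins for the URI scheme implementations Asterisk already has
# (hypothetical names and behaviour).
existing_schemes = {
    "sound": lambda uri: f"local-media:{uri}",
}

# Bucket scheme implementations, tried as a fallback; more could be
# added later without touching ARI.
bucket_schemes = {
    "http": lambda uri: f"bucket-fetch:{uri}",
    "https": lambda uri: f"bucket-fetch:{uri}",
}

def resolve_media(uri: str) -> str:
    """Try the existing scheme implementations first, then fall back to bucket."""
    scheme = urlparse(uri).scheme
    if scheme in existing_schemes:
        return existing_schemes[scheme](uri)
    if scheme in bucket_schemes:
        return bucket_schemes[scheme](uri)
    raise ValueError(f"no scheme implementation for {uri!r}")

print(resolve_media("http://myawesomeserver/monkeys.wav"))
```

The point of the shape is that ARI only ever calls one resolution path; new schemes land in the bucket table without ARI changes.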

The second is a bit more complex: you have a remote system that is
constantly streaming media and you want to pipe that media to an ARI
resource. Since this is a constant stream of media with no well-defined
beginning/end, this is a bit more involved than simply pulling a file
down and playing it. This could be an RTP stream, but it could also be
something else - although specifying it as an RTP stream is a good
starting point.

This use case is a bit more complex: the /play operation essentially
doesn't have a well-defined 'file' to stream to a resource. What's more,
you can't rewind/fast-forward/restart a stream. Let's say we try to view
this as an operation similar to MoH, where you have the ability to
start/stop the media to the channel, but it is essentially just an
endless stream of media. In that case, we could extend the MoH operation
to specify a remote stream to play as opposed to a local stream. If it's
an RTP stream, we'd have to specify how you want to receive that stream
- which means that, under the hood, you're probably creating something
similar to a channel. This might look something like:

POST
/channels/1234/moh?mohClass=remote&format=ulaw&src=0%2E0%2E0%2E0%3A10000

That is, we expect a media stream to occur in format ulaw and we're
going to attempt to read it on port 10000 (binding to all addresses).
There are a lot of implementation details in making something like this
work - we have to have something reading from that address, turning it
into frames, and then distributing those frames to all channels that
want the stream (you wouldn't want this to be tied to the single channel
that makes the MoH request to that class/address - you'd want to be able
to share the stream). That implies some form of bridge to distribute the
media to all channels that want it. Implicit behaviour is one thing I'd
like to avoid.
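The read-and-distribute part of that can be sketched in miniature - this is plain Python, with a localhost UDP socket standing in for the RTP reader and queues standing in for subscribed channels; none of it is actual Asterisk code:

```python
import queue
import socket

# Queues standing in for the channels that want the shared stream.
subscribers = [queue.Queue(), queue.Queue()]

def distribute(frame: bytes) -> None:
    """Hand one media frame to every subscriber, bridge-style."""
    for q in subscribers:
        q.put(frame)

# The reader; the example in the email binds 0.0.0.0:10000, but here we
# take an ephemeral localhost port so the sketch is self-contained.
reader = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reader.bind(("127.0.0.1", 0))

# Simulate one incoming packet of the remote stream.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"\x00" * 160, reader.getsockname())  # 20 ms of ulaw at 8 kHz

frame, _ = reader.recvfrom(2048)
distribute(frame)
print(f"fanned {len(frame)} bytes out to {len(subscribers)} subscribers")

sender.close()
reader.close()
```

One reader, many consumers - which is exactly the implicit bridge the paragraph above is wary of baking into a MoH class.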

Another way to view this - and the way I'd prefer to view it - is that
we're really creating another path of communication from Asterisk to
some remote media stream. Yes, that path of communication is potentially
one-way - but it is a constant, never-ending stream of media coming from
an external source, and that is really the point of a channel in
Asterisk. This use case thus feels like it is better served by a
dedicated channel of some sort - special purpose in the same way that a
snoop channel is special purpose. The standard way of creating channels
can be used to create such a media stream channel:

POST
/channels/1234?endpoint=RTPStream%2F0%2E0%2E0%2E0%3A10000&format=ulaw&app=myStasisApp
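For reference, the escaping in these example requests is ordinary percent-encoding. A quick Python sketch of building (and round-tripping) the query string above - the channel ID, endpoint, and app name are just the hypothetical values from the example:

```python
from urllib.parse import parse_qs, quote

# Hypothetical parameters from the example request above.
params = {
    "endpoint": "RTPStream/0.0.0.0:10000",
    "format": "ulaw",
    "app": "myStasisApp",
}

# Percent-encode each value. Note '.' is an unreserved character, so it
# may stay literal even though the example writes it as %2E - both
# decode to the same thing.
query = "&".join(f"{k}={quote(v, safe='')}" for k, v in params.items())
print(f"POST /channels/1234?{query}")

# Round-trip: decoding recovers the original endpoint string.
decoded = {k: v[0] for k, v in parse_qs(query).items()}
assert decoded["endpoint"] == "RTPStream/0.0.0.0:10000"
```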

Once you have the remote streaming channel, you can do whatever you want
with it. Make it the announcer in a holding bridge. Create a snoop
channel, put both channels in a bridge together, and whisper music to a
channel. Put the stream in a mixing bridge with however many channels
you want. Basically, it's up to you to do what you want with it. And
there's nothing that says this has to be an RTP stream - you could
really use ARI as the signalling mechanism to set up an RTP stream with
anything - but going down that road, you will eventually have to handle
a full SDP in the JSON body, which is quite a chunk of work. I'd punt
that down the road for now, as that requires a bit more thought.

The second option feels more flexible/powerful, and it fits in with the
model of bridges/channels that Asterisk/ARI uses.

Agreed! All the APIs exist to make this possible, and it could be done as a channel driver for use elsewhere - basically a unidirectional version of the MulticastRTP channel driver.

Cheers,

--
Joshua Colp
Digium, Inc. | Senior Software Developer
445 Jan Davis Drive NW - Huntsville, AL 35806 - US
Check us out at: www.digium.com & www.asterisk.org

_______________________________________________
asterisk-app-dev mailing list
asterisk-app-dev@xxxxxxxxxxxxxxxx
http://lists.digium.com/cgi-bin/mailman/listinfo/asterisk-app-dev



