Re: Playing media into bridge via HTTP

On Fri, Apr 11, 2014 at 3:38 PM, Matthew Jordan <mjordan@xxxxxxxxxx> wrote:


On Fri, Apr 4, 2014 at 10:00 AM, Ben Langfeld <ben@xxxxxxxxxxx> wrote:
This is very much functionality that should be in Asterisk, not only via ARI, and which is present in pretty much every other IVR platform (see FreeSWITCH's http://wiki.freeswitch.org/wiki/Mod_http_cache). There are obvious caching issues to be considered, but I'd love to see this present and accessible via all interfaces.

On 4 April 2014 10:15, Ben Merrills <b.merrills@xxxxxxxxxxxxxxxx> wrote:
Hi All/Dan,

I have brought this up myself once or twice, and having the ability to play audio from a remote source (an additional URI scheme in /play) would be of great benefit. Here are some general reasons I believe this is important.

1. It allows quick integration of TTS into ARI without having to wait for the Generic Speech API to be adopted. Being able to play a remote wav would let a TTS engine (say, a free one like http://tts-api.com, or a local instance that can generate a wav or mp3) play audio directly into the call; see the sketch just after this list. As ARI has no Exec or other TTS integration as yet, this would help bridge the gap.

2. It allows for remote audio to be played. As a Stasis application has no requirement to be hosted on, or local to, the Asterisk instance, getting audio files onto that server is a problem: they either have to be copied there in advance or already exist there. I did mention a while back that a set of ARI features for uploading and downloading files would be very useful; again, a remote audio scheme for Play helps to bridge this gap.
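To make point 1 concrete, here's a minimal sketch (Python, using the requests library) of what a Stasis app could do if the proposed http: scheme existed. The http: media scheme is exactly what's being proposed here (it doesn't exist yet), and the TTS URL format below is illustrative only, not a documented tts-api.com contract:

import urllib.parse

import requests  # third-party HTTP client

ARI_BASE = "http://localhost:8088/ari"  # default ARI HTTP address
ARI_AUTH = ("asterisk", "asterisk")     # an ari.conf user

def play_tts(channel_id, text):
    """Play TTS audio to a channel via the *proposed* http: media scheme."""
    # Illustrative TTS URL -- any service that returns a wav for a GET would do.
    tts_url = "http://tts-api.com/tts.wav?" + urllib.parse.urlencode({"q": text})

    # 'http:' as a media scheme is the proposal; today ARI only understands
    # schemes like sound:, recording:, number: and digits:.
    resp = requests.post(
        ARI_BASE + "/channels/" + channel_id + "/play",
        params={"media": "http:" + tts_url},
        auth=ARI_AUTH,
    )
    resp.raise_for_status()

play_tts("1234", "Your call is important to us.")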

Those are my thoughts. I hope they're useful, coming from another person actively using ARI to write applications :)


-- Dan wrote:
Hello All,

Was talking to a few people yesterday about how I'd like to be able to play media from an external source into Asterisk using ARI - say, when adding someone to a bridge, having prompts/hold music come from an external source.

I know Paul B talked about this a month or so ago - http://lists.digium.com/pipermail/asterisk-app-dev/2014-March/000408.html

That ended up being a technical discussion about local channels, etc. I'd like to get to the point where we find out whether this is something people want, so the Asterisk team can decide if it's worth putting into their timescales - hence the new topic. Sorry if anyone disagrees!

For me, external source = HTTP(S)

A really basic example of what I mean: the Absolute Radio MP3 stream (they have many other stream codec types - FLAC being the best they offer):
http://network.absoluteradio.co.uk/core/audio/mp3/live.pls?service=vrbb
Obviously there may be licensing issues with *that* stream, but you get where I'm coming from - and there are issues when it comes to different codecs.
Going a little further, it may not be a stream at all; it may just be a file over HTTP. Essentially, I wouldn't want to have to put files onto the same filesystem as Asterisk - my ARI application may live outside of Asterisk itself.
What do people think about this?
Dan



There are actually two different use cases in here, both of which are worth pursuing/discussing.

The first is being able to specify a remote resource to play to a channel/bridge via a URI, i.e.,

POST /channels/1234/play?media=http:%2F%2Fmyawesomeserver%2Fmonkeys.wav

You would cURL that down to the local filesystem and play it back. As Ben mentioned, ideally you would also cache the result so that a subsequent request for that URL simply plays the media from the local filesystem. This use case, by itself, would be an excellent addition to ARI/Asterisk.
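The download-and-cache logic might look something like this, sketched in Python (Asterisk itself would of course do this in C; the cache location and naming here are illustrative):

import hashlib
import os
import urllib.request

CACHE_DIR = "/var/lib/asterisk/sounds/remote-cache"  # illustrative location

def fetch_media(url):
    """Return a local path for a remote media URL, downloading on a cache miss."""
    key = hashlib.sha256(url.encode()).hexdigest()  # cache key: hash of the URL
    ext = os.path.splitext(url)[1] or ".wav"        # keep the extension for format detection
    path = os.path.join(CACHE_DIR, key + ext)
    if not os.path.exists(path):                    # miss: the cURL-equivalent download
        os.makedirs(CACHE_DIR, exist_ok=True)
        urllib.request.urlretrieve(url, path)
    return path  # hand this to the normal file-based playback machinery

A production version would also want to honour Cache-Control/ETag headers so stale media eventually expires from the cache.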

The second is a bit more complex: you have a remote system that is constantly streaming media, and you want to pipe that media to an ARI resource. Since this is a constant stream of media with no well-defined beginning/end, this is a bit more involved than simply pulling a file down and playing it. This could be an RTP stream, but it could also be something else - although specifying it as an RTP stream is a good starting point.

This use case is a bit more complex: the /play operation essentially doesn't have a well-defined 'file' to stream to a resource. What's more, you can't rewind/fast-forward/restart a stream. Let's say we try to view this as an operation similar to MoH, where you have the ability to start/stop the media to the channel, but it is essentially just an endless stream of media. In that case, we could extend the MoH operation to specify a remote stream to play as opposed to a local one. If it's an RTP stream, we'd have to specify how you want to receive that stream - which means that under the hood, you're probably creating something similar to a channel. This might look something like:

POST /channels/1234/moh?mohClass=remote&format=ulaw&src=0%2E0%2E0%2E0%3A10000

That is, we expect a media stream in ulaw format, and we're going to attempt to read it on port 10000 (binding to all addresses). There are a lot of implementation details to make something like this work: we have to have something reading from that address, turning the media into frames, then distributing those frames to all channels that want that stream (you wouldn't want this to be tied to the single channel that makes the MoH request to that class/address - you'd want to be able to share the stream). That implies some form of bridge being created implicitly to distribute the media to all channels that want it, and implicit behaviour is one thing I'd like to avoid.
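To sketch the machinery that paragraph implies - one reader on the socket, fanning frames out to every subscriber - in Python (pure illustration; the real thing would be C frames distributed via a bridge, and real RTP needs more than fixed-header stripping):

import socket

class RemoteStreamDistributor:
    """Read a media stream from a UDP port and fan it out to subscribers."""

    def __init__(self, bind_addr="0.0.0.0", port=10000):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind((bind_addr, port))
        self.subscribers = []  # every channel that wants this stream

    def subscribe(self, on_frame):
        self.subscribers.append(on_frame)

    def run(self):
        while True:
            packet, _src = self.sock.recvfrom(2048)
            payload = packet[12:]  # strip the fixed 12-byte RTP header, keep the ulaw payload
            for on_frame in self.subscribers:
                on_frame(payload)  # one reader, shared by all subscribers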

Another way to view this - and the way I'd prefer to view it - is that we're really creating another path of communication from Asterisk to some remote media stream. Yes, that path of communication is potentially one-way - but it is a constant, never-ending stream of media coming from an external source, and that is really the point of a channel in Asterisk. This use case thus feels like it is better served by a dedicated channel of some sort - special purpose in the same way that a snoop channel is special purpose. The standard way of creating channels can be used to create such a media stream channel:

POST /channels/1234?endpoint=RTPStream%2F0%2E0%2E0%2E0%3A10000&format=ulaw&app=myStasisApp

Once you have the remote streaming channel, you can do whatever you want with it. Make it the announcer in a holding bridge. Create a snoop channel, put both channels in a bridge together, and whisper music to a channel. Put the stream in a mixing bridge with however many channels you want. Basically, it's up to you to do what you want with it. And there's nothing that says that this has to be an RTP Stream - you could really use ARI as the signalling mechanism to set up an RTP stream with anything - but going down that road, you will eventually have to handle a full SDP in the JSON body, which is quite a chunk of work. I'd punt that down the road for now, as that requires a bit more thought.
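Once such a channel exists, everything else is stock ARI. A quick Python sketch - only the RTPStream/ endpoint tech is hypothetical; the bridge operations and the announcer role exist today:

import requests

ARI = "http://localhost:8088/ari"
AUTH = ("asterisk", "asterisk")

# 1. Originate the remote-stream channel. The RTPStream/ channel tech is
#    the hypothetical part; it doesn't exist in Asterisk today.
chan = requests.post(ARI + "/channels", auth=AUTH, params={
    "endpoint": "RTPStream/0.0.0.0:10000",
    "app": "myStasisApp",
}).json()

# 2. Create a holding bridge -- a real ARI bridge type.
bridge = requests.post(ARI + "/bridges", auth=AUTH,
                       params={"type": "holding"}).json()

# 3. Add the stream as the announcer, so every waiting channel hears it.
requests.post(ARI + "/bridges/" + bridge["id"] + "/addChannel", auth=AUTH,
              params={"channel": chan["id"], "role": "announcer"})

The holding bridge handles distributing the announcer's media to every waiting channel, which replaces the implicit bridge the MoH approach would have needed.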

The second option feels more flexible/powerful, and it fits in with the model of bridges/channels that Asterisk/ARI uses.

Thoughts?


--
Matthew Jordan
Digium, Inc. | Engineering Manager
445 Jan Davis Drive NW - Huntsville, AL 35806 - USA




It seems sensible to me - I think that covers everything I would want as an end user for the time being.
_______________________________________________
asterisk-app-dev mailing list
asterisk-app-dev@xxxxxxxxxxxxxxxx
http://lists.digium.com/cgi-bin/mailman/listinfo/asterisk-app-dev
