Re: Playing media into bridge via HTTP


 



This is functionality that very much belongs in Asterisk itself, not only via ARI, and it is present in pretty much every other IVR platform (see FreeSWITCH's mod_http_cache: http://wiki.freeswitch.org/wiki/Mod_http_cache). There are obvious caching issues to be considered, but I'd love to see this present and accessible via all interfaces.


On 4 April 2014 10:15, Ben Merrills <b.merrills@xxxxxxxxxxxxxxxx> wrote:
Hi All/Dan,

I have brought this up myself once or twice, and having the ability to play audio from a remote source (an additional URI scheme for Play) would be of great benefit. Here are some general reasons I believe this is important.

1. It allows quick integration of TTS into ARI without having to wait for the Generic Speech API to be adopted. Having the ability to play a remote WAV would allow a TTS engine (say, a free one like http://tts-api.com, or a local instance that can generate a WAV or MP3) to play audio directly into the call. As ARI has no Exec or other TTS integration yet, this would help bridge the gap.

2. It allows remote audio to be played. Since a Stasis application has no requirement to be hosted on, or local to, the Asterisk instance, getting audio files onto the server is a problem: they either have to be copied over in advance or already exist there (a rough sketch of that workaround follows below). I did mention a while back that a set of ARI features for uploading and downloading files would be very useful; again, a remote audio scheme for Play helps to bridge this gap.
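For reference, the workaround today looks roughly like the sketch below: fetch the remote (e.g. TTS) audio, drop it onto Asterisk's filesystem, then play it with the existing sound: scheme. This assumes the application runs on the same host as Asterisk and uses the Python requests library; the host, credentials and paths are made up for illustration, and only the /channels/{id}/play call and the sound: scheme are existing ARI.

import os
import uuid
import requests

ARI_BASE = "http://localhost:8088/ari"          # assumed ARI HTTP address
ARI_AUTH = ("ariuser", "arisecret")             # assumed ari.conf credentials
SOUNDS_DIR = "/var/lib/asterisk/sounds/en"      # default sounds location

def play_remote_wav(channel_id, wav_url):
    # 1. Download the audio (e.g. from a TTS service) over HTTP.
    resp = requests.get(wav_url, timeout=10)
    resp.raise_for_status()

    # 2. Copy it onto Asterisk's filesystem - the step a remote media
    #    scheme for Play would make unnecessary.
    name = "remote-%s" % uuid.uuid4().hex
    with open(os.path.join(SOUNDS_DIR, name + ".wav"), "wb") as f:
        f.write(resp.content)

    # 3. Start playback on the channel via the existing sound: scheme.
    r = requests.post(
        "%s/channels/%s/play" % (ARI_BASE, channel_id),
        auth=ARI_AUTH,
        params={"media": "sound:" + name},
    )
    r.raise_for_status()
    return r.json()  # Playback resource

The obvious downside, and the reason for this whole thread, is step 2: the application has to have write access to the Asterisk box's sounds directory.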

Those are my thoughts. I hope they're useful, coming from another person actively using ARI to write applications :)


-- Dan Wrote:
Hello All,

I was talking to a few people yesterday about how I'd like to be able to play media from an external source into Asterisk using ARI - say, when adding someone to a bridge, having the prompts/hold music come from an external source.

I know Paul B talked about this a month or so ago - http://lists.digium.com/pipermail/asterisk-app-dev/2014-March/000408.html

That ended up being a technical discussion about local channels etc. I'd like to find out whether this is something people actually want, so the Asterisk team can decide whether it's worth putting into their timescales - hence the new topic. Sorry if anyone disagrees!

For me, external source = HTTP(S)

A really basic example of what I mean... the Absolute Radio MP3 stream (they offer many other stream codec types, FLAC being the best they provide):
http://network.absoluteradio.co.uk/core/audio/mp3/live.pls?service=vrbb
Obviously there may be licensing issues with *that* stream, but you get where I'm coming from - and there are issues when it comes to different codecs.
Going a little further, it may not be a stream at all and may just be a file over HTTP. Essentially, I wouldn't want to have to put files onto the same filesystem as Asterisk - my ARI application may live outside of Asterisk itself.
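To make the idea concrete, the call could look something like the sketch below: identical to today's /channels/{id}/play operation, but accepting an HTTP(S) URI as the media argument. This is purely hypothetical - an http: media scheme doesn't exist in ARI today, and the host, credentials and URL are placeholders.

import requests

ARI_BASE = "http://localhost:8088/ari"     # assumed ARI HTTP address
ARI_AUTH = ("ariuser", "arisecret")        # assumed ari.conf credentials

def play_http_media(channel_id, url):
    # Proposed: pass a remote URI straight through, e.g.
    # "http://example.com/prompts/welcome.wav", instead of a sound: file
    # that already lives on Asterisk's disk.
    r = requests.post(
        "%s/channels/%s/play" % (ARI_BASE, channel_id),
        auth=ARI_AUTH,
        params={"media": url},
    )
    r.raise_for_status()
    return r.json()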
What do people think about this?
Dan

_______________________________________________
asterisk-app-dev mailing list
asterisk-app-dev@xxxxxxxxxxxxxxxx
http://lists.digium.com/cgi-bin/mailman/listinfo/asterisk-app-dev
