shared sound device

Hi,
I cross-compiled pjsip for an ARM board, with an SGTL5000 audio codec as the
sound card. My project needs to use the audio output channels independently.
I want to:
- play music on the left channel (speaker)
- use the right channel for a SIP call (pjsip)

This is my asound.conf:

pcm.dshare {
    type dmix
    ipc_key 2048
    slave {
        pcm "hw:0"
        rate 44100
    }
    bindings {
        0 0
        1 1
    }
}
pcm.leftx {
    type route
    slave {
        pcm "dshare"
        channels 2
    }
    ttable.0.0 4
    ttable.1.0 4
}
pcm.rightx {
    type route
    slave {
        pcm "dshare"
        channels 2
    }
    ttable.0.1 4
    ttable.1.1 4
}
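
(A side note on the route tables above: the ttable coefficient is a software
gain, so routing both source channels into one slave channel at 4x each can
clip badly. A more conservative table for leftx, downmixing both channels to
the left at half gain, might look like this -- the 0.5 values are just my
assumption, adjust to taste:

pcm.leftx {
    type route
    slave {
        pcm "dshare"
        channels 2
    }
    ttable.0.0 0.5   # client left  -> slave left, half gain
    ttable.1.0 0.5   # client right -> slave left, half gain
}

The rightx PCM would use ttable.0.1 / ttable.1.1 the same way.)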

pcm.mixin {
    type dsnoop
    ipc_key 2049    # must be unique for all dmix plugins!!!!
    #ipc_key_add_uid yes
    slave {
        pcm "hw:0"
        #channels 2
        #period_size 1024
        #buffer_size 4096
        rate 44100
        #periods 0
        #period_time 0
    }
    bindings {
        0 0
        1 1
    }
}
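
(For what it's worth, ALSA also has a dedicated "dshare" plugin type that
splits one device's channels between clients, each with an exclusive binding,
which may match this use case more directly than dmix + route. A minimal
sketch -- the PCM names are mine, and the shared ipc_key is an assumption:

pcm.left_only {
    type dshare
    ipc_key 2048
    slave {
        pcm "hw:0"
        channels 2
        rate 44100
    }
    bindings.0 0   # this client's channel 0 -> slave left channel
}

pcm.right_only {
    type dshare
    ipc_key 2048   # same ipc_key: both clients share the same slave
    slave {
        pcm "hw:0"
        channels 2
        rate 44100
    }
    bindings.0 1   # this client's channel 0 -> slave right channel
}

With dshare each channel belongs to exactly one client, so the two streams
cannot trample each other's samples.)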

I can play two audio files independently:
mpg123 -a leftx a.mp3  ---> left channel
mpg123 -a rightx b.mp3 ---> right channel

I can make a call with pjsip using rightx as the playback device and mixin as
the capture device. But when I try to play music (mpg123 -a leftx a.mp3) on
the other speaker during the call, I only hear the music on the left channel,
and the pjsip call neither plays nor captures anything (as if muted).

Pjsip doesn't show any message or error.

Any idea?