I'm at a loss.
I'm trying to set up a very simple point-to-point demo application using PJSUA2.
Both the client and server programs perform the same processing to initialise PJSIP, specifically:
- instantiate the endpoint
- endpoint libCreate
- set the log level
- endpoint libInit
- set the null audio device
- configure the transport on port 5060/5070 (server/client)
- endpoint libStart
- create an account config with URI sip:127.0.0.1:5060/5070
- in the account config, set timerMinSESec to 90
- in the account config, set timerSessExpiresSec to 90
- instantiate the account
- account create with the configuration
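For reference, a minimal sketch of that sequence (server side, port 5060), assuming the standard pjsua2 API with ep/acc as globals; error handling via pj::Error is omitted for brevity and this hasn't been compiled against a live stack:

```cpp
#include <pjsua2.hpp>
using namespace pj;

Endpoint ep;
Account acc;

void initSip()
{
    ep.libCreate();

    EpConfig ep_cfg;
    ep_cfg.logConfig.level = 4;          // set log level
    ep.libInit(ep_cfg);

    ep.audDevManager().setNullDev();     // null audio device

    TransportConfig t_cfg;
    t_cfg.port = 5060;                   // 5070 on the client
    ep.transportCreate(PJSIP_TRANSPORT_UDP, t_cfg);

    ep.libStart();

    AccountConfig a_cfg;
    a_cfg.idUri = "sip:127.0.0.1:5060";  // sip:127.0.0.1:5070 on the client
    a_cfg.callConfig.timerMinSESec = 90;
    a_cfg.callConfig.timerSessExpiresSec = 90;
    acc.create(a_cfg);
}
```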
Then the server simply waits for an incoming call in the onIncomingCall callback and creates a call object, which the main thread is waiting on. This seems to work: when the client makes a call, the server picks it up and the state eventually moves to 5 (connected) on both sides. The main thread then simply waits for the state to move to 6 (disconnected) before exiting.
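That server-side flow looks roughly like this (hypothetical class names MyAccount/MyCall, and a global g_call pointer the main thread polls - both are my own illustrative names, not pjsua2 API; untested sketch):

```cpp
#include <atomic>
#include <pjsua2.hpp>

std::atomic<pj::Call*> g_call{nullptr};  // handed to the waiting main thread

class MyCall : public pj::Call {
public:
    MyCall(pj::Account &acc, int call_id) : pj::Call(acc, call_id) {}
};

class MyAccount : public pj::Account {
public:
    virtual void onIncomingCall(pj::OnIncomingCallParam &prm)
    {
        MyCall *call = new MyCall(*this, prm.callId);
        pj::CallOpParam op;
        op.statusCode = PJSIP_SC_OK;
        call->answer(op);   // accept; state should eventually reach 5 (confirmed)
        g_call = call;      // main thread waits on this call object
    }
};
```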
The client, meanwhile, makes the call to the server by instantiating a call object with the URI of the server. As stated, this bit works, because the server and client both eventually reach state 5. The client is meant to play an audio file off the disk for ten seconds and then disconnect. But here's the issue.
In the onCallMediaState callback, the client creates an AudioMediaPlayer as follows:

    AudioMedia &playMedia = ep.audDevManager().getPlaybackDevMedia();
    amp.createPlayer("./input.wav", PJMEDIA_FILE_NO_LOOP);
    amp.startTransmit(playMedia);
The server basically does the same thing, but with an AudioMediaRecorder:

    CallInfo call_info = getInfo();
    AudioMedia* audio_media = 0;
    for (unsigned int i = 0; i < call_info.media.size(); ++i) {
        if (call_info.media[i].type == PJMEDIA_TYPE_AUDIO) {
            audio_media = static_cast<AudioMedia*>(getMedia(i));
            break;
        }
    }
    if (audio_media != 0) {
        amr.createRecorder("output.wav");
        audio_media->startTransmit(amr);
    }
Keep in mind that the endpoint (ep), player (amp) and recorder (amr) are properly scoped: they're created well before this point as (effectively) global variables.
Now what I'm seeing is that the output.wav file is created okay and VLC can play it, but it consists of about ten seconds of absolute silence. The majority of bytes in the file are zero; the remaining ones are presumably housekeeping information for the container (it starts with the RIFF and WAVEfmt strings, so I know it's a valid WAV file).
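In case it helps anyone reproduce the diagnosis, here is the kind of check I mean: given the raw bytes of the recorded file, walk the RIFF chunks and report whether the "data" chunk is all zeros (pure silence). This is my own minimal walker, not pjsua2 code, and it assumes a well-formed little-endian WAV:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Read a little-endian 32-bit value at the given offset.
static uint32_t readU32(const std::vector<uint8_t> &b, size_t off)
{
    return b[off] | (b[off + 1] << 8) | (b[off + 2] << 16) |
           (uint32_t(b[off + 3]) << 24);
}

// True if the file parses as RIFF/WAVE and every byte of its
// "data" chunk is zero (i.e. the recording is pure silence).
bool wavDataIsSilent(const std::vector<uint8_t> &bytes)
{
    if (bytes.size() < 12 || std::memcmp(&bytes[0], "RIFF", 4) != 0 ||
        std::memcmp(&bytes[8], "WAVE", 4) != 0)
        return false;                         // not a WAV at all
    size_t off = 12;
    while (off + 8 <= bytes.size()) {
        uint32_t sz = readU32(bytes, off + 4);
        if (std::memcmp(&bytes[off], "data", 4) == 0) {
            for (size_t i = off + 8; i < off + 8 + sz && i < bytes.size(); ++i)
                if (bytes[i] != 0)
                    return false;             // non-zero sample: not silence
            return true;                      // all samples zero
        }
        off += 8 + sz + (sz & 1);             // chunks are word-aligned
    }
    return false;                             // no data chunk found
}
```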
So somewhere along the way, the audio is not being transmitted from client to server, and I have no idea why. I appear to have followed all the rules for setting up the requisite objects, but I may be missing something.
Does anyone have any ideas about what could be going wrong, or does anyone know where I can find a sample of point-to-point, file-based transfer using PJSUA2? The sample apps that come with PJSIP seem rather more simplistic than that.
Cheers, Al.
Allan Chandler | Software Engineer DTI Group Ltd | Transit Security & Surveillance 31 Affleck Road, Perth Airport, WA 6105, AU P | F | allan.chandler@xxxxxxxxxx
_______________________________________________ Visit our blog: http://blog.pjsip.org pjsip mailing list pjsip@xxxxxxxxxxxxxxx http://lists.pjsip.org/mailman/listinfo/pjsip_lists.pjsip.org