How to use DSP in GStreamer



Hi,

Zhao Liang-E3423C wrote:
> Zhao Liang-E3423C wrote:
>   
>> Hi all,
>>  
>> On embedded devices, DSPs are used widely, and different DSPs offer
>> different features, for example:
>>  
>> 1. decoder
>> The DSP is just a hardware decoder.
>> 2. decoder + sink
>> The DSP is a decoder plus sink: it accepts encoded data directly,
>> then decodes it and renders the PCM data to the audio device itself.
>>     
>
> creating a sink with the right caps should be enough for this one...
>
> Zhao Liang: In my experience, it is not as simple as just adding new
> caps.
> 		Just an example: how do we handle preroll? Generally, a
> DSP needs to do initialization when it starts to work, so when is the
> right time to do this?
> 		Another issue is seek: how does the DSP handle seek?
>
>   

> Handling seek is generally the demuxer/parser's job; do you have
> examples where it is up to the decoder to do it?

As it is a sink for the DSP (although it includes decoder functions),
the sink would handle preroll. In the current preroll mechanism, the
sink's preroll virtual function is called only once; all other buffers
are pushed into a queue, but some hardware needs more buffers to do its
initialization.
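
To make the point concrete, here is a minimal sketch (against the 0.10
GstBaseSink API) of one way to work around the single preroll call: the
sink collects buffers in its render function until the hardware has
enough data to initialize. The dsp_hw_* calls are hypothetical
stand-ins for a vendor API, and the GObject registration boilerplate is
omitted.

#include <gst/gst.h>
#include <gst/base/gstbasesink.h>

#define DSP_INIT_BUFFERS 4   /* assumed: how many buffers the DSP needs */

typedef struct {
  GstBaseSink parent;
  GQueue *pending;           /* created in instance init (omitted) */
  gboolean hw_ready;
} GstDspSink;

/* hypothetical vendor calls, not a real API */
static void dsp_hw_init (GQueue *bufs) { /* configure DSP from queued data */ }
static void dsp_hw_push (GstBuffer *buf) { /* feed one encoded buffer */ }

static GstFlowReturn
gst_dsp_sink_render (GstBaseSink *bsink, GstBuffer *buf)
{
  GstDspSink *sink = (GstDspSink *) bsink;

  if (!sink->hw_ready) {
    /* basesink prerolls only one buffer, so collect the rest here */
    g_queue_push_tail (sink->pending, gst_buffer_ref (buf));
    if (g_queue_get_length (sink->pending) < DSP_INIT_BUFFERS)
      return GST_FLOW_OK;

    dsp_hw_init (sink->pending);
    while (!g_queue_is_empty (sink->pending)) {
      GstBuffer *queued = g_queue_pop_head (sink->pending);
      dsp_hw_push (queued);
      gst_buffer_unref (queued);
    }
    sink->hw_ready = TRUE;
    return GST_FLOW_OK;
  }

  dsp_hw_push (buf);         /* basesink keeps ownership of buf */
  return GST_FLOW_OK;
}

The drawback is that preroll still completes after the first buffer, so
the pipeline reports PAUSED before the DSP is actually initialized,
which is exactly the mismatch described above.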

Another issue: if the DSP is an audio decoder plus sink, how do we
handle the advancing clock? Currently we can use gstaudiosink for this,
and gstringbuffer handles all of the clock advancement, but some DSPs
cannot do that when no data is fed in, so the clock calculation is a
real issue. Without such a mechanism, an RTSP streaming application
cannot handle buffer delay or loss.
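
One possible direction, sketched below against the 0.10 audio library:
instead of gstringbuffer, the sink provides its own pipeline clock
derived from the hardware playback position. gst_audio_clock_new() is
the stock helper; dsp_hw_get_rendered_samples() and the fixed sample
rate are assumptions about the vendor driver.

#include <gst/gst.h>
#include <gst/audio/gstaudioclock.h>

#define DSP_SAMPLE_RATE 44100   /* assumed fixed output rate */

/* hypothetical vendor call: samples the DSP has actually played out */
static guint64
dsp_hw_get_rendered_samples (void)
{
  return 0;                     /* would query the DSP driver here */
}

/* clock callback: time advances only as the hardware consumes data */
static GstClockTime
gst_dsp_sink_clock_get_time (GstClock *clock, gpointer user_data)
{
  guint64 samples = dsp_hw_get_rendered_samples ();
  return gst_util_uint64_scale (samples, GST_SECOND, DSP_SAMPLE_RATE);
}

/* GstElement::provide_clock vfunc, so the pipeline can slave to us */
static GstClock *
gst_dsp_sink_provide_clock (GstElement *element)
{
  return gst_audio_clock_new ("dsp-clock",
      gst_dsp_sink_clock_get_time, element);
}

This only works if the driver exposes a playback position at all; if
the DSP reports nothing until data is fed in, the callback would have
to interpolate from the system clock, which is exactly the hard part
described above.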


>> 3. A/V sync
>> DSP can do A/V sync internally or not.
>>     
>
> then you can ask basesink not to synchronize flows.
>
> Zhao Liang: If the DSP is just a video decoder plus sink, how does it
> sync with the audio sink? Maybe the current basesink is designed more
> for software sinks than for hardware sinks.
>
>   

> I was talking about your hardware handling A/V synchronization, not
> the opposite.
> Buffers that are handled by the sink are timestamped (by demuxers, for
> instance) and are correctly handled by basesink when synchronization
> has to be done in the framework, and not in HW.

I understood your meaning: A/V sync is indeed done by basesink, but
some DSPs also have internal buffers in which they hold the buffers to
be synchronized, and those are encoded buffers, not raw buffers.
There are 4 combinations:
1. DSP videosink       vs.  software audiosink
2. DSP videosink       vs.  DSP audiosink
3. software videosink  vs.  DSP audiosink
4. software videosink  vs.  software audiosink

How do we sync A/V with encoded buffers? It really puzzles me. OpenMAX
can do it well in some cases, as it has a clock of its own, but to my
understanding basesink is used to sync decoded buffers. I would
appreciate your clarifying this.
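
For reference, the framework-side switch between the two models is at
least small; a sketch, with dsp_video_sink and sw_video_sink as
hypothetical element variables:

/* Case 2 above: the DSP pair syncs internally, so keep basesink out
 * of the way; timestamps still travel on the encoded buffers. */
g_object_set (dsp_video_sink, "sync", FALSE, NULL);

/* Cases where the framework should sync: leave the default sync=TRUE,
 * and basesink waits on GST_BUFFER_TIMESTAMP before calling render(),
 * whether the payload is raw or encoded. */
g_object_set (sw_video_sink, "sync", TRUE, NULL);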


--
Benoit Fouet
Purple Labs S.A.
www.purplelabs.com



