Bug in H264 decode buffer size

On Thu, 22 May 2014 14:58:40 -0400, Bill Gardner wrote:
> It worked for me. I've set the default at 720x576 and I'm able to 
> connect to a 4CIF endpoint at level 3.0. Note that find_highest_res 
> calculates the maximum picture size in macroblocks not by reading a 
> constant from a table (which it probably should do) but by reading the 
> max macroblocks per second and dividing it by the frame rate in the 
> codec table. So if you set the frame rate too high for the profile
> level, it would use a smaller picture size than the maximum.
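
For illustration, here is a minimal sketch of that arithmetic.  The
constants are the level 3.0 limits from Rec. ITU-T H.264 Table A-1;
the code is made up for this mail and is not the actual
find_highest_res implementation:

    /* Sketch only: deriving the maximum picture size from
     * MaxMBPS / fps undershoots the per-level MaxFS limit once the
     * configured frame rate gets high enough. */
    #include <stdio.h>

    int main(void)
    {
        /* H.264 level 3.0 limits (Rec. ITU-T H.264, Table A-1) */
        const unsigned max_mbps = 40500;  /* max macroblocks per second    */
        const unsigned max_fs   = 1620;   /* max frame size in macroblocks */
        unsigned fps;

        for (fps = 15; fps <= 30; fps += 5)
            printf("fps=%2u -> %u MBs (MaxFS is %u)\n",
                   fps, max_mbps / fps, max_fs);
        return 0;
    }

At 25 fps the division happens to land exactly on 1620 MBs, but at
30 fps it gives only 1350 MBs, less than the 1620 MBs a 720x576
picture (45x36) or the 1584 MBs a 4CIF picture (44x36) needs.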


This raises further questions: is this locally specified fps also used
for the decoder buffer length calculation above?  The locally set fps
has nothing to do with the frame rate the distant endpoint uses while
encoding, which seems unknown to us at that point.  So is it really
correct that the size of the local buffer, which is meant for decoding
the stream sent by the distant endpoint, depends on whatever fps has
been set locally?
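
If the buffer in question holds one decoded picture, something along
these lines would size it from the negotiated level's MaxFS instead of
from MaxMBPS divided by the locally set fps.  This is only a sketch
under the assumption of I420 output; h264_dec_buf_size() is a
hypothetical helper, not an existing pjmedia function:

    #include <stddef.h>

    /* Hypothetical helper: bytes for one decoded I420 picture of up
     * to max_fs macroblocks.  One macroblock is 16x16 luma samples,
     * i.e. 256 luma + 128 chroma bytes in 4:2:0 = 384 bytes. */
    size_t h264_dec_buf_size(unsigned max_fs)
    {
        return (size_t)max_fs * 384;
    }

For level 3.0 this gives 1620 * 384 = 622080 bytes, which is exactly
720 * 576 * 3/2, independent of whatever fps is configured locally.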

That said, setting the buffer size by specifying the maximum frame
size in codec_desc[] in ffmpeg_vid_codecs.c is fine as long as nobody
changes it programmatically; that data structure seems to be open to
API access.  Why not tabulate these maximum frame sizes in
level_info[] in vid_codec_util.c as well?
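
As an illustration of that suggestion, a possible shape for such a
table follows; the struct and field names are invented here, only the
numbers come from Rec. ITU-T H.264 Table A-1:

    /* Sketch, not the actual level_info[] layout: keeping MaxFS next
     * to MaxMBPS lets the maximum picture size be looked up directly
     * instead of being derived from a frame rate. */
    struct h264_level_limit {
        unsigned level_idc;  /* e.g. 30 for level 3.0       */
        unsigned max_mbps;   /* max macroblock rate (MBs/s) */
        unsigned max_fs;     /* max frame size (MBs)        */
    };

    static const struct h264_level_limit level_limits[] = {
        { 12,   6000,  396 },  /* level 1.2 */
        { 13,  11880,  396 },  /* level 1.3 */
        { 20,  11880,  396 },  /* level 2.0 */
        { 21,  19800,  792 },  /* level 2.1 */
        { 22,  20250, 1620 },  /* level 2.2 */
        { 30,  40500, 1620 },  /* level 3.0: 4CIF and 720x576 fit */
        { 31, 108000, 3600 },  /* level 3.1: up to 1280x720       */
    };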

    Eeri Kask




