Hi List,
I'm new to both GStreamer and the Raspberry Pi, so please bear with me. Here are a few simple questions you can probably answer off the top of your head. Thanks in advance!
My config is Raspberry Pi B+ (single core 700MHz) and PlayStation Eye (USB camera + microphone), running gstreamer-0.10.
I ran three video-capture tests and compared CPU usage with the following command:
$ watch -d -n 2 ps -C gst-launch-0.10 -o %cpu,%mem,cmd
Test 1 (converting to RGB and sampling at 30 fps):
$ gst-launch-0.10 v4l2src ! autovideoconvert ! videorate ! video/x-raw-rgb, format=RGB3, depth=24, framerate=30/1, width=320, height=240 ! autovideoconvert ! fakesink
CPU was at 25%.
Then I wondered whether I could improve on this 25%; one natural thought was to lower the frame rate from 30/1 to 5/1:
Test 2 (sampling at 5/1 instead of 30/1):
$ gst-launch-0.10 v4l2src ! autovideoconvert ! videorate ! video/x-raw-rgb, format=RGB3, depth=24, framerate=5/1, width=320, height=240 ! autovideoconvert ! fakesink
CPU usage jumped to 70%!!
Note that this camera can do 120 fps at 320x240 according to Wikipedia, or 60 fps according to the v4l2-ctl output (appended below), so I ran test 3:
Test 3 (sampling at 60/1 instead of 30/1):
$ gst-launch-0.10 v4l2src ! autovideoconvert ! videorate ! video/x-raw-rgb, format=RGB3, depth=24, framerate=60/1, width=320, height=240 ! autovideoconvert ! fakesink
CPU was around 50%.
So here are my questions:
1. Could someone explain this CPU usage pattern to me?
5 fps: 70%
30 fps: 25%
60 fps: 50%
2. My application needs RGB, which is why I convert to it (x-raw-rgb). To do computation on the appsink end, I currently still capture at 30 fps (since that is where CPU usage is lowest, at 25%), but in my application I drop 5 of every 6 frames, so I effectively process at 5 fps. This leaves enough CPU power for my appsink. Does this make sense?
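The drop-5-of-6 decimation described above can be sketched independently of GStreamer as a plain counter (this is only an illustration; the name make_decimator is made up for the example, not my actual appsink code):

```python
def make_decimator(keep_every=6):
    """Return a callable deciding whether to process each incoming frame.

    Keeps frame 0, drops frames 1..keep_every-1, then repeats, so a
    30 fps stream is effectively processed at 30/keep_every = 5 fps.
    """
    state = {"count": 0}

    def should_process(_buffer=None):
        keep = state["count"] == 0
        state["count"] = (state["count"] + 1) % keep_every
        return keep

    return should_process

# Out of every 6 frames, exactly one is kept:
dec = make_decimator(6)
decisions = [dec() for _ in range(12)]
# decisions == [True, False, False, False, False, False] * 2
```

In the real pipeline, a call like should_process() would sit at the top of the appsink buffer handler, returning early for dropped frames so only 1 in 6 buffers reaches the expensive RGB computation.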
3. Is there any other way to get lower than 25% CPU?
4. This one deals with audio and has been bugging me a bit. If I print the duration of each audio buffer in Python, like this:
print appsink.emit('pull-buffer').duration/1000000.0
For "alsasrc" I get a constant 10 ms, but for "pulsesrc" I've seen 26 ms, 36 ms, and 46 ms in different runs! Any ideas?
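For context, a raw PCM buffer's duration is just its frame count divided by the sample rate, so buffer duration tracks buffer size directly. The helper below sanity-checks the numbers (the S16LE stereo 44100 Hz parameters are assumptions for illustration, not read from the actual sources):

```python
def buffer_duration_ms(n_bytes, rate=44100, channels=2, bytes_per_sample=2):
    """Duration in milliseconds of a raw interleaved PCM buffer.

    Assumes S16LE stereo at 44100 Hz: one frame is
    channels * bytes_per_sample = 4 bytes, and duration = frames / rate.
    """
    frames = n_bytes / float(channels * bytes_per_sample)
    return 1000.0 * frames / rate

# A 1764-byte buffer is 441 frames; 441 / 44100 Hz = 10 ms:
print(buffer_duration_ms(1764))  # -> 10.0
```

If pulsesrc hands over differently sized buffers from run to run, the printed durations will vary accordingly; in GStreamer 0.10 both alsasrc and pulsesrc expose latency-time and buffer-time properties (inherited from GstBaseAudioSrc) that influence the chunk size, which might be worth experimenting with.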
Thank you!
Xuchen
P.S.,
The camera looks like this:
$ v4l2-ctl --all
Driver Info (not using libv4l2):
    Driver name   : ov534
    Card type     : USB Camera-B4.09.24.1
    Bus info      : usb-bcm2708_usb-1.2
    Driver version: 3.12.22
    Capabilities  : 0x85000001
        Video Capture
        Read/Write
        Streaming
        Device Capabilities
    Device Caps   : 0x05000001
        Video Capture
        Read/Write
        Streaming
Priority: 2
Video input : 0 (ov534: ok)
Format Video Capture:
    Width/Height  : 640/480
    Pixel Format  : 'YUYV'
    Field         : None
    Bytes per Line: 1280
    Size Image    : 614400
    Colorspace    : SRGB
Streaming Parameters Video Capture:
    Capabilities     : timeperframe
    Frames per second: 60.000 (60/1)
    Read buffers     : 2

User Controls
    brightness (int)    : min=0 max=255 step=1 default=0 value=0 flags=slider
    contrast (int)      : min=0 max=255 step=1 default=32 value=32 flags=slider
    saturation (int)    : min=0 max=255 step=1 default=64 value=64 flags=slider
    hue (int)           : min=-90 max=90 step=1 default=0 value=0 flags=slider
    white_balance_automatic (bool) : default=1 value=1
    exposure (int)      : min=0 max=255 step=1 default=120 value=253 flags=inactive, volatile
    gain_automatic (bool) : default=1 value=1 flags=update
    gain (int)          : min=0 max=63 step=1 default=20 value=255 flags=inactive, volatile
    horizontal_flip (bool) : default=0 value=0
    vertical_flip (bool) : default=0 value=0
    power_line_frequency (menu) : min=0 max=1 default=0 value=0
    sharpness (int)     : min=0 max=63 step=1 default=0 value=0 flags=slider

Camera Controls
    auto_exposure (menu) : min=0 max=1 default=0 value=0 flags=update
_______________________________________________
gstreamer-embedded mailing list
gstreamer-embedded@xxxxxxxxxxxxxxxxxxxxx
http://lists.freedesktop.org/mailman/listinfo/gstreamer-embedded