Hi all,

A couple of years ago I worked on getting some webcams working with
Linux. To be honest, I was unpleasantly surprised. There seems to be a
lack of direction on how webcams are even supposed to be supported.

The problem is that many USB webcams send raw 'bayer' sensor data in a
compressed form to the PC in order to get a reasonable frame rate over
a full-speed (12 Mb/s) USB connection. AFAIK, the Linux kernel
developers decided that it is no longer allowed to do decompression and
bayer format conversion in the kernel. This leaves a huge gap between
kernel drivers that simply stream raw or compressed bayer data (or some
other proprietary format) and applications that expect to receive
simple RGB frames from the v4l(2) device file.

Many webcam drivers are consequently maintained outside the official
kernel and *DO* include decompression and bayer conversion to at least
get webcams into a workable state. Examples (there are many more):

* Philips PWC driver: http://www.saillard.org/linux/pwc/
* Various other drivers: http://www.linux-projects.org
* Spca driver: http://mxhaard.free.fr/spca5xx.html

Some ideas:

* create a user-space library for webcams that reads the compressed
  data from a v4l(2) device file, decompresses and unbayers it, then
  exposes it through a standard API to applications (a rough sketch of
  the unbayer step is in the P.S. below).
* as above, but instead of a library, use gstreamer plug-ins (each
  compressed webcam format would have its own plugin, with perhaps a
  single plugin for unbayering).
* bypass the kernel drivers entirely and use libusb to talk to the
  webcams. A problem is that (AFAIK) libusb does not support
  isochronous USB transfers, and many cameras use this transfer mode.

Any thoughts on how to improve Linux webcam support?

Kind regards,
Bertrik
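
P.S. To make the first idea a bit more concrete, here is a minimal
sketch of the user-space side, written as if the driver already hands
out uncompressed BGGR bayer frames through read() on /dev/video0. The
device node, frame size and bayer pattern are just assumptions for the
example, and a real compressed format would need its decoder in front
of the unbayer step. It does a crude nearest-neighbour unbayer and
writes a PPM to stdout:

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

#define WIDTH  320
#define HEIGHT 240

/* crude nearest-neighbour demosaic of a BGGR bayer frame into 24-bit RGB */
static void bayer_to_rgb(const unsigned char *bayer, unsigned char *rgb)
{
    int x, y, i, j;

    for (y = 0; y < HEIGHT; y += 2) {
        for (x = 0; x < WIDTH; x += 2) {
            /* one 2x2 bayer cell, BGGR layout: B G on this row, G R below */
            unsigned char b = bayer[y * WIDTH + x];
            unsigned char g = bayer[y * WIDTH + x + 1];
            unsigned char r = bayer[(y + 1) * WIDTH + x + 1];

            /* replicate the cell's colours over its four output pixels */
            for (j = 0; j < 2; j++) {
                for (i = 0; i < 2; i++) {
                    unsigned char *p = &rgb[((y + j) * WIDTH + x + i) * 3];
                    p[0] = r;
                    p[1] = g;
                    p[2] = b;
                }
            }
        }
    }
}

int main(void)
{
    static unsigned char bayer[WIDTH * HEIGHT];
    static unsigned char rgb[WIDTH * HEIGHT * 3];
    int fd;

    fd = open("/dev/video0", O_RDONLY);          /* assumed device node */
    if (fd < 0) {
        perror("open");
        return 1;
    }
    /* assumes the driver delivers exactly one raw bayer frame per read() */
    if (read(fd, bayer, sizeof(bayer)) != (ssize_t)sizeof(bayer)) {
        perror("read");
        close(fd);
        return 1;
    }
    close(fd);

    bayer_to_rgb(bayer, rgb);

    /* dump the result as a PPM image on stdout */
    printf("P6\n%d %d\n255\n", WIDTH, HEIGHT);
    fwrite(rgb, sizeof(rgb), 1, stdout);
    return 0;
}

A proper library would of course negotiate the format through the
v4l(2) ioctls and use better interpolation; the sketch is only meant to
show where the decompression and unbayer steps would live in user
space.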