Hello there,

I was debugging some PAL issues with cx231xx and noticed some unexpected behavior with regard to selecting PAL standards. In particular, tvtime has an option for "PAL" which corresponds to the underlying value 0xff. This selects *any* PAL standard. However, the cx231xx code for setting up the DIF basically says:

        if (standard & V4L2_STD_MN) {
                ...
        } else if ((standard == V4L2_STD_PAL_I) |
                   (standard & V4L2_STD_PAL_D) |
                   (standard & V4L2_STD_SECAM)) {
                ...
        } else {
                /* default PAL BG */
                ...
        }

As a result, if you have a PAL-B/G signal and select "PAL" in tvtime, the PAL-I/PAL-D/SECAM branch is taken, because the PAL-D bit is set in the 0xff mask. The result, of course, is garbage video.

So here is the question: how are we expected to interpret an application asking for "PAL" in cases where the driver needs a more specific video standard? In the long term I can obviously add code to tvtime to have the user provide a more specific standard instead of "PAL", but since composite standards like this are supported in the V4L2 spec, I would like to understand what the expected behavior should be in drivers. (One possible check ordering on the driver side is sketched below my signature.)

Devin

-- 
Devin J. Heitmueller - Kernel Labs
http://www.kernellabs.com
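For concreteness, here is a compile-test sketch (userspace, built against linux/videodev2.h) of one possible ordering. The choose_dif_setup() name is mine, not the driver's, and I have replaced the ==/bitwise-| tests with plain bit tests throughout. The idea is just to prefer the most common variant (B/G) whenever its bits are present, so an ambiguous composite request such as 0xff no longer lands in the PAL-I/PAL-D branch simply because those bits also happen to be set:

        /* Sketch only: choose_dif_setup() is a made-up stand-in for the
         * driver's DIF configuration logic, returning a label instead of
         * programming registers. */
        #include <stdio.h>
        #include <linux/videodev2.h>

        static const char *choose_dif_setup(v4l2_std_id std)
        {
                if (std & V4L2_STD_MN)
                        return "NTSC/PAL-M/PAL-N setup";

                /* Prefer B/G whenever its bits are present, so a composite
                 * mask such as V4L2_STD_PAL (0xff) resolves here instead of
                 * falling into the PAL-I/PAL-D branch below. */
                if (std & V4L2_STD_PAL_BG)
                        return "PAL B/G setup";

                if ((std & V4L2_STD_PAL_I) || (std & V4L2_STD_PAL_D) ||
                    (std & V4L2_STD_SECAM))
                        return "PAL-I/PAL-D/SECAM setup";

                return "PAL B/G setup (default)";
        }

        int main(void)
        {
                /* tvtime's "PAL" entry: every 625-line PAL bit set. */
                v4l2_std_id any_pal = V4L2_STD_PAL;     /* 0x000000ff */

                printf("PAL (0xff) -> %s\n", choose_dif_setup(any_pal));
                printf("PAL-I      -> %s\n", choose_dif_setup(V4L2_STD_PAL_I));
                printf("SECAM      -> %s\n", choose_dif_setup(V4L2_STD_SECAM));
                return 0;
        }

With this ordering, V4L2_STD_PAL (0xff) resolves to the B/G setup, while an explicit PAL-I or SECAM request still reaches its own branch.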