I see that these have been changed to uint16_t's in the 4.0 API, from int16_t's in the 3.0 API. Has anyone considered specifying how they should be calculated, or what units they should be reported in?

The API should, I believe, specify whether the S/N should be reported in decibels or in absolute (linear) terms. If the value is in decibels, then a fixed-point format should be set, say 11 integer bits and 5 fractional bits, so that fractional decibels can be represented. If the value is in absolute terms, the API should say so, so that users of the API know to take the log10 of the value and multiply by 10 to get decibels. I have no idea how the signal is calculated in practice, but giving units would be a start. The value could be specified in absolute terms, or in something like dB microwatts. Either way, this should be spelled out in the API docs.

I've written some basic signal meter support for MythTV, and personally I don't care what units the values are returned in, so long as the API tells me which units to expect. Then I can convert to whatever units make sense for presentation and show them. For the driver writers it probably makes the most sense to report absolute values and have me and my ilk convert them to decibels in user space, with the benefit of floating point. It doesn't much matter if the signal is reported as 1 uW when it is actually 50 uW, so long as I know how many units of increase represent a doubling or halving of the value.

But the DVB frontend API should specify what should be reported, so that the driver writers have some idea of what to aim for and the driver users have some idea of what to expect.

-- Daniel