I am sending this to both the linux-dvb and myth-dev lists because the SNR is generated by a linux-dvb driver and displayed by the mythtv application while doing a tuning scan.

I have searched the DVB documentation for the units of signal-to-noise ratio and found only that it is an int16_t. Not much help. SNR is usually specified in dB and defined as:

    SNR = 20 * log10(signal / noise)

The spec for my hardware is similar, but different:

    SNR = 10 * log10(14400 / [value read from demod chip])

All this makes sense and produces numbers in the range of +43 dB to -13 dB for my demod chip. Thus my driver returns values in dB that easily fit in an int16_t. The value is positive when the signal is greater than the noise, zero when the signal equals the noise, and negative when the noise is higher than the signal.

mythtv reports the SNR as a percent value. Percent of what? I see no way to report the maximum and minimum SNR from the driver so that a percentage can be calculated. mythtv displays near 0% for a good SNR of, say, 18 dB, and it displays 100% for a negative SNR of, say, -13 dB. Since the data type is int16_t, why is a negative value displayed as greater than a positive one?

-- Mac
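
P.S. For concreteness, here is a minimal user-space sketch of the dB arithmetic described above, not the actual driver code. Only the formula and the int16_t type come from the text above; the helper name, the zero-value guard, and the sample register values are illustrative assumptions, and the last column simply shows what the same bits look like when reinterpreted as an unsigned 16-bit number.

/*
 * Sketch of: SNR(dB) = 10 * log10(14400 / [value read from demod chip])
 * Build with: cc -o snr snr.c -lm
 */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Convert a raw demod register reading to SNR in dB (rounded to nearest). */
static int16_t snr_db_from_reg(unsigned int reg)
{
    if (reg == 0)
        reg = 1;            /* avoid log10 of infinity on a zero reading */
    return (int16_t)lround(10.0 * log10(14400.0 / (double)reg));
}

int main(void)
{
    /* Made-up register values chosen to span roughly +42 dB .. -13 dB. */
    unsigned int samples[] = { 1, 227, 14400, 287000 };

    for (size_t i = 0; i < sizeof(samples) / sizeof(samples[0]); i++) {
        int16_t snr = snr_db_from_reg(samples[i]);
        printf("reg=%6u  ->  SNR %+d dB  (same bits as uint16_t: %u)\n",
               samples[i], snr, (unsigned)(uint16_t)snr);
    }
    return 0;
}

For example, reg=227 gives about +18 dB, and reg=287000 gives about -13 dB, which as a raw unsigned 16-bit pattern is 65523.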