On Tuesday 01 April 2008 02:47:56 Jean Tourrilhes wrote:
> > do you think it would be feasible
> > to require the drivers to normalize their RSSI to a range of 0-100,
> > so we would have at least some consistency between devices? (of
> > course their steps within this percentage would be different and it
> > would still be impossible to compare these values across different
> > devices).
>
> 	If the measurement is not linear or log, it does not make
> sense mandating the 0-100, because 50 won't be the mid-point. And we
> assume that devices are not consistent to start with...
> 	Anyway, to avoid quantisation errors, I prefer to defer
> normalisation to the end software. For example, if the app use a 64
> pixel window to show the RSSI, it wants a value 0-63, not 0-100.

ok, got it :) i will keep max_signal for this, then. it should be used
for IEEE80211_HW_SIGNAL_UNSPEC and IEEE80211_HW_SIGNAL_DB. in the case
of dBm i think we can always assume the same range.

cheers,
bruno
--
To unsubscribe from this list: send the line "unsubscribe linux-wireless" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html