Re: SDR sampling rate - control or IOCTL?

On 21.11.2013 23:19, Antti Palosaari wrote:
On 21.11.2013 22:54, Mauro Carvalho Chehab wrote:
On Thu, 21 Nov 2013 22:22:49 +0200
Antti Palosaari <crope@xxxxxx> wrote:

On 21.11.2013 21:12, Mauro Carvalho Chehab wrote:
On Thu, 21 Nov 2013 20:33:15 +0200
Antti Palosaari <crope@xxxxxx> wrote:

On 21.11.2013 20:22, Hans Verkuil wrote:

BTW, can the sample rate change while streaming? Typically things
you set through S_FMT cannot be changed while streaming.

Yes, but in practice it is uncommon. When I reverse-engineered the
Mirics MSi2500 USB ADC I did it hundreds of times: I just started
streaming, injected values into the ADC control registers and then
calculated the sampling rate from the stream.
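
For illustration, that measurement is nothing more than counting
bytes against wall-clock time. A minimal sketch, assuming a device
node that delivers raw 16-bit I/Q pairs (the node name and the
bytes-per-sample constant are assumptions; the real MSi2500 sample
packing differs):

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
	uint8_t buf[65536];
	uint64_t bytes = 0;
	struct timespec t0, t1;
	int i, fd = open("/dev/swradio0", O_RDONLY); /* assumed node */

	if (fd < 0)
		return 1;

	clock_gettime(CLOCK_MONOTONIC, &t0);
	for (i = 0; i < 1000; i++) {
		ssize_t n = read(fd, buf, sizeof(buf));
		if (n <= 0)
			break;
		bytes += n;
	}
	clock_gettime(CLOCK_MONOTONIC, &t1);

	double secs = (t1.tv_sec - t0.tv_sec) +
		      (t1.tv_nsec - t0.tv_nsec) / 1e9;
	/* assume 4 bytes per complex sample: 16-bit I + 16-bit Q */
	printf("~%.0f samples/s\n", bytes / 4.0 / secs);
	close(fd);
	return 0;
}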

That's not a use case, it is just a developer's procedure. Anyway,
you could still measure the bit rate like that if you do a stream
start and stop.

That is the only use case I currently know of; there could still be
some others.

Seriously? Ever since the Shannon theorem, all theory used in DSP
assumes that the samples are spaced at the very same rate.

Nothing prevents doing it; the key issue is that the sampling rate
needs to be known by the app.

No, it is harder than that: if the bit rate changes, then you need
to embed the sampling rate changes inside the stream as they occur,
as otherwise userspace will have no means to detect such changes.

Heh, I cannot understand you. Could you explain why it works for me?
Here is a video I recorded just for you:
http://palosaari.fi/linux/v4l-dvb/mirics_msi3101_sdrsharp_sampling_rate.mp4


It is a Mirics MSi3101 streaming FM radio at a sampling rate of
2.048 Msps; then I switch to 1.024 Msps and back a few times - on
the fly. IMHO the results are just as expected. The sound starts
crackling when the DSP application's sampling rate does not match,
but when you change it back to the correct one it recovers.

In other words, changing the sampling rate while streaming breaks
decoding.

Of course, in the case where the DSP does not know about it. I have
found that changing the frequency during streaming breaks my audio
as well.


If I add a button to tell the app's DSP that the sampling rate has
changed, it will work for both cases. I haven't implemented that
settings button yet; it is hard-coded in the SDRSharp plugin.

Could you explain why it works if it is impossible as you said?

I can't imagine any "magic" button that will be able to discover
on what exact sample the sampling rate changed. The hardware may
have buffers; the DMA engines and the USB stack certainly do, and
so does V4L. Knowing on what exact sample the sampling rate changed
would require hardware support, to properly tag the sample where the
change started to apply.

"Magic button". It is just DSP application which sends request to
hardware. And if hardware says OK, that magic SDR application says for
own DSP hey change sampling rate to mach stream.
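
In code the "button" is nothing more than this kind of flow; both
helpers below are hypothetical stand-ins for whatever the API ends
up being:

/* ask the hardware first; retune the local DSP only on success */
int change_rate(int fd, unsigned int rate)
{
	if (sdr_set_hw_rate(fd, rate))	/* hypothetical hw request */
		return -1;		/* hw refused, keep old rate */
	dsp_set_rate(rate);	/* match the app's DSP to the stream */
	return 0;
}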

There is a huge amount of bits streaming; there is no need to tag.
You could just throw away a second or two - it does not matter.
Think of it like a UDP VoIP call: when you lose data, so what, it is
20 ms of audio and nobody cares.
It is similar here: if you lose some data due to a sampling rate
mismatch, so what. It is only a few ms of audio (or other data). A
one-way radio channel is something that should be robust against
such issues - you cannot request a retry.

If the hardware supports it, I don't see a reason to block calling
VIDIOC_S_FMT in the middle of a stream.
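
As a sketch, assuming the SDR capture buffer type from the pending
RFC patches, a mid-stream change would look roughly like this. Note
that the samplerate field is purely hypothetical - whether S_FMT
should carry the sampling rate at all is exactly what is being
discussed here:

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* hypothetical: change the sampling rate mid-stream via S_FMT */
static int sdr_set_rate(int fd, unsigned int rate)
{
	struct v4l2_format fmt;

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_SDR_CAPTURE;	/* from RFC patches */
	if (ioctl(fd, VIDIOC_G_FMT, &fmt))
		return -1;
	fmt.fmt.sdr.samplerate = rate;		/* hypothetical field! */
	return ioctl(fd, VIDIOC_S_FMT, &fmt);
}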

However, on all other hardware, samples will be lost or badly
decoded, which would cause audio/video artifacts or even break the
decoding code if it is not properly written.

Anyway, if samples will be lost anyway, the right thing to do is to
just stop streaming, change the sampling rate and start streaming
again. This way, you'll know that all buffers received before the
change have the old sampling rate, and all new buffers the new one.
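
For completeness, that stop/change/start sequence is the standard
streaming I/O pattern; set_rate() is a hypothetical helper, and with
mmap streaming the buffers also have to be re-queued before the
restart:

#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int restart_with_rate(int fd, unsigned int rate)
{
	int type = V4L2_BUF_TYPE_SDR_CAPTURE;

	if (ioctl(fd, VIDIOC_STREAMOFF, &type))
		return -1;
	if (set_rate(fd, rate))		/* hypothetical helper */
		return -1;
	/* with mmap streaming, re-queue buffers (VIDIOC_QBUF) here */
	return ioctl(fd, VIDIOC_STREAMON, &type);
}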

I cannot agree. For many use cases it is too slow, without any real
benefit.

Also, I am pretty sure many of the hardware DSP implementations do
not restart streaming when they hunt for demodulation lock. There is
likely just a long shift register or FIFO where bits keep running
even while different sampling rates etc. are tested.

I did some study of runtime sampling rate changes and I am very sure they are *required*, especially for digital receivers, like DTV demodulators, where timing is important. The main reason is synchronization - not only when the channel is acquired, but also at run time in order to maintain receiver sync (lock).

Here is one document which explains some reasons and solutions for digital receiver synchronization:
http://www.cs.tut.fi/kurssit/TLT-5806/Synch.pdf

You can find a lot more information by searching for "Synchronization Techniques for Digital Receivers".
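
To make the point concrete: a symbol-timing loop continuously nudges
its effective sampling rate/phase based on a timing-error detector.
A toy Gardner-style step could look like this; the gain and sign
depend on the actual loop design, so this is purely illustrative:

/* x[0] = previous symbol strobe, x[1] = midpoint, x[2] = current
 * strobe, for a real-valued signal at ~2 samples/symbol.
 * Returns the updated samples-per-symbol estimate.
 */
static double gardner_step(const double x[3], double sps)
{
	const double kp = 0.01;			/* illustrative loop gain */
	double err = x[1] * (x[2] - x[0]);	/* Gardner timing error */

	return sps - kp * err;	/* nudge the local rate estimate */
}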

As for the Mirics MSi2500 ADC, it even has a flag to signal when the sampling rate is changed. Thanks to that you will not lose many samples, and it makes a simpler demodulator design possible. It is byte 5 in the USB packet header, which changes between 10/90 when the sampling rate is changed, as I showed in the earlier video. I am pretty sure they had a good reason to add support for runtime sampling rate changes, as it is the only software-based TV demodulator solution currently.
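
Detecting that in the application reduces to watching the header
byte. Something like the sketch below; only byte 5 is described
above, the rest of the header layout is not spelled out here:

#include <stdint.h>

/* returns non-zero when the rate-change flag in byte 5 toggles */
static int rate_changed(const uint8_t *pkt, uint8_t *last_flag)
{
	uint8_t flag = pkt[5];	/* byte 5 of the USB packet header */
	int changed = (flag != *last_flag);

	*last_flag = flag;
	return changed;
}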

**********
So the requirements are what I listed originally, plus: it must be possible to change the sampling rate during streaming.

Now I am testing a solution similar to VIDIOC_ENUM_FREQ_BANDS.
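
Roughly, the idea is an enumeration ioctl modeled on struct
v4l2_frequency_band, but for the supported sampling-rate ranges. The
struct below is only a sketch of that direction, not a merged API:

/* hypothetical, modeled on struct v4l2_frequency_band */
struct v4l2_samplerate_band {
	__u32	tuner;		/* ADC "tuner" index */
	__u32	index;		/* band being enumerated */
	__u32	capability;
	__u32	rangelow;	/* lowest supported rate, in Hz */
	__u32	rangehigh;	/* highest supported rate, in Hz */
	__u32	reserved[11];
};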

regards
Antti

--
http://palosaari.fi/



