What's the point of that? - Was: rtirq

Paul, you are not addressing what I actually claimed.

I never claimed anything about MIDI transmission speed and/or
samples.

Learn to read!

I didn't claim anything about syncing audio workstations; I explained
the accuracy of old MIDI workstations synced to tape.

However, that is nothing I wanted to discuss; my original request
was about the rtirq configuration.

Summary:

I made no claims about MIDI speed or samples! Could you
please quote what I actually wrote?

Stop spreading this nonsense about me!

Perhaps you would understand what I wrote, and what was written
by others, if you were aware that we usually bottom-post on mailing
lists.

On Thu, 19 Mar 2015 17:12:53 -0500, Paul Davis wrote:
>As usual madorf is spouting crap.
>
>Sync-to-timecode is complex and many DAWs don't do as good a job as
>some others. The accuracy of the sync has ABSOLUTELY NOTHING to do
>with the speed of MIDI transmission, since all such synchronization
>requires a DLL/PLL anyway. Ardour can sync to an MTC or LTC source
>within 1 sample of the master, but does have the defect that for MTC
>it does not take the input latency from the master into consideration
>(this is in the process of being fixed). Problems with synchronization
>are problems with synchronization NOT problems with MIDI (in general).
>
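For reference, the DLL/PLL Paul mentions is, in its usual form, a
second-order filter over the arrival times of incoming timecode
events. A minimal sketch of the idea in C (my illustration, modelled
on Fons Adriaensen's "Using a DLL to filter time" paper, not Ardour's
actual code):

    /* Second-order delay-locked loop: feed it the raw, jittery
     * arrival time of each timecode event and it converges on a
     * smoothed time and period estimate. */
    #include <math.h>

    typedef struct {
        double t0, t1; /* filtered current / predicted next time */
        double e2;     /* filtered period duration               */
        double b, c;   /* loop gain coefficients                 */
    } dll_t;

    void dll_init(dll_t *d, double now, double period, double bw_hz)
    {
        double omega = 2.0 * M_PI * bw_hz * period;
        d->b = sqrt(2.0) * omega; /* proportional gain */
        d->c = omega * omega;     /* integral gain     */
        d->e2 = period;
        d->t0 = now;
        d->t1 = now + period;
    }

    double dll_update(dll_t *d, double raw_time)
    {
        double e = raw_time - d->t1; /* prediction error        */
        d->t0 = d->t1;               /* smoothed current time   */
        d->t1 += d->e2 + d->b * e;   /* corrected next estimate */
        d->e2 += d->c * e;           /* adjusted period         */
        return d->t0;
    }

The transport follows the smoothed t0, not the raw timestamps, so the
MIDI wire speed contributes a roughly constant offset rather than sync
jitter.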
>There are additional problems, however, with madorf's claims. Recording
>while synced to a timecode source is problematic UNLESS you know for
>sure that the timecode source and the audio clock being used to drive
>the DAW share the same word clock. I know of no DAW that can
>accurately record audio if this is not the case, because the time
>periods required for varispeeding to track the master are too big for
>audio (they may even make a difference with just MIDI data). The whole
>design of synchronization MUST smooth out jitter in the incoming
>timecode signal and this necessarily means ignoring (for now) minor
>variations in the apparent speed of the master. If you tried to throw
>away or add extra samples in an attempt to precisely track the master,
>you'd just be going crazy and accomplishing nothing.
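To put a number on the shared-word-clock point: even a tiny frequency
error between two nominally identical clocks adds up quickly. A
back-of-the-envelope check (my figures, assuming a typical 50 ppm
crystal tolerance):

    #include <stdio.h>

    int main(void)
    {
        double rate = 48000.0; /* nominal sample rate           */
        double ppm  = 50.0;    /* plausible crystal error       */
        double drift = rate * ppm * 1e-6; /* samples per second */

        /* prints: 2.4 samples/s, 144 samples/min */
        printf("%.1f samples/s, %.0f samples/min\n",
               drift, drift * 60.0);
        return 0;
    }

At a couple of samples per second of relative drift, a DAW that tried
to track the master exactly would have to add or drop samples
constantly, which is exactly the "going crazy" Paul describes.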
>
>
>On Thu, Mar 19, 2015 at 3:37 PM, Len Ovens <len@xxxxxxxxxxxxx> wrote:
>
>> On Thu, 19 Mar 2015, Ralf Mardorf wrote:
>>
>>> On Thu, 19 Mar 2015 09:07:34 -0700 (PDT), Len Ovens wrote:
>>>
>>>> So I have to ask myself if what you are hearing is just the
>>>> effects of a slow standard MIDI transport before the info even
>>>> gets to the computer.
>>>>
>>>
>>> You can do an experiment, assuming you still own old computers and
>>> tapes.
>>>
>>> Tape synced to the computer by SMPTE (Atari) or by click (C64).
>>> Record a MIDI synth and after that record the same synth on another
>>> tape track. This will double the synth sound; all you get is a
>>> phasing that doesn't move.
>>>
>>> Do the same with a Linux or Windows PC. Record a track with
>>> Qtractor or Cubase and after that record the same external synth on
>>> another Qtractor or Cubase track. Sounds do not start at the same
>>> time; there's always an audible shift, comparable to slow early
>>> reflections, and the phasing is moving.
>>>
>>
>> That is the first explanation of this that makes sense to me. Thank
>> you. I do not know if it is possible to fix this in Linux or at
>> least in the sequencing SW we have. In a machine that only deals
>> with MIDI (Atari or C64), each midi event has its own time stamp or
>> position based on the OS clock (whatever the sequencing program is
>> using for a time base). In a machine that deals with audio, that
>> time base is an audio buffer length which may contain more than one
>> midi event, but may not contain all midi events that are meant to be
>> together. Not only that, but when the same midi goes in a cycle,
>> there is no guarantee that the events that were within one buffer
>> length will again remain within the same buffer length, and this may
>> go in and out of sync as midi and its time signature may form a
>> beat with the audio media clock. This would be the moving phase you
>> hear.
>>
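For what it's worth, JACK MIDI does carry a per-event sample offset
inside each period, so events can be pinned to a frame rather than to
a whole buffer; whether applications honour it is another question. A
minimal process callback showing where that offset lives (a sketch;
midi_in is a hypothetical port registered elsewhere):

    #include <jack/jack.h>
    #include <jack/midiport.h>

    extern jack_port_t *midi_in; /* hypothetical MIDI input port */

    int process(jack_nframes_t nframes, void *arg)
    {
        void *buf = jack_port_get_buffer(midi_in, nframes);
        jack_nframes_t i, n = jack_midi_get_event_count(buf);

        for (i = 0; i < n; ++i) {
            jack_midi_event_t ev;
            jack_midi_event_get(&ev, buf, i);
            /* ev.time is the frame offset within this period;
             * the output-side counterpart is
             * jack_midi_event_write(). */
        }
        (void)arg; /* unused */
        return 0;
    }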
>> Perhaps setting jack up for 16/3 at 48k would solve that. 16 samples
>> seems to be close to one MIDI byte and most events are three
>> bytes... though a three-note chord would take 9 bytes, and running
>> status would cut that to seven or possibly even 6 (I don't remember
>> if active sensing resets running status). In any case each midi byte
>> should be aligned with the number of samples that best fits that one
>> byte. I don't know if 16/2 would be better or not (read: I don't wish
>> to spend the brain power thinking about it).
>>
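The 16-sample figure checks out: MIDI runs at 31250 baud with 10 bits
on the wire per byte (start + 8 data + stop), so one byte takes 320 us,
which at 48 kHz is about 15.4 samples. A trivial check:

    #include <stdio.h>

    int main(void)
    {
        double baud = 31250.0; /* MIDI wire rate             */
        double bits = 10.0;    /* start + 8 data + stop bits */
        double rate = 48000.0; /* sample rate                */

        /* prints: 15.36 samples per MIDI byte */
        printf("%.2f samples per MIDI byte\n", rate * bits / baud);
        return 0;
    }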
>>> It's not a limitation of MIDI.
>>>
>>
>> It is some of both. A faster MIDI would not solve things unless Linux
>> audio was done differently. Time stamping done at each byte as it
>> arrives would fix this. Using media clock still seems like the
>> right thing to do because in general, the media clock goes with a
>> project from computer to computer. Basically, you are telling me
>> that an audio buffer of 128 samples is too long for good MIDI sync.
>> This can be fixed in two ways: Use a short audio buffer or decouple
>> midi from the audio buffer completely and run midi processing and
>> time stamping separately. The first can be done by anyone who has a
>> machine that can run at 16/2 or 16/3 (maybe even 32/2 would be ok)
>> xrun free. The second would require redoing the sequencer SW and
>> possibly the ALSA midi drivers (I don't know enough about them to
>> say).
>>
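For anyone who wants to try the short-buffer route, the usual jackd
switches for that configuration would look roughly like this (an
untested sketch; hw:0 is a placeholder device, and many interfaces
will refuse, or xrun constantly, at a 16-frame period):

    jackd -R -P70 -d alsa -d hw:0 -r 48000 -p 16 -n 3

That is 16 frames per period and 3 periods per buffer at 48 kHz,
i.e. Len's "16/3".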
>>> However, I won't discuss this again; I just want to know if RTC is
>>> needed in the rtirq config for audio (ALSA/jackd), assuming jackd
>>> doesn't start with "--clocksource OR -c h(pet)". And should rtirq
>>> config include an entry for HRTIMER/HPET? Is it possible to add
>>> HRTIMER/HPET to the rtirq config (e.g. in addition to RTC) and what
>>> is the name of such an entry?
>>>
>>
>> I can't answer if RTC is needed. HRTIMER is probably snd_hrtimer,
>> but I don't know if elevating the priority of that alone would help
>> because of all we discussed above. Also the priority of
>> snd_mpu401_uart and the snd_seq* group of modules may suffer as
>> well... But none of that matters if the midi read/write
>> clock/timestamp is related to the audio buffer, which is how any
>> jack-connected application would do things. Jack does allow a midi port
>> to set which sample a midi event belongs to, but does that timing
>> make it past jack? Do sequencers use this possibility or just send
>> all the midi stuff right now for each graph cycle, knowing there are lots
>> of delays in there anyway?
>>
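Coming back to the original question about the rtirq config itself:
rtirq only boosts kernel IRQ threads whose names match the entries in
RTIRQ_NAME_LIST, highest priority first. An illustrative
/etc/default/rtirq (paths and defaults vary by distro; treat the
values as an example, not a recommendation):

    # Threaded IRQs to boost, highest priority first.
    RTIRQ_NAME_LIST="rtc snd usb i8042"
    RTIRQ_PRIO_HIGH=90   # priority for the first name in the list
    RTIRQ_PRIO_DECR=5    # step down for each following name
    RTIRQ_RESET_ALL=0    # don't reset other IRQ threads

Whether an hrtimer/hpet entry could work at all depends on the kernel
exposing a matching threaded IRQ for it; if there is none, rtirq has
nothing to act on, which would explain why no standard entry name
exists for it.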
>> Maybe those fussy people who decided AES67 should support at least
>> 1ms latency are right.
>>
>>
>> --
>> Len Ovens
>> www.ovenwerks.net
>>

_______________________________________________
Linux-audio-user mailing list
Linux-audio-user@xxxxxxxxxxxxxxxxxxxx
http://lists.linuxaudio.org/listinfo/linux-audio-user



