fader mapping

Fader mapping in general has been tied to the physical fader and its length. 4in/100mm seems to be the maximum real estate people want to use. OK so far. The mapping is set up to use that 4in in the best way possible, but it seems to be the same no matter what the use is. There are three main uses I can see:
	Broadcast - fades to oo (off) happen often
	Live mix  - fades might happen once in a performance (maybe 2 or 3)
	Recording - fades are more often done with some kind of automation, maybe never on a physical fader... or maybe only on bus faders.

Assuming a recording/mixdown situation, would it make sense on a limitless fader to continue at the same dB drop per movement from +6 (or 10 or whatever) all the way down to -110?

My reasoning is this:
By recording, I mean a DAW, and so 32 bit float derived from a 24 bit ADC. Tracks are therefore recorded with more headroom and less compression, as these can be dealt with later. DAWs do not seem to think in terms of a channel strip trim at input so that each track can put the fader in its most accurate position (right around 0 dB), so something loud that is really background may end up with its fader position quite low. This would mean minor adjustments to that track would be difficult. Would it make sense to be able to move the range of a fader (physical or otherwise) so that it goes from -10 to oo rather than from +10 to oo? Think: put one finger on a modifier key, move the fader from where the signal is to where on the fader strip you want it, then release the modifier key. So if the fader is at -20 when this is done, -20 would now be at the 0 dB position.
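
Here is a minimal sketch (Python) of that "slide the range under the fader" idea. The numbers and names are assumptions for illustration, not any DAW's actual mapping: travel is normalized 0.0-1.0, the top of travel defaults to +10 dB, and gain falls off linearly in dB over a 70 dB span before snapping to off at the very bottom.

NEG_INF = float("-inf")
TOP_DB = 10.0          # dB at the top of travel (hypothetical default)
DB_PER_TRAVEL = 70.0   # dB covered by the travel above the snap-to-off point

class Fader:
    def __init__(self):
        self.offset_db = 0.0   # how far the whole range has been slid down

    def position_to_gain(self, pos):
        """Map normalized position (0.0 bottom .. 1.0 top) to gain in dB."""
        if pos <= 0.0:
            return NEG_INF     # the very bottom of travel is always off
        return TOP_DB - (1.0 - pos) * DB_PER_TRAVEL - self.offset_db

    def gain_to_position(self, gain_db):
        """Inverse mapping, used to redraw the fader after the range moves."""
        if gain_db == NEG_INF:
            return 0.0
        return 1.0 - (TOP_DB - self.offset_db - gain_db) / DB_PER_TRAVEL

    def re_anchor(self, current_gain_db, target_pos):
        """Modifier-key gesture: slide the range so current_gain_db sits at target_pos."""
        self.offset_db = TOP_DB - (1.0 - target_pos) * DB_PER_TRAVEL - current_gain_db

f = Fader()
pos_0db = f.gain_to_position(0.0)             # where the 0 dB detent sits by default
f.re_anchor(-20.0, pos_0db)                   # hold modifier, drag the -20 dB channel up to the detent
print(round(f.position_to_gain(pos_0db), 3))  # -> -20.0: fine moves around the detent now nudge this channel

The gesture only changes offset_db; the channel's gain never moves while the range is being slid, and fading to off is still there at the bottom of travel.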

Or, if using a mouse wheel, the same number of wheel clicks would move the gain by the same number of dB at any place on the fader... or a new type of fader might make this possible too.
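
A sketch of the wheel case (again Python; STEP_DB and the clamp range are made-up preferences): the handler works on the gain value directly, so a click is worth the same dB anywhere on the fader.

STEP_DB = 0.5                  # hypothetical "dB per wheel click" preference
MIN_DB, MAX_DB = -110.0, 10.0  # clamp range, also an assumption

def on_scroll(gain_db, clicks):
    """Nudge the channel gain by a fixed dB per click, regardless of fader position."""
    return max(MIN_DB, min(MAX_DB, gain_db + clicks * STEP_DB))

print(on_scroll(-60.0, 4))     # -> -58.0, the same 2 dB move it would be near 0 dB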

The thing is, a fader is no longer an audio pad that can be adjusted; it is a data input device, and as such it just has a linear position output. There is no reason that data and its meaning can't change on the fly as needed. For most mixing (even live) the fader input is either "I need more of that" or "I need less of that". In such a case, the sound the engineer hears is what they go by, not the fader position. If the fader position has to be looked at before it can be changed, it takes the engineer's mind away from the mix momentarily, which would not happen if the operating position were always the same. I am thinking about what would work for a blind person, and wouldn't that be better even for someone with sight?

With the talk about the X-air etc. not too long ago, I have played around with the remote (PC and Android) mixer apps, and the faders work in such a way that touching (clicking) any part of the fader and moving will take the current value and move it in the direction the finger/mouse is moved (Ardour 3/4, and maybe 2, are the same). In my opinion this is the right way. Most touch surfaces do one of the following instead:
	require the finger to touch where the control is right now before it will move (second best),
	jump to wherever the finger hits (bad), or
	worst of all, do not move the fader until the finger reaches the control, but grab it as soon as the finger touches any part of the control, so a finger moving from bottom to top makes the fader jump down to put the middle of the control at the finger and then start moving up (just plain broken).
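
A minimal sketch of that "grab anywhere, move relatively" behaviour, assuming normalized 0.0-1.0 positions and hypothetical touch callbacks:

class RelativeFader:
    def __init__(self, value=0.75):
        self.value = value        # current normalized fader position
        self._touch_start = None  # finger position when the drag began
        self._value_start = None  # fader value when the drag began

    def touch_down(self, y):
        # Do NOT move the fader here -- just remember where the drag started.
        self._touch_start = y
        self._value_start = self.value

    def touch_move(self, y):
        # Apply only the relative movement since touch_down.
        delta = y - self._touch_start
        self.value = max(0.0, min(1.0, self._value_start + delta))

    def touch_up(self):
        self._touch_start = self._value_start = None

f = RelativeFader(0.75)
f.touch_down(0.10)        # finger lands near the bottom of the strip
f.touch_move(0.15)        # a 0.05 upward drag...
print(round(f.value, 3))  # -> 0.8: the value moved by 0.05, it never jumped to 0.15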

Actually, fading to oo would still be possible with the solutions I have mentioned so far. Seems win/win. Another softkey could be used as a reset modifier.

Another solution might be to use a modifier key to make the fader set a channel trim. This could be a good solution too, or even be used as well. I would think that once someone has set up their channel strip with things like compression (or anything level dependent) they would not want to change the input level though.

I am working on a control surface where the brain is a small Linux box. I have been looking at different well known protocols like MCP and HUI, as well as what some of the digital mixers use (the ones that are open about it). I am thinking about how I would want to control a DAW, given that I have control over how it is done.
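
For reference, MCP-style surfaces appear to send fader moves as one MIDI pitch-bend message per strip, with the 14-bit bend value carrying the position; roughly like this (the exact value scaling the DAW expects is an assumption here):

def mcp_fader_move(strip, position):
    """Build the 3 MIDI bytes for one fader move.

    strip:    0-7 for the channel faders (the master fader uses the next MIDI channel)
    position: normalized 0.0..1.0 fader travel
    """
    value = int(round(position * 16383))   # 14-bit pitch-bend range
    status = 0xE0 | (strip & 0x0F)         # pitch bend on the strip's MIDI channel
    lsb = value & 0x7F                     # low 7 bits first
    msb = (value >> 7) & 0x7F              # then the high 7 bits
    return bytes([status, lsb, msb])

print(mcp_fader_move(0, 0.5).hex())        # fader 1 at mid travel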

I am interested in ways that a remote application, either via MIDI or OSC, can find out what plugins a channel has in it and therefore what controls it has that I can play with. I would like to have an Android app that shows the current channel strip's plugins (as selected by the control surface) as a set of tabbed pages, with each tab showing a control GUI for that plugin (wifi based of course). The way Allen & Heath have done it for their digital mixers is nice, but for a DAW where each channel may be different and there may be more than 4 plugins it is probably not the right solution... the GLD is close though.
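
Something like the following is the sort of exchange I have in mind, though the OSC paths here are made up purely for illustration (they are not any DAW's real API); whatever query/reply scheme the DAW's OSC or MIDI surface actually exposes would go in their place:

from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

daw = SimpleUDPClient("192.168.1.10", 3819)   # DAW address and port are assumptions

def on_plugin(address, strip, slot, name, n_params):
    # One reply per plugin slot is enough to build a tab per plugin in the app.
    print("strip %d slot %d: %s (%d parameters)" % (strip, slot, name, n_params))

dispatcher = Dispatcher()
dispatcher.map("/reply/plugin", on_plugin)    # hypothetical reply path

# Ask for the plugin list of the currently selected strip, then wait for replies.
daw.send_message("/query/selected_strip/plugins", [])
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()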

--
Len Ovens
www.ovenwerks.net

_______________________________________________
Linux-audio-user mailing list
Linux-audio-user@xxxxxxxxxxxxxxxxxxxx
http://lists.linuxaudio.org/listinfo/linux-audio-user



