On Sat, Mar 3, 2012 at 7:59 AM, Chris Bagwell <chris@xxxxxxxxxxxxxx> wrote:
> On Fri, Mar 2, 2012 at 6:21 PM, Dmitry Torokhov
> <dmitry.torokhov@xxxxxxxxx> wrote:
>> Hi Jason,
>>
>> On Fri, Mar 02, 2012 at 10:36:22AM -0800, Jason Gerecke wrote:
>>> The Intuos5 was just announced, and I'm working on adding support for
>>> it to the wacom kernel module. One feature new to the Intuos5 is the
>>> addition of capacitive sensors embedded in the buttons. Each
>>> ExpressKey (button) has two bits of state: one for the capacitive
>>> switch and one for the mechanical switch. These bits are synthesized
>>> into three cases: released, touched, and pressed (the 4th case of
>>> "pressed with a non-capacitive object" isn't particularly
>>> interesting). My initial thought was to call input_event() with a
>>> unique value for the 'touched' case. However, the documentation only
>>> defines three values for EV_KEY: 0 (released), 1 (pressed), and 2
>>> (autorepeat).
>>>
>>> Is 'touched' a case we would want to allow for EV_KEY, or should I be
>>> representing things differently?
>>
>> Differently please. It looks like you have 2 independent objects there -
>> one mechanical switch (persistent on/off) and one is a key (pressed
>> while touched).

I think you're confused. The buttons on the Intuos4/Intuos3/Cintiq 21UX/etc.
report when they've been pressed. The Intuos5 embeds a capacitive sensor
into each button so that it can additionally report "touched". If you ignore
the capacitive sensor, each button is exactly the same as you'd find on our
other tablets, a mouse, or a keyboard: it should be in a "pressed" state
while sufficient force is applied, and in an "unpressed" state otherwise.
It is *not* expected to behave like a persistent toggle switch.

Since "touched" is a natural and distinct button state, it makes sense (to
me at least) to have EV_KEY define it as an additional legal value.

> There is a little overlap here with clickpads, and it would be nice if
> user apps could work similarly for both. The main use (I think) of this
> proposed key press is to give visual feedback on where the user's finger
> is before they press hard enough to activate a button. It could drive the
> HUD feature of the Intuos5, or a taskbar applet like the one the Windows
> Synaptics driver has that shows how the touchpad is being touched.
>
> For clickpads, it's an X/Y value that a HUD/applet would use to give the
> visual feedback. If these new "keys" are not exposed as a full-blown
> capacitive touch strip, then you could simulate that and send Y=1 and
> X=1..4 to represent where the finger is.
>
> The main negative with this is that you'll have the same headache clickpad
> userland has in trying to detect when X/Y means buttons and when it means
> touchpad movement. At least in your case, you could send a BTN_TOOL_STRIP
> or a BTN_TOOL_FINGER + serial # to let userland know this is a unique area
> of touch.
>
> Chris

I'm not sure about this... I don't think there are really any clickpad user
apps out there that would be relevant to the Intuos5, since its buttons
aren't used like a clickpad. While it'd be possible to simulate a clickpad
with the raw data, I'm having a hard time seeing the use for anything beyond
a HUD (which could be written without the faux-position data anyway).

Jason
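
P.S. For concreteness, here is a rough sketch of the "two independent
objects" idea discussed above, i.e. reporting the capacitive bit and the
mechanical bit of one ExpressKey as two separate EV_KEY events. This is
illustration only, not actual wacom driver code; the key codes, helper
name, and parameters are placeholders:

    #include <linux/types.h>
    #include <linux/input.h>

    /*
     * Hypothetical helper: report one ExpressKey as two independent
     * EV_KEY events -- one code for the capacitive "touched" bit and
     * one for the mechanical "pressed" bit.  The codes passed in
     * (e.g. BTN_1 for press, BTN_9 for touch) are made up here.
     */
    static void report_expresskey(struct input_dev *pad,
                                  unsigned int press_code,
                                  unsigned int touch_code,
                                  bool pressed, bool touched)
    {
            /* capacitive sensor: finger resting on the key */
            input_report_key(pad, touch_code, touched);

            /* mechanical switch: key actually pressed */
            input_report_key(pad, press_code, pressed);

            input_sync(pad);
    }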