Re: [PATCH] input: Add a detailed multi-touch finger data report

> Some systems (e.g. MERL's DiamondTouch) give you an ID associated
> with the user (in that case, it works by knowing where you are
> sitting, by capacitive coupling). In this case, it is actually where
> the person is sitting that is identified, rather than a particular
> person.

This is similar to what I have experienced with the bcm5974. The chip
outputs an identifier based on position, but in the end, the
information needed is of the kind 'which finger is being lifted', not
'where is it lifted'. Obtaining such tracking information requires the
additional assumption of continuous movement, which makes the
usefulness of position-based identifiers somewhat limited. I left some
finger details out of the spec for that reason.
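
To make the continuity assumption concrete, here is a rough sketch,
not part of the patch, of matching contacts between frames by nearest
position. The names (struct contact, match_contacts) are made up for
illustration, and the matching is deliberately simplistic:

#include <limits.h>

struct contact {
	int x, y;
	int id;		/* identity carried over from the previous frame */
};

/*
 * Give each contact in the new frame the id of the nearest contact
 * in the old frame.  Greedy (two new contacts may grab the same old
 * id), but it shows the point: identity follows from assuming
 * continuous movement, not from anything the hardware reports.
 */
static void match_contacts(const struct contact *old, int n_old,
			   struct contact *cur, int n_cur)
{
	int i, j;

	for (i = 0; i < n_cur; i++) {
		long best = LONG_MAX;
		int best_j = -1;

		for (j = 0; j < n_old; j++) {
			long dx = cur[i].x - old[j].x;
			long dy = cur[i].y - old[j].y;

			if (dx * dx + dy * dy < best) {
				best = dx * dx + dy * dy;
				best_j = j;
			}
		}
		cur[i].id = best_j >= 0 ? old[best_j].id : i;
	}
}

This only works when fingers move a short distance per frame, which is
exactly the assumption I would rather not bake into the protocol.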

> Another case that will be common soon is to be able to sense and
> identify markers on the surface (which can be distinguished from
> each other).  I know of at least three hardware systems able to do
> this. One of these will be in commodity hardware soon enough to
> worry about immediately.

Like putting pins on a map and being able to tell where each pin is?

> So having an ID reported with a touch is clearly needed, whether
> thumb, index finger, or some marker.

If a chip can actually classify fingers as index or thumb, it would
definitely qualify as detailed finger information. Cool.

> Whether such markers would have any user identity directly
> associated with them is less than clear, though we'll certainly
> start giving them such identity either by convention or fiat
> somewhere in the system as the events get processed.  We may also
> face co-located sensors, where two sensors are geometrically on top
> of each other (but might even report different coordinates of
> differing resolutions), but co-aligned.  I'm thinking of the Dell
> Latitude XT in this case, though I don't yet know enough about it to
> know if in fact its pen uses a different sensor than the capacitive
> multi-touch screen.

This sounds similar to the finger classification, although here the
task is to distinguish a pen from a finger.

Looking at these three cases, it seems that adding something like
ABS_MT_TOOL_TYPE to the protocol right away makes sense. The wording
here is chosen with the distinction between (pin1, pin2, index, thumb,
pen) and (pointing-finger, clicking-finger) in mind.
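
To be concrete, a driver could report it per contact along these
lines. This is only a sketch: ABS_MT_TOOL_TYPE and the MT_TOOL_*
values do not exist in the tree yet, so the numbers below are
placeholders, and report_contact() is a hypothetical helper:

#include <linux/input.h>

#ifndef ABS_MT_TOOL_TYPE
#define ABS_MT_TOOL_TYPE	0x37	/* placeholder value */
#define MT_TOOL_FINGER		0	/* placeholder value */
#define MT_TOOL_PEN		1	/* placeholder value */
#endif

static void report_contact(struct input_dev *dev, int x, int y, int tool)
{
	input_report_abs(dev, ABS_MT_POSITION_X, x);
	input_report_abs(dev, ABS_MT_POSITION_Y, y);
	input_report_abs(dev, ABS_MT_TOOL_TYPE, tool);
	input_mt_sync(dev);	/* end of this contact's data */
}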

> Another question is whether an ellipse models a touch adequately at
> the moment; other sensors may report more complex geometric
> information.  There is a slippery slope here, of course.  In the
> extreme case noted above, research systems give you a full image,
> which seems like overkill.
>
> I also note the current input system does not provide any mechanism
> or hint to associate an input device with a particular frame buffer
> or with each other.  Maybe it should, maybe it shouldn't...
> Opinions?
>
> Hope this helps.  The problem here is to draw a line *before* we win
> our complexity merit badge, while leaving things open to be extended
> as more instances of real hardware appear and we have more
> experience.

Right. :-) I believe the ellipse model is adequate, because it is the
simplest model that allows the orientation of a single finger to be
used, for instance to turn a knob. At this point, that seems like a
tough enough challenge.
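
As a sketch of what turning a knob with a single contact could look
like from userspace, a reader of the event stream only needs the
ABS_MT_ORIENTATION axis from the spec. The knob_turn() callback is
hypothetical:

#include <linux/input.h>
#include <unistd.h>

extern void knob_turn(int delta);	/* hypothetical consumer */

/* Feed changes in a single contact's orientation to the knob. */
void track_orientation(int fd)
{
	struct input_event ev;
	int last = 0, have_last = 0;

	while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
		if (ev.type != EV_ABS || ev.code != ABS_MT_ORIENTATION)
			continue;
		if (have_last)
			knob_turn(ev.value - last);
		last = ev.value;
		have_last = 1;
	}
}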
