On Sat, Aug 20, 2011 at 03:07:41PM +0800, Daniel Kurtz wrote:
> On Sat, Aug 20, 2011 at 6:22 AM, Dmitry Torokhov
> <dmitry.torokhov@xxxxxxxxx> wrote:
> > Hi Daniel,
> >
> > On Thu, Aug 18, 2011 at 07:28:03PM +0800, Daniel Kurtz wrote:
> >> @@ -558,6 +626,11 @@ static void synaptics_process_packet(struct psmouse *psmouse)
> >>  	if (synaptics_parse_hw_state(psmouse->packet, priv, &hw))
> >>  		return;
> >>
> >> +	if (SYN_CAP_IMAGE_SENSOR(priv->ext_cap_0c)) {
> >> +		synaptics_image_sensor_process(psmouse, &hw);
> >> +		return;
> >> +	}
> >> +
> >
> > So what about the rest of the Synaptics processing (wheel, additional
> > buttons, etc.)? Are we sure that touchpads with image sensors will
> > never implement them?
> >
> > --
> > Dmitry
>
> All image sensors that I am aware of are clickpads, with one button
> integrated under the pad, which is reported as the middle button.
>
> We could report right, middle, up, down, and ext_buttons (scroll is
> not possible, since when w=2, buf[1] is used for x on devices that
> send AGM packets).
> However, I have no way of knowing whether this added complexity is
> necessary

I would prefer that we support the full protocol, so that if some
vendor does add additional buttons we have everything in place.

> , nor any way of testing it.

As long as your case still works I think it will be good enough.

Thanks.

--
Dmitry
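
[Editor's note: for illustration of the "full protocol" button reporting
being discussed, here is a minimal sketch of a helper that the image-sensor
path could call. It assumes the synaptics_hw_state fields (left, right,
middle, up, down, ext_buttons) and capability macros (SYN_CAP_MIDDLE_BUTTON,
SYN_CAP_FOUR_BUTTON, SYN_CAP_MULTI_BUTTON_NO) already present in
drivers/input/mouse/synaptics.c of that era; synaptics_report_buttons()
itself is a hypothetical name, not code from the posted patch.]

	/*
	 * Hypothetical helper: report the full Synaptics button set,
	 * mirroring the button handling in synaptics_process_packet(),
	 * so synaptics_image_sensor_process() could reuse it.
	 */
	static void synaptics_report_buttons(struct psmouse *psmouse,
					     const struct synaptics_hw_state *hw)
	{
		struct input_dev *dev = psmouse->dev;
		struct synaptics_data *priv = psmouse->private;
		int ext_bits, i;

		input_report_key(dev, BTN_LEFT, hw->left);
		input_report_key(dev, BTN_RIGHT, hw->right);

		if (SYN_CAP_MIDDLE_BUTTON(priv->capabilities))
			input_report_key(dev, BTN_MIDDLE, hw->middle);

		if (SYN_CAP_FOUR_BUTTON(priv->capabilities)) {
			input_report_key(dev, BTN_FORWARD, hw->up);
			input_report_key(dev, BTN_BACK, hw->down);
		}

		/* Extended buttons come in BTN_0/BTN_1 pairs, as in the
		 * legacy packet path. */
		ext_bits = (SYN_CAP_MULTI_BUTTON_NO(priv->ext_cap) + 1) >> 1;
		for (i = 0; i < ext_bits; i++) {
			input_report_key(dev, BTN_0 + 2 * i,
					 hw->ext_buttons & (1 << i));
			input_report_key(dev, BTN_1 + 2 * i,
					 hw->ext_buttons & (1 << (i + ext_bits)));
		}
	}

Calling such a helper from synaptics_image_sensor_process() would give
image-sensor clickpads the same button handling as the legacy path,
which is the "everything in place" outcome Dmitry is asking for; scroll
stays out, per Daniel's note that buf[1] carries x in w=2 AGM packets.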