Hi,

> Care to summarize here too what's currently blocking touchscreen buttons
> driver only providing the keyboard events as a separate inputdevice
> without having to mirror all the touchscreen events?

Of course. The touchscreen buttons driver has two primary tasks: it needs to provide the input device with the key events of the buttons, and it needs to somehow prevent these surfaces from generating regular touch events. Currently the driver provides these on two different input devices, one with just the key events and one with all the touch events mirrored except those that the driver wants to filter away.

It would seem best to avoid implementing the second input device by using a filter in the input layer on the original touchscreen to filter away the unwanted events. The interface is, however, not conducive to this, as it requires the filter to immediately decide whether or not to filter an event. That is impossible here, because a touch on a button is the combination of many simultaneous events, and the touchscreen buttons driver cannot possibly know whether to filter an event until the SYN report arrives. This could be worked around by filtering away all events and then retransmitting the ones we do not want to filter later using input_inject_event. The problem with this approach is that we cannot do this from the interrupt context, and doing it later via a workqueue or similar mechanism creates a very difficult problem: we must somehow disable the filter while our injected events are passed to userspace (to avoid retransmitting our own injected events in an endless loop), while not losing any events that might come in at any time via an interrupt.
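To make the shape of the problem concrete, here is a rough sketch of what the filter approach would have to look like. Only the input core pieces (the .filter callback of struct input_handler and input_inject_event()) are real; struct tsbuttons, tsbuttons_contact_on_button() and the buffering are made-up placeholders:

#include <linux/input.h>
#include <linux/kernel.h>
#include <linux/workqueue.h>

/* Sketch only: struct tsbuttons, tsbuttons_contact_on_button() and the
 * buffering are placeholders, only the input core calls are real. */
struct tsbuttons {
	struct input_handle handle;
	struct work_struct inject_work;
	/* ...buffered events of the current packet would live here... */
};

static bool tsbuttons_contact_on_button(struct tsbuttons *ts)
{
	return false; /* placeholder for the real coordinate check */
}

/*
 * .filter is called synchronously for every single event and has to
 * return true (drop) or false (pass) right away.  Whether a contact
 * sits on a button is only known once the whole packet, terminated by
 * EV_SYN/SYN_REPORT, has been seen, so the ABS_* events that precede
 * it can only be buffered and dropped unconditionally...
 */
static bool tsbuttons_filter(struct input_handle *handle, unsigned int type,
			     unsigned int code, int value)
{
	return true; /* swallow everything, decide at SYN_REPORT time */
}

/*
 * ...and the packets we did not actually want to drop have to be
 * re-emitted later, e.g. from a workqueue, since this can not be done
 * from the interrupt context the events arrive in.  The injected
 * events pass through the same filter again, so the filter would have
 * to be bypassed while injecting, without losing events that a new
 * interrupt may deliver at exactly that moment.
 */
static void tsbuttons_inject_work(struct work_struct *work)
{
	struct tsbuttons *ts = container_of(work, struct tsbuttons, inject_work);

	if (!tsbuttons_contact_on_button(ts)) {
		/* replay the buffered packet */
		input_inject_event(&ts->handle, EV_ABS, ABS_X, 0 /* buffered */);
		/* ...remaining buffered events... */
		input_inject_event(&ts->handle, EV_SYN, SYN_REPORT, 0);
	}
}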
The above problem is likely solvable, somehow, but solving it would incur another problem: registering an input filter counts as having the device open, so the underlying touchscreen device can not sleep. That is unacceptable on the devices this driver is intended for, as they have power budgets in the tens of mW. Right now this is solved by the fact that we get open() and close() callbacks from the input layer on the mirrored touchscreen device when userspace opens or closes it; we can then register or unregister the input event handler on the underlying touchscreen device accordingly. If we directly filter the underlying touchscreen device via the method outlined above, we lose this information, since we can not get open() or close() events on the underlying device. We would therefore be forced to keep the filter registered at all times, and power management stops working.
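Roughly, what the current approach relies on is something like this (again just a sketch; the tsbuttons_* names are placeholders, and whether the driver registers/unregisters the whole handler or merely opens/closes its handle is a detail glossed over here):

#include <linux/input.h>

/* Sketch only: tsbuttons_handler stands in for the driver's
 * struct input_handler that attaches to the underlying touchscreen. */
static struct input_handler tsbuttons_handler;

/*
 * The mirrored device belongs to us, so the input core tells us when
 * userspace opens or closes it.  We only attach to the real
 * touchscreen while somebody is actually listening, which leaves the
 * underlying device closed - and free to sleep - the rest of the time.
 */
static int tsbuttons_mirror_open(struct input_dev *dev)
{
	return input_register_handler(&tsbuttons_handler);
}

static void tsbuttons_mirror_close(struct input_dev *dev)
{
	input_unregister_handler(&tsbuttons_handler);
}

/* hooked up when the mirrored device is set up, e.g.:
 *	mirror_dev->open  = tsbuttons_mirror_open;
 *	mirror_dev->close = tsbuttons_mirror_close;
 */

Filtering the underlying touchscreen directly gives us no equivalent notification, which is why the filter would have to stay registered permanently.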
--
Carl Philipp Klemm <philipp@xxxxxxxx> <carl@xxxxxxxx>