Using libinput-debug-events with a touchscreen
I have a PC laptop with a touchscreen, but I could not find any suitable application to make full use of it: currently, every touch is recognized as a left mouse button click, while multi-finger taps, drags, and pinches are not interpreted.
I tried "touchegg", which catches all the touch events and is able to interpret them, but it seems to be no longer maintained and segfaults quite often.
I ended up trying "libinput-debug-events", which catches all the touchscreen events. Based on this program, I would like to develop one that adds gesture interpretation.
This seems quite feasible to me, since "libinput-debug-events" records the position and timestamp of every touch event; for example, if it records several touch-down events within a few milliseconds, they could be interpreted as a 2-, 3-, 4- or 5-finger tap.
However, I have no experience developing with libinput, so I would need a starting point.
The first step would be to write a program equivalent to "libinput-debug-events", dedicated to the touchscreen. Then I would like to develop the gesture recognition. Finally, I would have to implement sending a keystroke to the currently focused application based on the recognized gesture; for example, a 5-finger tap could mean "Ctrl+C" or "Ctrl+Q", i.e. kill/close the current application and its focused window.
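For step 1, a minimal sketch of a touch-only event loop using libinput's udev backend could look like the following (untested here; it must be linked with `-linput -ludev` and typically run as root or as a member of the `input` group, since libinput opens the raw `/dev/input` devices):

```c
#include <errno.h>
#include <fcntl.h>
#include <poll.h>
#include <stdio.h>
#include <unistd.h>
#include <libinput.h>
#include <libudev.h>

/* libinput does not open device nodes itself; it asks us to. */
static int open_restricted(const char *path, int flags, void *user_data)
{
    int fd = open(path, flags);
    return fd < 0 ? -errno : fd;
}

static void close_restricted(int fd, void *user_data)
{
    close(fd);
}

static const struct libinput_interface interface = {
    .open_restricted = open_restricted,
    .close_restricted = close_restricted,
};

int main(void)
{
    struct udev *udev = udev_new();
    struct libinput *li = libinput_udev_create_context(&interface, NULL, udev);
    libinput_udev_assign_seat(li, "seat0");

    struct pollfd fds = { libinput_get_fd(li), POLLIN, 0 };
    while (poll(&fds, 1, -1) > -1) {
        libinput_dispatch(li);
        struct libinput_event *ev;
        while ((ev = libinput_get_event(li)) != NULL) {
            enum libinput_event_type type = libinput_event_get_type(ev);
            /* Only the touchscreen events matter here; everything
             * else (pointer, keyboard) is discarded. */
            if (type == LIBINPUT_EVENT_TOUCH_DOWN ||
                type == LIBINPUT_EVENT_TOUCH_UP ||
                type == LIBINPUT_EVENT_TOUCH_MOTION) {
                struct libinput_event_touch *t =
                    libinput_event_get_touch_event(ev);
                printf("touch type=%d slot=%d t=%u ms\n",
                       type,
                       libinput_event_touch_get_seat_slot(t),
                       libinput_event_touch_get_time(t));
            }
            libinput_event_destroy(ev);
        }
    }

    libinput_unref(li);
    udev_unref(udev);
    return 0;
}
```

The slot number distinguishes simultaneous fingers, which is what the gesture recognition of step 2 would consume. For step 3, one option is to create a virtual keyboard through the kernel's uinput interface and write the Ctrl+C/Ctrl+Q key events to it; on X11 an easier route might be to shell out to a tool such as `xdotool key ctrl+q` (my assumption; Wayland compositors need a different mechanism).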
I love computer development, and this would also be a nice way for me to get into it.
Thanks for your attention.