Message-ID: <ZFad5ywOpGm+zQLS@nixie71>
Date: Sat, 6 May 2023 13:35:19 -0500
From: Jeff LaBundy <jeff@...undy.com>
To: Peter Hutterer <peter.hutterer@...-t.net>
Cc: Javier Carrasco <javier.carrasco@...fvision.net>,
Thomas Weißschuh <thomas@...ch.de>,
linux-input@...r.kernel.org, devicetree@...r.kernel.org,
linux-kernel@...r.kernel.org,
Dmitry Torokhov <dmitry.torokhov@...il.com>,
Rob Herring <robh+dt@...nel.org>,
Krzysztof Kozlowski <krzysztof.kozlowski+dt@...aro.org>,
Henrik Rydberg <rydberg@...math.org>,
Ulf Hansson <ulf.hansson@...aro.org>,
Hans Verkuil <hverkuil-cisco@...all.nl>,
Stephen Boyd <sboyd@...nel.org>,
Sebastian Reichel <sre@...nel.org>,
Linus Walleij <linus.walleij@...aro.org>,
Jonathan Cameron <Jonathan.Cameron@...wei.com>,
Uwe Kleine-König <u.kleine-koenig@...gutronix.de>,
Bastian Hecht <hechtb@...il.com>,
Michael Riesch <michael.riesch@...fvision.net>
Subject: Re: [RFC v1 0/4] Input: support virtual objects on touchscreens
Hi Peter and Javier,
On Thu, May 04, 2023 at 02:29:27PM +1000, Peter Hutterer wrote:
> On Thu, Apr 27, 2023 at 12:23:14PM -0500, Jeff LaBundy wrote:
> > Hi Javier,
> >
> > On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
> > > Hi,
> > >
> > > On 25.04.23 18:02, Jeff LaBundy wrote:
> > > > Hi Thomas,
> > > >
> > > > On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> > > >> Hi Javier,
> > > >>
> > > >> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> > > >>> Some touchscreens are shipped with a physical layer on top of them where
> > > >>> a number of buttons and a resized touchscreen surface might be available.
> > > >>>
> > > >>> In order to generate proper key events from overlay buttons and adjust the
> > > >>> touch events to a clipped surface, these patches offer a documented,
> > > >>> device-tree-based solution by means of helper functions.
> > > >>> An implementation for a specific touchscreen driver is also included.
> > > >>>
> > > >>> The functions in ts-virtobj provide a simple workflow to acquire
> > > >>> physical objects from the device tree, map them into the device driver
> > > >>> structures as virtual objects and generate events according to
> > > >>> the object descriptions.
> > > >>>
> > > >>> This solution has been tested with a JT240MHQS-E3 display, which uses
> > > >>> the st1624 as a touchscreen and provides two overlay buttons and a frame
> > > >>> that clips its effective surface.
> > > >>
> > > >> There are quite a few notebooks from Asus that feature a printed
> > > >> numpad on their touchpad [0]. The mapping from the touch events to the
> > > >> numpad events needs to happen in software.
> > > >
> > > > That example seems like a fringe use case in my opinion; I think the
> > > > gap filled by this RFC is the case where a touchscreen has a printed
> > > > overlay with a key that represents a fixed function.
> > >
> > > Exactly; this RFC addresses precisely such printed overlays.
> > > >
> > > > One problem I do see here is something like libinput or multitouch taking
> > > > hold of the input device, and swallowing the key presses because it sees
> > > > the device as a touchscreen and is not interested in these keys.
> > >
> > > Unfortunately I am not familiar with libinput or multitouch, so I might be
> > > misunderstanding you, but I suspect the same would apply to any event
> > > consumer that treats touchscreens as touch event producers and nothing else.
> > >
> > > Should they not check the supported events from the device instead of
> > > making such assumptions? This RFC adds key events defined in the device
> > > tree, and they are therefore available and published as device
> > > capabilities. That is, for example, how evtest reports the supported
> > > events, which are then delivered accordingly. Is that not the right way
> > > to do it?
> >
> > evtest is just that, a test tool. It's handy for ensuring the device emits
> > the appropriate input events in response to hardware inputs, but it is not
> > necessarily representative of how the input device may be used in practice.
>
> ftr, I strongly recommend "libinput record" over evtest since it can be
> replayed. And for libinput testing "libinput debug-events" to see what
> comes out of libinput.
>
> > I would encourage you to test this solution with a simple use-case such as
> > Raspbian, and the virtual keys mapped to easily recognizable functions like
> > volume up/down.
> >
> > Here, you will find that libinput will grab the device and declare it to be
> > a touchscreen based on the input events it advertises. However, you will not
> > see the volume up/down keys being handled.
>
> that would be a bug in libinput. libinput doesn't classify devices. It
> uses *internal* backends but the backend for keyboard and touchscreen
> devices is the same. So as long as your device advertises the various
> EV_KEY and EV_ABS bits correctly, things should just work. If that's not
> the case for a device please file a bug.
Please accept my apology for spreading misinformation; the sighting occurred
some time ago and I appear to have mixed up some observations.
I recreated my original issue just now and the problem is actually with LIRC,
which in this case is presenting the hybrid input device to VLC media player
as a remote control.
Prior to launching VLC media player, both touchscreen movement and key events
are handled just fine. Once VLC media player launches and LIRC begins handling
the key events, however, all touchscreen functionality is lost.
Upon closer inspection, it seems that LIRC creates another input device called
"lircd bypass" which relays the "left over" (i.e. touchscreen) events. However,
LIRC does not appear to copy the axis limits, so libinput rightfully rejects
the new device since min ABS_X = max ABS_X = 0.
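For the record, this is easy to confirm from userspace with something along
these lines (a hypothetical snippet, not the actual libinput check; the event
node path is whatever "lircd bypass" shows up as on a given system):

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    int main(void)
    {
            struct input_absinfo abs;
            /* Hypothetical path; substitute the "lircd bypass" event node. */
            int fd = open("/dev/input/event5", O_RDONLY);

            if (fd < 0 || ioctl(fd, EVIOCGABS(ABS_X), &abs) < 0)
                    return 1;

            /* The bypass device reports min = max = 0 here. */
            printf("ABS_X: min=%d max=%d\n", abs.minimum, abs.maximum);
            return 0;
    }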
Therefore, please ignore my sighting with regard to this RFC; it is neither a
bug in libinput nor a valid argument in shaping this RFC. This instead seems
like a possible bug in LIRC, so I will report it there.
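In any case, you are right that as long as the driver advertises the EV_KEY
and EV_ABS bits correctly, things just work. For completeness, that amounts to
roughly the following (a sketch with invented names and limits, not the actual
ts-virtobj code):

    #include <linux/input.h>
    #include <linux/input/mt.h>

    /* Sketch only: advertise overlay keys alongside the touch axes. */
    static int example_advertise_caps(struct input_dev *input)
    {
            int error;

            input_set_capability(input, EV_KEY, KEY_VOLUMEUP);
            input_set_capability(input, EV_KEY, KEY_VOLUMEDOWN);

            /* Invented resolution and slot count for illustration. */
            input_set_abs_params(input, ABS_MT_POSITION_X, 0, 4095, 0, 0);
            input_set_abs_params(input, ABS_MT_POSITION_Y, 0, 4095, 0, 0);

            error = input_mt_init_slots(input, 5, INPUT_MT_DIRECT);
            if (error)
                    return error;

            return input_register_device(input);
    }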
>
> It's still "better" to split it up into different event nodes because
> a lot of userspace may not be able to handle touchscreen+keyboard
> devices, but at least at the libinput level this shouldn't be a problem.
I still agree; if nothing else, for the ability to inhibit different functions
at a more granular level. Therefore it seems best that patch [1/4] not mandate
the two input devices to be the same, which it doesn't appear to do anyway.
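For what it's worth, registering the keys as a second input device from the
same driver is cheap; roughly something like this (a sketch only, with the
device name and key code array invented for illustration):

    #include <linux/device.h>
    #include <linux/input.h>

    /* Sketch only; error handling trimmed, names invented. */
    static int example_register_keypad(struct device *dev,
                                       const unsigned int *codes, int count)
    {
            struct input_dev *keypad;
            int i;

            keypad = devm_input_allocate_device(dev);
            if (!keypad)
                    return -ENOMEM;

            keypad->name = "st1624 overlay keys";
            for (i = 0; i < count; i++)
                    input_set_capability(keypad, EV_KEY, codes[i]);

            return input_register_device(keypad);
    }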
That being said, Javier, feel free to disregard my suggestion that the input
devices in patch [3/4] remain separate. Sorry for the churn; this was still
very helpful for me at least :)
>
> And the xf86-input-libinput driver splits up such devices at the X
> level, so even where a device is touchscreen + keyboard you would end up
> with two X devices with separate capabilities so they fit into the X
> "everything is either a pointer or a keyboard" worldview.
>
> Cheers,
> Peter
>
Kind regards,
Jeff LaBundy