Message-ID: <20230504042927.GA1129520@quokka>
Date: Thu, 4 May 2023 14:29:27 +1000
From: Peter Hutterer <peter.hutterer@...-t.net>
To: Jeff LaBundy <jeff@...undy.com>
Cc: Javier Carrasco <javier.carrasco@...fvision.net>,
Thomas Weißschuh <thomas@...ch.de>,
linux-input@...r.kernel.org, devicetree@...r.kernel.org,
linux-kernel@...r.kernel.org,
Dmitry Torokhov <dmitry.torokhov@...il.com>,
Rob Herring <robh+dt@...nel.org>,
Krzysztof Kozlowski <krzysztof.kozlowski+dt@...aro.org>,
Henrik Rydberg <rydberg@...math.org>,
Ulf Hansson <ulf.hansson@...aro.org>,
Hans Verkuil <hverkuil-cisco@...all.nl>,
Stephen Boyd <sboyd@...nel.org>,
Sebastian Reichel <sre@...nel.org>,
Linus Walleij <linus.walleij@...aro.org>,
Jonathan Cameron <Jonathan.Cameron@...wei.com>,
Uwe Kleine-g <u.kleine-koenig@...gutronix.de>,
Bastian Hecht <hechtb@...il.com>,
Michael Riesch <michael.riesch@...fvision.net>
Subject: Re: [RFC v1 0/4] Input: support virtual objects on touchscreens
On Thu, Apr 27, 2023 at 12:23:14PM -0500, Jeff LaBundy wrote:
> Hi Javier,
>
> On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
> > Hi,
> >
> > On 25.04.23 18:02, Jeff LaBundy wrote:
> > > Hi Thomas,
> > >
> > > On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> > >> Hi Javier,
> > >>
> > >> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> > >>> Some touchscreens are shipped with a physical overlay on top of them
> > >>> that may provide a number of buttons and a resized touchscreen surface.
> > >>>
> > >>> In order to generate proper key events from overlay buttons and adjust
> > >>> the touch events to a clipped surface, these patches offer a documented,
> > >>> device-tree-based solution by means of helper functions.
> > >>> An implementation for a specific touchscreen driver is also included.
> > >>>
> > >>> The functions in ts-virtobj provide a simple workflow to acquire
> > >>> physical objects from the device tree, map them into the device driver
> > >>> structures as virtual objects and generate events according to
> > >>> the object descriptions.
> > >>>
> > >>> This solution has been tested with a JT240MHQS-E3 display, which uses
> > >>> the st1624 as a touchscreen and provides two overlay buttons and a frame
> > >>> that clips its effective surface.
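For anyone curious what such a description could look like, here is an invented sketch of a device tree fragment (the node and property names are illustrative only, not the RFC's actual binding schema):

```dts
/* Hypothetical binding sketch; the overlay node and its properties
 * are invented for illustration and may differ from the RFC. */
touchscreen@55 {
	compatible = "sitronix,st1624";
	reg = <0x55>;

	touch-overlay {
		/* One printed button occupying a corner of the panel,
		 * reported as a volume-up key press. */
		button@0 {
			label = "volume-up";
			linux,code = <KEY_VOLUMEUP>;
			x-origin = <0>;
			y-origin = <0>;
			x-size = <40>;
			y-size = <40>;
		};
	};
};
```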
> > >>
> > >> There are quite a few notebooks from Asus that feature a printed
> > >> numpad on their touchpad [0]. The mapping from the touch events to the
> > >> numpad events needs to happen in software.
> > >
> > > That example seems like a fringe use case in my opinion; I think the
> > > gap filled by this RFC is the case where a touchscreen has a printed
> > > overlay with a key that represents a fixed function.
> >
> > Exactly; this RFC addresses precisely such printed overlays.
> > >
> > > One problem I do see here is something like libinput or multitouch taking
> > > hold of the input device, and swallowing the key presses because it sees
> > > the device as a touchscreen and is not interested in these keys.
> >
> > Unfortunately I do not know libinput or multitouch and I might be
> > misunderstanding you, but I guess the same would apply to any event
> > consumer that treats touchscreens as producers of touch events and nothing else.
> >
> > Should they not check the supported events from the device instead of
> > making such assumptions? The key events this RFC adds are defined in the
> > device tree and are therefore published as device capabilities. That is,
> > for example, how evtest reports the supported events, which are then
> > delivered accordingly. Is that not the right way to do it?
>
> evtest is just that, a test tool. It's handy for ensuring the device emits
> the appropriate input events in response to hardware inputs, but it is not
> necessarily representative of how the input device may be used in practice.
ftr, I strongly recommend "libinput record" over evtest since it can be
replayed. And for libinput testing "libinput debug-events" to see what
comes out of libinput.
> I would encourage you to test this solution with a simple use-case such as
> Raspbian, and the virtual keys mapped to easily recognizable functions like
> volume up/down.
>
> Here, you will find that libinput will grab the device and declare it to be
> a touchscreen based on the input events it advertises. However, you will not
> see the volume up/down keys being handled.
that would be a bug in libinput. libinput doesn't classify devices. It
uses *internal* backends, but the backend for keyboard and touchscreen
devices is the same. So as long as your device advertises the various
EV_KEY and EV_ABS bits correctly, things should just work. If that's not
the case for a device, please file a bug.
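As a toy illustration of that capability check (this is not libinput's actual code; the numeric constants are the standard values from linux/input-event-codes.h, and the helper function is invented for the sketch):

```python
# Toy sketch: decide what a device supports purely from its advertised
# capability bits, the way evtest or libinput inspect EVIOCGBIT data.
# Event type/code values match linux/input-event-codes.h.
EV_KEY, EV_ABS = 0x01, 0x03
KEY_VOLUMEUP, KEY_VOLUMEDOWN = 115, 114
ABS_MT_POSITION_X, ABS_MT_POSITION_Y = 0x35, 0x36

def describe(capabilities):
    """capabilities: dict mapping event type -> set of event codes."""
    features = []
    abs_codes = capabilities.get(EV_ABS, set())
    if {ABS_MT_POSITION_X, ABS_MT_POSITION_Y} <= abs_codes:
        features.append("multitouch")
    if capabilities.get(EV_KEY):
        features.append("keys")
    return features

# A touchscreen with two overlay buttons mapped to volume up/down
# (hypothetical capability set for a device like the one in this RFC):
caps = {
    EV_ABS: {ABS_MT_POSITION_X, ABS_MT_POSITION_Y},
    EV_KEY: {KEY_VOLUMEUP, KEY_VOLUMEDOWN},
}
print(describe(caps))  # both the touch and the key capabilities are visible
```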
It's still "better" to split it up into different event nodes because
a lot of userspace may not be able to handle combined touchscreen+keyboard
devices, but at least at the libinput level this shouldn't be a problem.
And the xf86-input-libinput driver splits up such devices at the X
level, so even where a device is touchscreen + keyboard you would end up
with two X devices with separate capabilities so they fit into the X
"everything is either a pointer or a keyboard" worldview.
Cheers,
Peter