Some laptops, especially laptop convertibles with 360° hinges like the Lenovo Yoga series, usually contain two sensors: an accelerometer mounted in the screen, and a gyroscope or a second accelerometer mounted in the keyboard.
iio-sensor-proxy could be extended to also provide the second orientation value, so use-cases like tent or tablet mode can be detected and supported.
It also has to be ensured that iio-sensor-proxy uses the correct accelerometer for screen orientation (not sure if it does already - my laptop has a gyro instead of a second accelerometer). A case where there are only gyros and no accelerometers could also be considered - not sure whether such hardware exists, though.
[edit]
I have played around some more, and it seems the keyboard sensor isn't visible from Linux on my Yoga 2 Pro at all; the available gyroscope seems to come from the screen as well. However, some laptops do expose two accelerometers already, see https://github.com/alesguzik/linux_detect_tablet_mode, so it could be supported by iio-sensor-proxy anyway.
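As a side note, on machines where the systemd sensor hwdb tags the devices with a udev ACCEL_LOCATION property ("display"/"base"), picking the right accelerometer could look roughly like the GUdev sketch below; the helper name is made up, and real code would also have to verify the device actually is an accelerometer.

```c
#include <gudev/gudev.h>

/* Minimal sketch, not actual iio-sensor-proxy code: pick the IIO device
 * whose udev ACCEL_LOCATION property is "display" (as tagged by the
 * systemd sensor hwdb on some machines), falling back to the first IIO
 * device found. */
static GUdevDevice *
find_display_accel (GUdevClient *client)
{
	GList *devices, *l;
	GUdevDevice *found = NULL;

	devices = g_udev_client_query_by_subsystem (client, "iio");
	for (l = devices; l != NULL; l = l->next) {
		GUdevDevice *dev = l->data;

		if (g_strcmp0 (g_udev_device_get_property (dev, "ACCEL_LOCATION"), "display") == 0) {
			g_clear_object (&found);
			found = g_object_ref (dev);
			break;
		}
		if (found == NULL)
			found = g_object_ref (dev);
	}
	g_list_free_full (devices, g_object_unref);
	return found;
}
```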
Detecting tablet mode, or tent mode, is out of scope (and would be utterly unreliable if we did it based on the values of the accelerometers), and should be handled in the kernel and libinput, so that the keyboard and touchpad can be disabled appropriately.
About reporting the accelerometer reading values, see #166 (closed)
I don't think that's right. While the 100%-keyboard-behind-the-screen tablet mode is indeed a kernel/libinput thing, as it's usually based on a hall sensor similarly to the lid switch (and there is already some code for handling it there), some devices also offer a full range of motion with so-called tent and presentation modes. Those have to be detected using values from multiple accelerometers (/gyros/inclinometers/etc.) and are definitely out of kernel and libinput scope.
I probably made a mistake by mentioning tablet mode as an example; I just thought it was the simplest one :P
But yeah, #166 (closed) seems like the interesting one.
I don't think that's right. While the 100%-keyboard-behind-the-screen tablet mode is indeed a kernel/libinput thing, as it's usually based on a hall sensor similarly to the lid switch (and there is already some code for handling it there), some devices also offer a full range of motion with so-called tent and presentation modes. Those have to be detected using values from multiple accelerometers (/gyros/inclinometers/etc.) and are definitely out of kernel and libinput scope.
The kernel wouldn't have to interpret the accelerometer readings; the BIOS or firmware usually does that, and exports the result. That's how mode toggling is usually implemented in platform drivers in the kernel. Then libinput just sees a switch.
Yes, the BIOS even turns off the keyboard and its backlight automatically for me without Linux noticing at all. However, that doesn't use accelerometers at all, at least on my device, and is only a binary switch that detects whether tablet mode is fully engaged or not (and it can be toggled with a magnet put in the right place).
For other modes, you have to use two accelerometers: if the screen stays in the same position and only the keyboard moves (but doesn't fold completely against either side of the screen), a single sensor can't detect it, and yet you might want to disable the keyboard and touchpad when the laptop is standing on them.
@RussianNeuroMancer emailed me about reconsidering support in iio-sensor-proxy for detecting when 360-degree-hinge style (aka yoga) 2-in-1s are folded into tablet mode.
Ideally all 2-in-1s would support SW_TABLET_MODE at the kernel level; libinput already honors that to suppress built-in keyboard and touchpad events on yoga devices in tablet mode.
Recently I have been doing some kernel-work to make SW_TABLET_MODE be reported on a lot more 2-in-1s where the hardware/firmware offers some way to directly read this.
But on several popular budget yoga models, e.g. many Teclast models, many Trekstor models and the Irbis NB111, there is no way to directly get something akin to SW_TABLET_MODE from the device. Instead there are 2 accelerometers, 1 in the base and 1 in the display and the mode of the 2-in-1 needs to be derived from these.
Since iio-sensor-proxy already deals with at least 1 of the 2 accelerometers, it seems that iio-sensor-proxy is the best (least bad) place to add some code to detect the mode. This should then probably be exported through a uinput device which only delivers SW_TABLET_MODE events to integrate with the existing libinput support.
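For reference, a minimal sketch (not an implementation) of what registering such a SW_TABLET_MODE-only uinput device and reporting state changes could look like; error handling is trimmed and the device name is made up.

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/uinput.h>

/* Sketch only: create a uinput device exposing nothing but SW_TABLET_MODE. */
static int
create_tablet_mode_switch (void)
{
	struct uinput_setup setup;
	int fd;

	fd = open ("/dev/uinput", O_WRONLY | O_NONBLOCK);
	if (fd < 0)
		return -1;

	ioctl (fd, UI_SET_EVBIT, EV_SW);
	ioctl (fd, UI_SET_SWBIT, SW_TABLET_MODE);

	memset (&setup, 0, sizeof (setup));
	setup.id.bustype = BUS_VIRTUAL;
	strcpy (setup.name, "iio-sensor-proxy tablet mode switch"); /* made-up name */
	ioctl (fd, UI_DEV_SETUP, &setup);
	ioctl (fd, UI_DEV_CREATE);

	return fd;
}

/* Report a mode change: a SW_TABLET_MODE event followed by SYN_REPORT. */
static void
report_tablet_mode (int fd, int in_tablet_mode)
{
	struct input_event ev[2];

	memset (ev, 0, sizeof (ev));
	ev[0].type  = EV_SW;
	ev[0].code  = SW_TABLET_MODE;
	ev[0].value = in_tablet_mode ? 1 : 0;
	ev[1].type  = EV_SYN;
	ev[1].code  = SYN_REPORT;
	write (fd, ev, sizeof (ev));
}
```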
Which brings me to the goal of commenting here. @hadess, would you be willing to merge a (clean, reviewed) merge request to add such support to iio-sensor-proxy?
Note that AFAIK nothing like this exists yet, but it might help motivate people to work on this if they know that you are not against adding such a thing.
e.g. many Teclast models, many Trekstor models and the Irbis NB111, there is no way to directly get something akin to SW_TABLET_MODE from the device. Instead there are 2 accelerometers, 1 in the base and 1 in the display and the mode of the 2-in-1 needs to be derived from these.
I have to add that most likely all (or almost all) yoga-style devices by Irbis have the same setup. From my experience with their laptops and tablets, Irbis tends to release the same hardware design as several different models with slight differences (battery size, TF or microSD card reader, USB Type-C or Type-A, HDMI or mini HDMI, etc.) but almost the same internals. So they have at least a dozen convertibles similar to the NB111, and most likely a few more dozen other Irbis convertibles with a similar accelerometer setup.
Also, the Irbis NB111 hardware design is an almost exact clone of the JUMPER EZbook X1, so that one is also affected. I checked the JUMPER EZbook X1 and verified that it uses the same accelerometer setup and the same method for disabling the keyboard and touchpad (based on data from the accelerometers).
But on several popular budget yoga models, e.g. many Teclast models, many Trekstor models and the Irbis NB111, there is no way to directly get something akin to SW_TABLET_MODE from the device. Instead there are 2 accelerometers, 1 in the base and 1 in the display and the mode of the 2-in-1 needs to be derived from these.
That's probably the least worst way of doing things, seeing as we already handle one of the accelerometers on those systems. I haven't looked at what re-architecting would be needed, but I imagine there'd be at least some, to allow 2 accelerometers to be handled even if only one is exported directly.
FWIW, the original bug was about, specifically, Lenovo's Yoga systems, not Yoga-likes which don't have the necessary firmware bits to handle this. We should probably have opened a new bug, but this one was generic enough...
So I have managed to get my hands on a laptop which uses 2 accelerometers for this.
And after some duckduckgo-ing I'm pretty sure that I also know how to calculate the angle between the 2 laptop halves.
I was wondering how this could fit inside iio-sensor-proxy, and the answer seems to be that it will be somewhat tricky. We should read both sensors at the same time (or as close in time as possible), so that any acceleration caused by moving the laptop (which is likely to happen while folding it) cancels out in the calculations, since that movement-induced acceleration vector will be the same on both sensors.
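A sketch of one way that angle calculation could work, assuming the hinge runs along the X axis of both sensors and both report axes with the same convention (real devices will likely need per-model mount corrections): project gravity into each sensor's Y-Z plane and take the difference of the two directions.

```c
#include <math.h>

/* Sketch only: estimate the hinge angle from two accelerometer readings
 * taken as close together in time as possible. Assumes the hinge runs
 * along the X axis of both sensors; the result is only meaningful up to
 * a fixed per-device offset and sign, and becomes unreliable when
 * gravity is nearly parallel to the hinge axis (hinge held vertical).
 * Returns degrees in the range [0, 360). */
static double
hinge_angle (double base_y, double base_z,
             double lid_y,  double lid_z)
{
	/* Direction of gravity in each sensor's Y-Z plane. */
	double base = atan2 (base_z, base_y);
	double lid  = atan2 (lid_z, lid_y);
	double angle = (lid - base) * 180.0 / M_PI;

	angle = fmod (angle, 360.0);
	if (angle < 0.0)
		angle += 360.0;
	return angle;
}
```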
But atm things are set up so that the timing of the readings is decided by the accel-driver part of iio-sensor-proxy, and the accel driver calls a callback into the core to report its reading. In this case we want a single function to call out into the drivers, get readings from both sensors and do its thing.
Which also brings me to another question: do we want the uinput device emulating the SW_TABLET_MODE functionality to be registered by the main iio-sensor-proxy process, or do we want a new separate helper process for this?
I think a separate helper process might make more sense. With that said, I've already been thinking a bit about how we could fit this inside the main process, so let me make a braindump of that here. The way I think this could work in the main process: the accel drivers get a way to be told not to do/trigger any readings by themselves, and they get a poll() function to get readings from them without calling the normal ready-callback. The main code could then detect when there are 2 accelerometers (with top/base markings in udev) and in that case tell the accel drivers to stop doing their own thing, and instead poll them from a timer running in the main code, which then reads both and does the orientation handling (using the display sensor readings) as well as the angle calculation.
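A rough sketch of what that poll-based driver interface could look like; none of these names exist in iio-sensor-proxy today, they are just here to illustrate the idea.

```c
#include <glib.h>

/* Hypothetical interface, names made up for illustration. */
typedef struct {
	double x, y, z;
} AccelReading;

typedef struct {
	/* Tell the driver to stop (or resume) scheduling its own readings. */
	void     (*set_self_triggered) (gpointer driver_data, gboolean enabled);
	/* Read the sensor right now, without going through the usual
	 * ready-callback. Returns FALSE on failure. */
	gboolean (*poll)               (gpointer driver_data, AccelReading *reading);
} AccelDriverOps;

/* The core would then run its own timer, call poll() on the display and
 * base drivers back to back, feed the display reading into the existing
 * orientation code and both readings into the hinge-angle calculation. */
```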
With all that said, as I write it down it does not sound so nice, and I personally think adding a new separate helper for this would be better.
Having a separate helper does raise the question of how to activate that helper. I guess the best way is probably to make it a hardware-activated service, triggered by the presence of IIO sensors with a certain ACPI id, as the 2-in-1 models which need this seem to use specific ACPI ids for it.
Some additional notes:
I first have to do some kernel work before I can look at the userspace side of this.
I just realized that the angle will be the same when fully closed and fully folded into tablet mode, so we will need to treat anything close to fully closed as tablet mode too.
Some of the ACPI ids mentioned are also used in models with only 1 accelerometer; this is not really a problem, as we can make the separate service exit cleanly if it does not find both accelerometers within a certain amount of time.
I just realized that the angle will be the same when fully closed and fully folded into tablet mode, so we will need to treat anything close to fully closed as tablet mode too.
Couldn't taking the orientation of both sensors into account help to distinguish the closed and folded states? Not in all cases, I guess, but in at least some cases it could help, right?
Which also brings me to another question: do we want the uinput device emulating the SW_TABLET_MODE functionality to be registered by the main iio-sensor-proxy process, or do we want a new separate helper process for this?
I don't really want or need a separate process for that.
The TODO list, as I see it:
make all the drivers first-class GObjects, so they can be instantiated more than once
make the callbacks from drivers to core object signals, so there can be more than one listener
when two accelerometers are found, and we have a laptop/tablet that needs it (udev tag?), export a fake device through uinput. This portion of the code listens to the accelerometer events itself through signals (see the sketch after this list) and keeps the accelerometer polling alive.
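A minimal sketch of what a driver as a first-class GObject with a "reading" signal could look like; all the names here are made up for illustration.

```c
#include <glib-object.h>

#define SP_TYPE_ACCEL_DRIVER (sp_accel_driver_get_type ())
G_DECLARE_FINAL_TYPE (SpAccelDriver, sp_accel_driver, SP, ACCEL_DRIVER, GObject)

struct _SpAccelDriver {
	GObject parent_instance;
	/* per-instance driver state instead of static globals */
};

G_DEFINE_TYPE (SpAccelDriver, sp_accel_driver, G_TYPE_OBJECT)

enum { SIGNAL_READING, N_SIGNALS };
static guint signals[N_SIGNALS];

static void
sp_accel_driver_class_init (SpAccelDriverClass *klass)
{
	/* "reading" carries the three axes as doubles. */
	signals[SIGNAL_READING] =
		g_signal_new ("reading",
		              G_TYPE_FROM_CLASS (klass),
		              G_SIGNAL_RUN_LAST,
		              0, NULL, NULL, NULL,
		              G_TYPE_NONE,
		              3, G_TYPE_DOUBLE, G_TYPE_DOUBLE, G_TYPE_DOUBLE);
}

static void
sp_accel_driver_init (SpAccelDriver *self)
{
}

/* A driver instance would emit:
 *   g_signal_emit (driver, signals[SIGNAL_READING], 0, x, y, z);
 * and both the orientation code and the tablet-mode code can
 * g_signal_connect() to the same instance. */
```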
I just realized that the angle will be the same when fully closed and fully folded into tablet mode, so we will need to treat anything close to fully closed as tablet mode too.
Couldn't taking the orientation of both sensors into account help to distinguish the closed and folded states? Not in all cases, I guess, but in at least some cases it could help, right?
That's "0 degrees" vs. "360 degrees", at least one of their axis should be going towards each other, or away from each other in that case, no?
I don't really want or need a separate process for that.
Ok. This will probably lead to some challenges wrt the systemd lockdown stuff and accessing /dev/uinput, though? Hmm, I just realized that iio-sensor-proxy already accesses /dev nodes for reading evdev-based accelerometers, so maybe this is a non-issue.
The TODO list, as I see it:
make all the drivers first-class GObjects, so they can be instantiated more than once
Yes, I noticed the static global drvdata thing, so I agree this needs to happen first. Writing GObject code is not something I have a lot of experience with though, so I would appreciate it if you could take care of this part.
make the callbacks from drivers to core object signals, so there can be more than one listener
Ack, but this still leaves the decision of when to read the accelerometer in the driver, whereas for calculating the angle between the 2 halves we really want 2 simultaneous readings.
That's "0 degrees" vs. "360 degrees", at least one of their axis should be going towards each other, or away from each other in that case, no?
That was my first instinct too, but no. That would also be weird because rotating something 360 degrees is a no-op really. For me it helps to think of it like this:
Think about the 360-degree-hinge 2-in-1 lying on a table as a clamshell with the lid closed. That gives us x and y readings of 0G for both the keyboard and the display, and a z reading of 1G for the display and -1G for the keyboard half (this assumes we see a Z vector "coming out of" the keyboard / display as positive).
Now fold that (imaginary) 2-in-1 into tablet mode and lay it on the table with its display facing down (which you would normally never do): we still have x and y readings of 0G for both the keyboard and the display, and a z reading of 1G for the display and -1G for the keyboard half. IOW, the 3D orientation of both halves is still the same; we have just changed the order in which they lie on the table. In clamshell mode the keyboard is the bottom half, and in tablet mode with the display face-down the display is the bottom half.
So I think we need some heuristics for when the angle gets close to 0/360 degrees. For one, we can look at the previous angle, which should cover dynamic changes. That does leave the question of what to do on startup (when the previous angle is unset); for that we could check whether the display Z axis is clearly pointing downwards, in which case we are in clamshell mode with the lid closed.
Note that we could also involve the lid switch, assuming it only reports closed when folded fully closed in clamshell mode and not when folded fully into tablet mode.
That, or we simply always report tablet mode when we are close to 0/360 degrees, since the mode does not really matter when the lid is closed.
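A sketch of that heuristic with made-up thresholds, using the sign convention from the example above (the display reads +1G on Z when the device lies closed, display down, on a table):

```c
#include <glib.h>

/* Decide whether a hinge angle close to 0/360 degrees means "lid closed"
 * (clamshell) or "folded into tablet mode". Thresholds are made up. */
static gboolean
near_closed_is_tablet_mode (double prev_angle,      /* degrees, < 0 if unknown */
                            double display_accel_z) /* in G */
{
	/* Dynamic case: trust the direction we were coming from. */
	if (prev_angle >= 0.0)
		return prev_angle > 180.0;

	/* Startup case: a display Z axis clearly pointing downwards looks
	 * like a closed clamshell lying on a table. */
	if (display_accel_z > 0.8)
		return FALSE;

	/* Otherwise just report tablet mode; the mode hardly matters while
	 * the device is (nearly) fully closed anyway. */
	return TRUE;
}
```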
Anyway, for now I plan to just write a quick and dirty C program with hardcoded sensor paths to play with all this, and then we can take it from there.
Yes, I noticed the static global drvdata thing, so I agree this needs to happen first. Writing GObject code is not something I have a lot of experience with though, so I would appreciate it if you could take care of this part.
It'll be a while before I can get to that, but I'll add it to my TODO list.
Ack, but this still leaves the decision of when to read the accelerometer in the driver, whereas for calculating the angle between the 2 halves we really want 2 simultaneous readings.
Not really; that would just allow us to tell how far apart in time the readings are, whereas what is really necessary (so as to be able to compare them properly) is for the readings to be done at the same time (IOW, immediately after each other).
Not really; that would just allow us to tell how far apart in time the readings are, whereas what is really necessary (so as to be able to compare them properly) is for the readings to be done at the same time (IOW, immediately after each other).
That's necessary to avoid using 2 readings that are too far apart from each other, even if they were read "immediately after each other". And as for reading both accelerometers at the same time, we'd need to have the core drive the polling so things are in sync.
That is for different hardware than what we are discussing here. At least a whole bunch of Medion models, the Teclast F5 and F6 series (and anything sharing similar designs), as well as the Lenovo ThinkPad Yoga 11e series (at least gen 3 and gen 4), do not use the Intel integrated sensor hub at all. The models I mention contain 2 simple accelerometers directly connected to I2C.
These Intel ISH-connected angle meters are a new thing, likely Intel preparing Linux support for upcoming models (probably because of ChromeOS). So yes, we will need some sort of support for these in userspace too, but that support is needed in addition to the support for models with 2 directly accessible accelerometers which we have been discussing here.
We'd still need a new sensor type to be exported. What parts of the system do you expect to consume the hinge sensor's data? libinput? upower?
I would expect libinput to consume this data, adding @whot to the discussion.
@whot, this discussion started as a discussion about how to deal with 360-degree-hinge style 2-in-1s with 2 accelerometers (one in the display and one in the base) in userspace. The plan is to make iio-sensor-proxy read both, calculate the angle between the 2 halves (taking the orientation with regard to the ground into account) and then export SW_TABLET_MODE through a uinput device.
@hadess found some kernel patches which add a new sensor device, found on some (upcoming?) yoga-style 2-in-1s, that directly exports the angle between the 2 halves. So now we are also discussing how to deal with this new case.
AFAIK what libinput really wants to know is the mode the device is in; I don't think it cares about the actual angles.
ATM libinput only distinguishes between:
normal (aka laptop) mode
tablet-mode
At least some of the ThinkPad Yogas (and also some Acer models) can actually distinguish between 3 modes:
laptop-mode
tablet-mode
tent-mode
AFAIK tent-mode is really just a tablet on a stand, so that you don't need to hold it up. So, at least atm, I don't think there is a need to actually treat 2 and 3 differently (if there is, we will need a new kernel API for this, since in the ThinkPad Yoga and Acer case the tent-mode info is only known by the kernel driver).
So the way I think this should work is like this:
1. Have iio-sensor-proxy read the angle info, either by reading the 2 accelerometers and calculating it, or directly on the new devices
2. Have iio-sensor-proxy export SW_TABLET_MODE info based on the angle through a uinput device (see the sketch after this list)
3. If we ever feel the need to export tent-mode info separately, we would need a new SW_TENT_MODE switch (so that the kernel drivers can use it too) and then set both SW_TABLET_MODE=1 (for compat) and SW_TENT_MODE=1 when in tent mode
4. iio-sensor-proxy can then support tent mode by adding SW_TENT_MODE support to the uinput device which it already registers
Where 3 and 4 are things which we may do in the future if necessary.
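A sketch of the angle-to-SW_TABLET_MODE mapping for point 2; the thresholds are made up, and a bit of hysteresis keeps the switch from flapping right at the cut-off.

```c
/* Sketch only: map the hinge angle (0 = closed, 360 = fully folded) to
 * the SW_TABLET_MODE value, with made-up thresholds and hysteresis. */
static int
tablet_mode_from_angle (double angle, int currently_tablet)
{
	if (!currently_tablet && angle > 200.0) /* hypothetical threshold */
		return 1;
	if (currently_tablet && angle < 170.0)  /* hypothetical threshold */
		return 0;
	return currently_tablet;
}
```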
Thanks for the summary! The only reason libinput itself cares about tablet mode is so it can disable the built-in keyboard/touchpad. This was necessary for some devices a few years back that generated ghost touches on the touchpad, or where holding the device like a tablet would inadvertently trigger key events. I'm not sure how common this is on modern devices, but there were some a few years back that prompted this.
In fact, quite a few newer devices need a quirk in libinput to not disable the keyboard because some keys routed through the keyboard are in fact on-screen and still accessible in tablet mode.
For tent mode, we don't need to disable devices, so we could just ignore it. If we did disable devices it still wouldn't matter; you're not going to use the keyboard in that mode anyway. So from my POV tablet and tent mode are the same.
Besides this, all libinput does is forward the event on.