Split the detection of tablet devices, touch-enabled devices and keyboard-less devices
Categories
(Core :: Widget: Win32, task, P2)
Tracking
()
People
(Reporter: gsvelto, Unassigned)
References
(Blocks 1 open bug)
Details
(Whiteboard: [win:touch])
Follow-up from bug 1722208. In that bug I discovered that we conflate the concept of having a touch screen as the input device with having a tablet form factor, i.e. not having a keyboard. As I discovered in bug 1722208 comment 14 this is a problem: we want to display the on-screen keyboard if we don't have a physical keyboard, and we want to change certain aspects of the layout if the primary input is a touch screen. Unfortunately the two concepts don't necessarily match; here are a few examples:
- A tablet with a keyboard attached: we want to show a tablet UI (because of touch input) but no on-screen keyboard.
- A convertible slate in tablet mode: we want to show a tablet UI and an on-screen keyboard (but this doesn't work before bug 1722208 on Windows 11 because Microsoft changed the semantics of the `UIViewSettings.UserInteractionMode` property, or maybe it's just buggy).
- A desktop with a touch screen but no keyboard (think of an ATM or smart display): it's not a tablet, but we most certainly want the tablet UI with larger buttons (because of touch) and the on-screen keyboard (because there's no keyboard).
- A desktop with no touch screen and no keyboard but with a mouse attached: the layout would be the regular desktop one (because of the mouse) but we'd need to pop up an on-screen keyboard (because there's no keyboard). While this might seem like a contrived use case, I know at least one user who can use a trackball or a touchpad but has significant problems using a keyboard.
How do we do this? Here's the laundry list:
- We implement a function to detect whether a keyboard is present. Chrome has some code that enumerates all keyboards; we might use something like that. We don't want to copy everything they're doing, because they also check whether a device is a tablet and assume it has no keyboard, which would yield the wrong result as in bug 1722208.
- We modify the code that decides whether to show the on-screen keyboard to use the function defined above instead of `inTabletMode`.
- We audit the code that uses `inTabletMode` to be sure it really cares about the layout and not the touch input (this and this). We might even want to rename that property to something like `touchDisplayDevice`.
- We adjust telemetry to split the two concepts.
- We implement mouse detection without using `IsTabletDevice()`, because that's also subject to the cause of bug 1722208. Or we fix `IsTabletDevice()`; more on this later.
- We also adjust `WinUtils::GetPrimaryPointerCapabilities()` to not assume that a tablet has touch input as its primary input. It might have a mouse, and if the user plugged one in they're likely using it.
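For the keyboard-presence item above, here is a minimal, hedged sketch of the enumeration approach, similar in spirit to what Chrome does: walk the devices in the keyboard setup class and see whether any physical keyboard shows up. The function name and the (absent) filtering heuristics are illustrative assumptions, not Gecko's or Chrome's actual code.

```cpp
// Sketch: detect whether a physical keyboard is present by enumerating
// devices in the "Keyboard" setup class. Windows-only; link setupapi.lib.
#include <windows.h>
#include <setupapi.h>
#include <devguid.h>  // GUID_DEVCLASS_KEYBOARD

static bool HasPhysicalKeyboard() {
  // Enumerate all currently-present devices in the keyboard device class.
  HDEVINFO devInfo = SetupDiGetClassDevs(&GUID_DEVCLASS_KEYBOARD, nullptr,
                                         nullptr, DIGCF_PRESENT);
  if (devInfo == INVALID_HANDLE_VALUE) {
    return false;
  }

  bool found = false;
  SP_DEVINFO_DATA data = {sizeof(SP_DEVINFO_DATA)};
  for (DWORD i = 0; SetupDiEnumDeviceInfo(devInfo, i, &data); ++i) {
    // NOTE: a real implementation would have to filter out virtual and
    // on-screen keyboard drivers here; merely counting entries is not
    // enough, which is part of why this detection is hard. Crucially,
    // it must NOT assume "tablet => no keyboard" (see bug 1722208).
    found = true;
    break;
  }
  SetupDiDestroyDeviceInfoList(devInfo);
  return found;
}
```

This only answers the "is a keyboard present?" half of the split; the layout decision would still come from a separate touch-input signal.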
As for `IsTabletDevice()`, it does a lot of things similar to what Chrome does (see here and here). Chrome's version also uses Windows 10 tablet mode to override every other check, so it has the same issue as ours. What we need to do is get hold of some devices and actually test what they return for the various calls, to get an idea of which checks are (apparently) reliable for detecting a tablet (or something behaving tablet-like) and which are not.
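One of the calls worth testing on real hardware is the platform-role query. The following is a hedged sketch, not Gecko code: `PowerDeterminePlatformRoleEx` is a documented Win32 API, and `PlatformRoleSlate` is the value firmware is supposed to report for a slate/tablet form factor, but whether devices actually report it correctly is exactly what needs verifying.

```cpp
// Sketch: ask the firmware what form factor it claims to be.
// Windows-only; link PowrProf.lib.
#include <windows.h>
#include <powerbase.h>

static bool FirmwareReportsSlate() {
  // POWER_PLATFORM_ROLE_V2 asks for the Windows 8+ set of role values,
  // which includes PlatformRoleSlate.
  POWER_PLATFORM_ROLE role =
      PowerDeterminePlatformRoleEx(POWER_PLATFORM_ROLE_V2);
  return role == PlatformRoleSlate;
}
```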
The results can sometimes be counterintuitive. For example, this returns false on my device when it's in tablet mode on Windows 11. That's because the `AR_LAPTOP` bit appears to be set when it shouldn't be (so maybe it's the same underlying issue causing `UIViewSettings.UserInteractionMode` to be wrong). Additionally, there are desktop screens that can be rotated into portrait mode and probably support auto-rotation... but they are most definitely not tablets.
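For reference, the auto-rotation check in question boils down to something like the sketch below (function name and exact bit handling are illustrative). `GetAutoRotationState` and the `AR_*` flags are real Win32 definitions; the point of the paragraph above is that the `AR_LAPTOP` bit can be wrong, and auto-rotating desktop monitors would also pass, so this can only ever be one signal among several.

```cpp
// Sketch: a typical auto-rotation-based tablet heuristic. Windows-only.
#include <windows.h>

static bool LooksLikeRotatableTablet() {
  AR_STATE state = AR_ENABLED;
  if (!GetAutoRotationState(&state)) {
    return false;  // API failed; assume not a tablet.
  }
  // Rotation unsupported, no sensor, or the device claims to be a
  // laptop or docked => treat as non-tablet. AR_LAPTOP is the bit
  // observed to be incorrectly set on Windows 11 in tablet mode.
  if (state & (AR_NOT_SUPPORTED | AR_NOSENSOR | AR_LAPTOP | AR_DOCKED)) {
    return false;
  }
  return true;
}
```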
Updated•2 years ago

Comment 1•2 years ago
Probably not an S2.
Reminder, S2 means: (Serious) Major functionality/product severely impaired or a high impact issue and a satisfactory workaround does not exist