Touch simulation doesn't support mobile gestures

Status: NEW (Unassigned)

Product: Firefox
Component: Developer Tools: Responsive Design Mode
Priority: P3
Severity: normal
Reported: a year ago
Modified: 9 hours ago

People

(Reporter: bigben, Unassigned)

Tracking

(Blocks: 2 bugs)

Version: Trunk
Points: ---
Bug Flags: qe-verify+

Firefox Tracking Flags

(firefox50 affected)

Details

(Whiteboard: [multiviewport] [reserve-rdm])

Attachments

(3 attachments)

(Reporter)

Description

a year ago
There are a lot of gestures commonly used on mobile devices to favor more direct forms of interaction. They've become so intuitive that you probably use them without thinking about it:
- Double tap to zoom
- Drag/flick to scroll
- Press and hold to display options (a bit like right click)

Of course, some of them are hard to reproduce with a mouse. I don't really have a scenario with expected/actual results, but I believe we should support this in our RDM touch simulation mode.

Chrome does, and it would really enhance the "mobile feeling" of the tool.
(Reporter)

Comment 1

a year ago
I've looked up the current code for touch simulation, and almost everything is done in the simulator-content script [1]. This JS file is loaded in the viewport by the RDM code using the message manager [2].
The script basically captures every mouse event and translates it into a touch event when the simulator is enabled.
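To make the idea concrete, here is a minimal sketch of that kind of translation. This is not the actual simulator-content.js code; the mapping table, function name, and the shape of the returned object are all illustrative assumptions.

```javascript
// Illustrative sketch only: map a mouse event to the touch event a
// simulator might synthesize. Not the real simulator-content.js logic.
const MOUSE_TO_TOUCH = {
  mousedown: "touchstart",
  mousemove: "touchmove",
  mouseup: "touchend",
};

function translateToTouch(mouseEvent) {
  const touchType = MOUSE_TO_TOUCH[mouseEvent.type];
  if (!touchType) {
    return null; // not an event the simulator cares about
  }
  // A real implementation would dispatch a TouchEvent carrying Touch
  // points; this object only models the coordinates the page would see.
  return {
    type: touchType,
    touches: [{ clientX: mouseEvent.clientX, clientY: mouseEvent.clientY }],
  };
}
```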

We could detect specific events like double click = double tap = force the viewport to zoom, but maybe this is already done somewhere in the codebase, for mobile devices, and we could just trick the platform into thinking this viewport is really on a mobile device.
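For instance, double-tap recognition could be done from tap timestamps alone. A rough sketch; the 300ms window and the names here are made up for illustration, not taken from the Firefox source:

```javascript
// Illustrative only: the 300ms window is an assumed threshold, not a
// value from the Firefox codebase.
const DOUBLE_TAP_WINDOW_MS = 300;

function makeDoubleTapDetector() {
  let lastTapTime = -Infinity;
  return function onTap(timestampMs) {
    // A tap counts as the second half of a double tap if it lands
    // within the window after the previous one.
    const isDoubleTap = timestampMs - lastTapTime <= DOUBLE_TAP_WINDOW_MS;
    lastTapTime = timestampMs;
    return isDoubleTap;
  };
}
```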

Perhaps we should use other functions like InjectTouchEvent to force that behaviour? [3]
I'm not sure what the right approach is here.

[1]: https://dxr.mozilla.org/mozilla-central/source/devtools/shared/touch/simulator-content.js
[2]: https://dxr.mozilla.org/mozilla-central/source/devtools/shared/touch/simulator.js#28
(Reporter)

Comment 2

a year ago
Oops, forgot the link related to my previous comment.
[3]: https://dxr.mozilla.org/mozilla-central/source/dom/ipc/TabParent.cpp#2826
Whiteboard: [multiviewport][triage]
Priority: -- → P3
Whiteboard: [multiviewport][triage] → [multiviewport] [reserve-rdm]
(Reporter)

Updated

a year ago
Depends on: 1288187
(Reporter)

Updated

a year ago
See Also: → bug 1285566
(Reporter)

Comment 3

a year ago
Created attachment 8773304 [details] [diff] [review]
Part 1: Fix platform InjectTouchEvent function
(Reporter)

Comment 4

a year ago
Created attachment 8773305 [details] [diff] [review]
Part 2: Allow per-document control of the meta-viewport preference
(Reporter)

Comment 5

a year ago
Created attachment 8773306 [details] [diff] [review]
Part 3: Adapt RDM to fully support touch simulation
(Reporter)

Comment 6

a year ago
Okay, here's my latest progress on bringing real touch simulation to RDM.
As mentioned in previous comments, the goal here is to use the platform just like on a mobile device, instead of trying to imitate the mobile behaviour with a frame script.

There are 3 parts in this patch:

- Part 1: Vivien Nicolas fixed injectTouchEvent for me; it's an old API used for B2G that provides an easy way to test "real" touch events. However, it's deprecated and we should use SendNativeTouchPoint as soon as it's available on all platforms (see bug 1288187, I'm discussing this with kats). For obscure threading reasons, injectTouchEvent tends to crash a lot, so I had to add a 500ms delay between the mouse events and the resulting touch actions (hopefully this can be removed once we use SendNativeTouchPoint).

- Part 2 adds an attribute in the docshell to manage the meta-viewport preference tab by tab, just like dom.w3c_touch_events.enabled in Bug 970346. This was needed because setting the preference for the whole browser breaks most of the UI.

- Part 3 mostly modifies the touch simulator in RDM to use this new method.
There is an invisible div above the iframe that I called the "touch proxy", and it captures every mouse event to call injectTouchEvent when needed. I also set/remove the meta-viewport docshell attribute and a few other preferences. A lot of useless code in the simulator script is removed in this part as well.
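A rough sketch of that proxy flow, under stated assumptions: `inject` and `schedule` below are placeholders standing in for the platform's injectTouchEvent and setTimeout, and none of the names come from the actual patch.

```javascript
// Illustrative sketch of the "touch proxy": intercept a mouse event and
// schedule the matching touch injection after a delay (the 500ms
// workaround described in Part 1). `inject` and `schedule` are
// placeholders, not real Gecko APIs.
function makeTouchProxy(inject, schedule, delayMs = 500) {
  const typeMap = {
    mousedown: "touchstart",
    mousemove: "touchmove",
    mouseup: "touchend",
  };
  return function onMouseEvent(event) {
    const touchType = typeMap[event.type];
    if (!touchType) {
      return false; // let unrelated events through untouched
    }
    schedule(() => inject(touchType, event.clientX, event.clientY), delayMs);
    return true; // the proxy consumed this mouse event
  };
}
```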

This is still WIP, but it already does some cool things: you can double tap to zoom, drag to scroll, long press to open contextual menu, etc. The touch proxy thing also resolves Bug 1282084 (no more hover events), and the meta-viewport preference fixes a lot of websites that weren't rendered properly in RDM before.

We can also add fluffing easily with this approach (accepting taps even when they aren't exactly on the target), using the ui.touch.radius preferences, and this would justify Bug 1271728.
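To illustrate the fluffing idea (with made-up names and geometry, not Gecko's actual hit-testing code): a tap hits a target if its distance from the target's rectangle is within the allowed radius.

```javascript
// Illustrative sketch of radius-based "fluffing". A tap just outside a
// target rectangle still counts as a hit if the closest point of the
// rectangle is within `radius` pixels of the tap.
function hitsWithFluff(tap, rect, radius) {
  // Distance from the tap to the rectangle along each axis
  // (zero when the tap is inside the rectangle on that axis).
  const dx = Math.max(rect.left - tap.x, 0, tap.x - rect.right);
  const dy = Math.max(rect.top - tap.y, 0, tap.y - rect.bottom);
  // Compare squared distances to avoid a square root.
  return dx * dx + dy * dy <= radius * radius;
}
```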

All suggestions and advice are welcome!
Assignee: nobody → bchabod
Status: NEW → ASSIGNED
Depends on: 1289435
(Reporter)

Updated

a year ago
Depends on: 1289432
(Reporter)

Updated

a year ago
See Also: → bug 1290420
Flags: qe-verify?
Since Benoit's internship has ended, I am going to assume he may not be working on this anymore. If that's incorrect, feel free to pick it up again!
Assignee: be.chabod → nobody
Status: ASSIGNED → NEW
Flags: qe-verify? → qe-verify+
QA Contact: mihai.boldan
See Also: → bug 1318662
Duplicate of this bug: 1336023
See Also: → bug 1378591
Blocks: 1271728
Blocks: 1401304
Duplicate of this bug: 1417724