There are a lot of gestures commonly used on mobile devices to favor more direct forms of interaction. They've become so intuitive that you probably use them without thinking about it:

- Double tap to zoom
- Drag/flick to scroll
- Press and hold to display options (a bit like right click)

Of course, some of them are hard to reproduce with a mouse. I don't really have a scenario with expected/actual results, but I believe we should support this in our RDM touch simulation mode. Chrome does, and it would really enhance the "mobile feeling" of the tool.
I've looked up the current code for touch simulation, and almost everything is done in the simulator-content script [1]. This JS file is loaded in the viewport by the RDM code using the message manager [2]. The script basically captures every mouse event and translates it into a touch event when the simulator is enabled. We could detect specific events (e.g. double click = double tap = force the viewport to zoom), but maybe this is already done somewhere in the codebase for mobile devices, and we could just trick the platform into thinking this viewport is really on a mobile device. Perhaps we should use other functions like InjectTouchEvent to force that behaviour? I'm not sure what the right approach is here.

[1]: https://dxr.mozilla.org/mozilla-central/source/devtools/shared/touch/simulator-content.js
[2]: https://dxr.mozilla.org/mozilla-central/source/devtools/shared/touch/simulator.js#28
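The mouse-to-touch translation described above can be sketched as a simple type mapping. This is an illustration only; the mapping table and function name are assumptions, and the real simulator-content.js also synthesizes and dispatches the actual touch events rather than just renaming them:

```javascript
// Hypothetical sketch of the event-type translation a frame script
// like simulator-content.js performs when simulation is enabled.
// Names here are illustrative, not the actual DevTools API.
const MOUSE_TO_TOUCH = {
  mousedown: "touchstart",
  mousemove: "touchmove",
  mouseup: "touchend",
};

function translateEventType(mouseType) {
  // Returns the equivalent touch event type, or null for mouse
  // events that have no touch counterpart (e.g. mouseover).
  return MOUSE_TO_TOUCH[mouseType] || null;
}
```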
Oops, forgot the link related to my previous comment: https://dxr.mozilla.org/mozilla-central/source/dom/ipc/TabParent.cpp#2826
Priority: -- → P3
Whiteboard: [multiviewport][triage] → [multiviewport] [reserve-rdm]
Created attachment 8773305 [details] [diff] [review] Part 2: Allow per-document control of the meta-viewport preference
Created attachment 8773306 [details] [diff] [review] Part 3: Adapt RDM to fully support touch simulation
Okay, here's my latest progress on bringing real touch simulation to RDM. As mentioned in previous comments, the goal here is to use the platform just like on a mobile device, instead of trying to imitate the mobile behaviour with a frame script. There are 3 parts in this patch:

- Part 1: Vivien Nicolas fixed injectTouchEvent for me; it's an old API used for B2G and provides an easy way to test "real" touch events. However, it's deprecated and we should use SendNativeTouchPoint as soon as it's available on all platforms (see bug 1288187; I'm discussing this with kats). For obscure threading reasons, injectTouchEvent tends to crash a lot, so I had to add a 500ms delay between the mouse events and the resulting touch actions (hopefully this can be removed once we use SendNativeTouchPoint).
- Part 2 adds an attribute on the docshell to manage the meta-viewport preference tab by tab, just like dom.w3c_touch_events.enabled in Bug 970346. This was needed because setting the preference for the whole browser breaks most of the UI.
- Part 3 mostly modifies the touch simulator in RDM to use this new method. There is an invisible div above the iframe that I called the "touch proxy"; it captures every mouse event and calls injectTouchEvent when needed. I set/remove the meta-viewport docshell attribute and a few other preferences. A lot of now-useless code in the simulator script is also removed in this part.

This is still WIP, but it already does some cool things: you can double tap to zoom, drag to scroll, long press to open the contextual menu, etc. The touch proxy also resolves Bug 1282084 (no more hover events), and the meta-viewport preference fixes a lot of websites that weren't rendered properly in RDM before. We can also add fluffing easily with this approach (accepting clicks even when they aren't exactly on the target) using the ui.touch.radius preferences, which would justify Bug 1271728. All suggestions/advice are welcome!
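For what it's worth, the "fluffing" idea mentioned above boils down to a radius-based hit test. The sketch below is purely illustrative; the function name and the 8px default are assumptions, and the real behaviour is driven by the platform's ui.touch.radius.* preferences rather than content-side code:

```javascript
// Hypothetical sketch of radius-based "fluffing": accept a tap as
// hitting a target even when the touch point lands slightly outside
// its bounding rect. Names and the 8px default are illustrative.
function fluffedHit(touchX, touchY, rect, radiusPx = 8) {
  // Clamp the touch point to the target rectangle, then check
  // whether the remaining distance is within the fluff radius.
  const nearestX = Math.min(Math.max(touchX, rect.left), rect.right);
  const nearestY = Math.min(Math.max(touchY, rect.top), rect.bottom);
  const dx = touchX - nearestX;
  const dy = touchY - nearestY;
  return dx * dx + dy * dy <= radiusPx * radiusPx;
}
```

A point inside the rect always hits (the clamped distance is zero); a point just outside hits only while it stays within the radius.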
Assignee: nobody → bchabod
Status: NEW → ASSIGNED
Since Benoit's internship has ended, I am going to assume he may not be working on this anymore. If that's incorrect, feel free to pick it up again!
Assignee: be.chabod → nobody
Status: ASSIGNED → NEW
Flags: qe-verify? → qe-verify+
See Also: → bug 1318662
According to comment 6, it seems Benoit was pretty close to finishing the patch. Based on the duplicates, the people on CC, and questions in forums and other pages, I assume this is a feature many people are missing. Would someone on the team be willing to finish this up? Sebastian
Given that more tablet devices are being used, this feature is becoming more desirable.
Please avoid using needinfo (and Bugzilla generally) just to state an opinion on a feature's importance. This is one feature of thousands that the DevTools team prioritizes. We regularly revisit our feature backlog to decide what to work on next. We do still want feedback though, and Discourse is a good place for that, along with Twitter, IRC, Slack, etc.

Discourse: https://discourse.mozilla.org/c/devtools
Assignee: nobody → mtigley
Status: NEW → ASSIGNED
Comment on attachment 8773304 [details] [diff] [review] Part 1: Fix platform InjectTouchEvent function

This is already in the tree in nsDOMWindowUtils::SendTouchEventCommon, first introduced in Bug 603008 and pre-dating this patch. That seems to indicate the existing pathway is adequate.
Attachment #8773304 - Attachment is obsolete: true
Comment on attachment 8773305 [details] [diff] [review] Part 2: Allow per-document control of the meta-viewport preference Meta viewport is enabled if and only if touch simulation is enabled, since Bug 1290420 landed. This patch is no longer needed to force meta viewport, since any touch gestures will, by definition, have touch simulation enabled.