Closed Bug 927172 Opened 11 years ago Closed 11 years ago

Can't touch scroll when second monitor is attached - event coordinate origin issues with dual monitors

Categories

(Firefox for Metro Graveyard :: Pan and Zoom, defect)

x86_64
Windows 8
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED FIXED
Firefox 28

People

(Reporter: rsilveira, Assigned: rsilveira)

References

Details

(Whiteboard: [block28])

Attachments

(1 file, 3 obsolete files)

Touch scrolling does not work with a second (non-touch) monitor attached to my Carbon X1. It was definitely working a couple of weeks back, and I've always had a second monitor attached.
It also ignores double and single taps. My console output indicates that the apzController is reporting that it is ignoring all touch input. It was definitely working before I went on PTO.
I'm unable to reproduce this. I have a Surface Pro connected to an external Dell monitor via a DisplayPort cable. I have selected the "extend" mode for display mirroring (i.e. the display is not mirrored but I have a second desktop). I can start FxMetro on the Surface display, load pages, and pan around fine. The second monitor just shows the default Windows background/desktop the whole time. Is there something else I should try to repro this?
I tried on my Surface Pro running Windows 8.1 with the same external monitor that failed on the Carbon X1 and couldn't reproduce it either.
I get correct behavior on the Surface Pro in extend mode, but it was with a different monitor, if that matters. :kats, my headache requires having Visual Studio running and attached to the immersive MetroFx. I am wondering if Windows updates might be behind some of this.
Ah, now I was able to narrow it down more and reproduce on the Surface Pro too. You also need to set the external monitor as your "main display" in the screen resolution settings. You can then drag any running Metro application from the external monitor to the laptop screen, and Metro apps will open there from then on. When I do that on the Surface, I actually get a half-size window rendered in the top-left corner of the screen, although touch responds as if it were rendered full-screen. It seems like it's looking for DPI information on the external monitor instead of the one where it's being rendered. Maybe we also disable APZ when the hardware is not touch-enabled and the detection only looks at the main display?
Is this still a valid bug, or did we figure out this was by design?

> When I do that on the Surface, I actually get a half-size window rendered in the top-left corner of the screen, although touch responds as if it were rendered full-screen. It seems like it's looking for DPI information on the external monitor instead of the one where it's being rendered. Maybe we also disable APZ when the hardware is not touch-enabled and the detection only looks at the main display?

Rodrigo, mind filing a follow-up on this? I don't think it belongs here.
It's still valid. I don't think it's by design, because it works when APZ is off. I'll file a follow-up for the Surface Pro issue after I try it with APZ disabled.
(In reply to Rodrigo Silveira [:rsilveira] from comment #7)
> It's still valid. I don't think it's by design, because it works when APZ is off. I'll file a follow-up for the Surface Pro issue after I try it with APZ disabled.

You might try putting a few breakpoints or enabling logging in MetroInput.cpp to see whether we're getting events at all vs. something going wrong with routing events to the right APZ.
Attached patch Hack (obsolete) — Splinter Review
The issue is related to mixed coordinate systems, as mbrubeck suspected. My setup has the laptop touch screen to the left of the external monitor, like [1]. The hit test fails at [2] because the frame metrics composition bounds take the monitor origin into consideration - so visibleRegionContains expects the x position to be between -1600 and 0 in my case. However, IPointerPoint::Position returns a positive number at [3], so the hit test always fails. Either we convert the positions to virtual screen positions, or we force the origin to 0,0 when doing the hit test. This hack fixes it by forcing the origin to 0,0, just because that was easier.

When using a mouse I saw both positive and negative coords at some point; that needs some more investigation. Both mouse and touch work fine with this fix.

[1] http://msdn.microsoft.com/en-us/library/windows/desktop/dd145136%28v=vs.85%29.aspx
[2] https://mxr.mozilla.org/mozilla-central/source/gfx/layers/composite/APZCTreeManager.cpp#672
[3] https://mxr.mozilla.org/mozilla-central/source/widget/windows/winrt/MetroInput.cpp#71
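To make the coordinate mismatch concrete, here is a minimal sketch of the failing containment check and what the hack does; the struct and function names are illustrative stand-ins, not the actual APZCTreeManager code.

#include <cstdio>

// Illustrative stand-in for the composition-bounds containment check.
struct IntRect {
  int x, y, width, height;
  bool Contains(int aX, int aY) const {
    return aX >= x && aX < x + width && aY >= y && aY < y + height;
  }
};

// With the external monitor as primary and the touch screen to its left,
// the composition bounds carry a virtual-screen offset (x in [-1600, 0)),
// while IPointerPoint::Position is monitor-relative and always >= 0, so a
// naive Contains() check never succeeds.
bool HitTestWithHack(IntRect aCompositionBounds, int aTouchX, int aTouchY) {
  // The hack from the attached patch: drop the virtual-screen offset so the
  // monitor-relative touch position can land inside the bounds.
  aCompositionBounds.x = 0;
  aCompositionBounds.y = 0;
  return aCompositionBounds.Contains(aTouchX, aTouchY);
}

int main() {
  IntRect bounds = { -1600, 0, 1600, 900 };  // touch screen in virtual-screen space
  std::printf("hit: %d\n", HitTestWithHack(bounds, 300, 200));  // prints 1
  return 0;
}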
Assignee: nobody → rsilveira
Status: NEW → ASSIGNED
I've never had dual monitors or debugged anything related, so I'm not sure how the win32 widget deals with this. It feels like we should have coordinates that are normalized to an origin of 0,0, although we share code with the win32 module now, so this might present problems down the road. I'm not sure what modules like WinMouseScrollHandler do in this situation, but we need to match them since they are shared.
Summary: Can't touch scroll when second monitor is attached → Can't touch scroll when second monitor is attached - event coordinate origin issues with dual monitors
Just to be sure I understand, the mCompositionBounds reflects the coordinates of the monitor inside the "virtual screen" coordinate space? (i.e. in your scenario with the external monitor set as primary and to the right of your touch screen, the touch screen has negative coordinates). And the touch events you're receiving are relative to the actual monitor they're being received from? Is there a field on the touch events that indicates which monitor they're coming from, and what the monitor coordinates are in the virtual screen space? What about mouse events, are they relative to the monitor they're being received from, or are they relative to the virtual screen space?
(In reply to Kartikaya Gupta (email:kats@mozilla.com) from comment #11)
> Just to be sure I understand, the mCompositionBounds reflects the coordinates of the monitor inside the "virtual screen" coordinate space? (i.e. in your scenario with the external monitor set as primary and to the right of your touch screen, the touch screen has negative coordinates).

Yes, precisely.

> And the touch events you're receiving are relative to the actual monitor they're being received from?

Right, IPointerPoint::Position is returning a position relative to the monitor, so it's always positive.
https://mxr.mozilla.org/mozilla-central/source/widget/windows/winrt/MetroInput.cpp#71

> Is there a field on the touch events that indicates which monitor they're coming from, and what the monitor coordinates are in the virtual screen space?

When debugging I saw a private property called _origin that had the virtual screen origin, but that's not accessible from anywhere I could see:
http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.input.pointerpoint.aspx

> What about mouse events, are they relative to the monitor they're being received from, or are they relative to the virtual screen space?

I saw both negative and positive numbers when debugging mouse events. I'll investigate further to see where they're coming from.
(In reply to Rodrigo Silveira [:rsilveira] from comment #12)
> > What about mouse events, are they relative to the monitor they're being received from, or are they relative to the virtual screen space?
>
> I saw both negative and positive numbers when debugging mouse events. I'll investigate further to see where they're coming from.

OK, now I've figured out the negative mouse positions. All events - touch and mouse - are using client coords (meaning they're always positive in MetroInput). The negative values were showing up after the transformation at [1]. That looks like a (separate) bug to me.

I'm convinced that the right fix is to make the hit test consider client coords instead of virtual ones. Not sure how, though. Ideas?

[1] http://dxr.mozilla.org/mozilla-central/source/gfx/layers/composite/APZCTreeManager.cpp#l661
(In reply to Rodrigo Silveira [:rsilveira] from comment #13)
> OK, now I've figured out the negative mouse positions. All events - touch and mouse - are using client coords (meaning they're always positive in MetroInput). The negative values were showing up after the transformation at [1]. That looks like a (separate) bug to me.
>
> I'm convinced that the right fix is to make the hit test consider client coords instead of virtual ones. Not sure how, though. Ideas?
>
> [1] http://dxr.mozilla.org/mozilla-central/source/gfx/layers/composite/APZCTreeManager.cpp#l661

Can you sum this up a bit for us? I'm still a little confused. My understanding: event coordinates are in the client (window) space, but there are values in FrameMetrics (specifically mCompositionBounds) that somehow end up with virtual desktop coordinates?
We should look at the coordinate space of the rects returned by the various nsIWidget methods related to bounds - http://mxr.mozilla.org/mozilla-central/source/widget/windows/winrt/MetroWidget.cpp#360. Maybe we are priming things with the wrong coordinate space.
Whiteboard: [block28]
(In reply to Jim Mathies [:jimm] from comment #14)
> Can you sum this up a bit for us? I'm still a little confused. My understanding: event coordinates are in the client (window) space, but there are values in FrameMetrics (specifically mCompositionBounds) that somehow end up with virtual desktop coordinates?

Yes, your understanding is correct. All inputs are being generated with screen coordinates. mCompositionBounds ends up with virtual desktop coordinates, causing the APZ hit test to fail. Clicks and taps still work fine; it's just touch scrolling that fails.

Now for the new developments: the virtual screen coordinates in mCompositionBounds are being introduced by the GetBounds method in MetroWidget.cpp that you linked above. Changing the rect x and y to 0 on the return from GetBounds fixes it. The other two Get*Bounds methods, GetScreenBounds and GetClientBounds, also return virtual screen coords, but keeping them as-is or forcing them into screen coords didn't seem to make any difference.
(In reply to Rodrigo Silveira [:rsilveira] from comment #16)
> Yes, your understanding is correct. All inputs are being generated with screen coordinates. mCompositionBounds ends up with virtual desktop coordinates, causing the APZ hit test to fail. Clicks and taps still work fine; it's just touch scrolling that fails.
>
> Now for the new developments: the virtual screen coordinates in mCompositionBounds are being introduced by the GetBounds method in MetroWidget.cpp that you linked above. Changing the rect x and y to 0 on the return from GetBounds fixes it. The other two Get*Bounds methods, GetScreenBounds and GetClientBounds, also return virtual screen coords, but keeping them as-is or forcing them into screen coords didn't seem to make any difference.

Sweet. MS doesn't actually tell us which coordinate space the bounds we retrieve are in -
http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.core.icorewindow.bounds.aspx
http://mxr.mozilla.org/mozilla-central/source/widget/windows/winrt/FrameworkView.cpp#290

Sounds like we need to pull these values from a win32 API instead; GetClientRect should work -
http://msdn.microsoft.com/en-us/library/windows/desktop/ms633503%28v=vs.85%29.aspx

The hwnd is kept over in MetroWidget.
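A minimal sketch of the suggested approach, assuming the widget keeps an HWND around (the helper name below is made up, not the actual MetroWidget code): GetClientRect always returns a rect whose origin is (0,0) in client space, so the monitor's virtual-screen offset never leaks into the widget bounds.

#include <windows.h>

// Hypothetical helper, not the actual MetroWidget change: fetch widget bounds
// from the win32 side instead of ICoreWindow::get_Bounds. The returned rect's
// left/top are always 0, so the result is independent of which monitor is the
// primary one in the virtual screen layout.
static bool GetClientBoundsFromHwnd(HWND aWnd, RECT* aOut) {
  if (!aWnd || !aOut) {
    return false;
  }
  return ::GetClientRect(aWnd, aOut) != FALSE;
}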
For the record, I think it might make sense to keep everything in virtual screen coordinates, because someday down the road we might have to deal with the case where the Firefox window is split across a couple of monitors. Even in that case I guess if we have one monitor as being the "main" monitor and use its top-left corner as the (0,0) point, we can get away with it, but in that case we'll still be having negative values for touch events and so on. It's gonna be a mess either way. :(
Attached patch Using GetClientRect (obsolete) — Splinter Review
Switched to using GetClientRect() instead. It doesn't work when I put the logic in SetDpi as mWidget is not yet set when it's first called. I'm setting mWindowBounds in SetWidget to be sure we have access to the HWND. It's working fine for just opening the browser and scrolling the start screen. Where should this go?
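For context, here is a rough sketch of the ordering constraint described above, using hypothetical member names (mDpi, mWidgetHwnd, mWindowBounds stand in for whatever the real MetroInput/MetroWidget members are): SetDpi can fire before any widget exists, so the bounds are computed once SetWidget hands over a valid HWND.

#include <windows.h>

// Hypothetical sketch of the lifecycle issue: early SetDpi calls arrive before
// the widget exists, so window bounds are (re)computed in SetWidget, the first
// point where the HWND is guaranteed to be valid.
class InputHandlerSketch {
public:
  void SetDpi(float aDpi) {
    mDpi = aDpi;
    if (mWidgetHwnd) {            // skip early calls with no widget yet
      UpdateWindowBounds();
    }
  }

  void SetWidget(HWND aHwnd) {
    mWidgetHwnd = aHwnd;
    UpdateWindowBounds();         // now the bounds can be computed safely
  }

private:
  void UpdateWindowBounds() {
    ::GetClientRect(mWidgetHwnd, &mWindowBounds);
  }

  float mDpi = 0.0f;
  HWND mWidgetHwnd = nullptr;
  RECT mWindowBounds = {0, 0, 0, 0};
};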
Attachment #824455 - Attachment is obsolete: true
Attachment #825589 - Flags: feedback?(jmathies)
(In reply to Rodrigo Silveira [:rsilveira] from comment #19)
> Created attachment 825589 [details] [diff] [review]
> Using GetClientRect
>
> Switched to using GetClientRect() instead. It doesn't work when I put the logic in SetDpi as mWidget is not yet set when it's first called. I'm setting mWindowBounds in SetWidget to be sure we have access to the HWND. It's working fine for just opening the browser and scrolling the start screen.
>
> Where should this go?

I think this is fine. We can receive early SetDpi calls before the widget exists. SetWidget gets called from MetroWidget::Create, so that's the first time we would need the bounds to be valid.
Attachment #825589 - Flags: feedback?(jmathies) → feedback+
(In reply to Kartikaya Gupta (email:kats@mozilla.com) from comment #18)
> For the record, I think it might make sense to keep everything in virtual screen coordinates, because someday down the road we might have to deal with the case where the Firefox window is split across a couple of monitors. Even in that case I guess if we have one monitor as being the "main" monitor and use its top-left corner as the (0,0) point, we can get away with it, but in that case we'll still be having negative values for touch events and so on. It's gonna be a mess either way. :(

Updating the Metro widget to support something like that wouldn't be too hard, although I wonder how well the platform would deal with it. For the time being though, as long as the Metro interface is kept on a single display, I think normalizing to client coords will make coding/debugging simpler for us over the short term. Thankfully we don't have to deal with child windows/popups.

We have a number of dual monitor / multiple windows bugs filed on this type of setup for desktop. It's a bit of a mess.
Attached patch Patch v1 (obsolete) — Splinter Review
Separated the code into a function and called it from where it seemed necessary.
Attachment #825589 - Attachment is obsolete: true
Attachment #826063 - Flags: review?(jmathies)
Attachment #826063 - Flags: review?(jmathies) → review+
Status: ASSIGNED → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
I wonder if it's related to high DPI or a conflict with some other change. I'm building on the Surface Pro to investigate.
Maybe the early call in SetDpi worked with mWindow->get_Bounds(&logicalBounds), but with the patch we had to defer until we had a widget. Although, since the widget is what needs this data and you called it in SetWidget, I'm not sure why things break. Some logging in the get-bounds calls, with the patch applied and a debugger attached early to pick them up, might shed some light on things. You can use our handy delayed-startup define in the app to get attached - http://mxr.mozilla.org/mozilla-central/source/browser/app/nsBrowserApp.cpp#34
Attached patch High DPI fixSplinter Review
As reported by Jimm via email: "Start tiles run down off the screen and the nav bar won't display. Also options panels won't come up when selecting from the charms menu."

I was able to reproduce this on the Surface. It affects high-DPI devices only and was happening because GetClientRect() returns the physical dimensions of the screen and I was calling LogToPhys() on it unnecessarily. I'm debugging some other issues with dual monitor on high DPI; once I'm clear on those I'll r?.
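A minimal sketch of the double-scaling mistake, assuming a logical-to-physical scale factor like the one the widget keeps (all names here are illustrative): per the comment above, GetClientRect already reports physical (device) pixels, so multiplying by the scale again inflates the bounds on high-DPI devices such as the Surface Pro.

#include <windows.h>

// Illustrative only: compute widget bounds in physical pixels.
struct PhysicalRect { int x, y, width, height; };

static PhysicalRect PhysicalClientBounds(HWND aWnd, float aLogToPhysScale) {
  RECT rc = {0, 0, 0, 0};
  ::GetClientRect(aWnd, &rc);        // already physical pixels; origin is (0,0)
  PhysicalRect bounds;
  bounds.x = rc.left;
  bounds.y = rc.top;
  bounds.width = static_cast<int>(rc.right - rc.left);
  bounds.height = static_cast<int>(rc.bottom - rc.top);
  // The bug this follow-up fixes, shown for contrast: scaling again here
  // (bounds.width *= aLogToPhysScale; bounds.height *= aLogToPhysScale;)
  // double-scales on high-DPI screens and pushes content off screen.
  (void)aLogToPhysScale;
  return bounds;
}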
Attachment #826063 - Attachment is obsolete: true
Comment on attachment 827066 [details] [diff] [review]
High DPI fix

This fixes the high-DPI issue reported by Jimm. There are still issues with dual monitor that are being tracked by bug 934762.
Attachment #827066 - Flags: review?(jmathies)
Comment on attachment 827066 [details] [diff] [review]
High DPI fix

Looks good!
Attachment #827066 - Flags: review?(jmathies) → review+
Status: REOPENED → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
Target Milestone: --- → Firefox 28