Closed Bug 412486 Opened 16 years ago Closed 16 years ago

widget/event-detection support for multi-touch trackpad gestures

Categories

(Core :: Widget: Cocoa, enhancement)

Version: 1.9.1 Branch
Hardware: All
OS: macOS
Type: enhancement
Priority: Not set
Severity: normal

Tracking

Status: VERIFIED FIXED
Target Milestone: mozilla1.9.1b1

People

(Reporter: gkw, Assigned: tom.dyas)

References

(Depends on 2 open bugs, Blocks 1 open bug)

Details

(Whiteboard: [app behaviors should be filed against those products and depend on this bug])

Attachments

(2 files, 17 obsolete files)

1.96 KB, text/plain (Details)
39.43 KB, patch; smaug: review+, jaas: review+ (Details | Diff | Splinter Review)
It would be nice for Firefox to have some form of gesture support for the multi-touch trackpad when the MacBook Air ships.

For example, in Safari a three-finger swipe will go Back and Forward.
Assignee: nobody → joshmoz
Component: General → Widget: Cocoa
Product: Firefox → Core
QA Contact: general → cocoa
The MacBook Air is shipping and it's rumored that the refresh of the MacBook Pro will have the improved multi-touch trackpad as well. This seems like something we'd want supported in Firefox 3. Is it simply too late for that?
Flags: blocking1.9?
Too late; definitely wouldn't block on it in any case.
Flags: blocking1.9? → blocking1.9-
Now the new MacBook Pros have the multi-touch touchpad.

Changing summary to reflect this by being more generic.
Summary: multi-touch pad support on MacBook Air → multi-touch pad support
This is requesting mapping specific input events to high-level browser actions, akin to assigning command-key shortcuts to menus. Given that, doesn't this belong in Firefox, not Core, since it would be up to each front-end to decide what actions to map gestures to?
In order for front-ends to decide what they want to map to we need widget support. Let's leave this bug Widget:Cocoa for that support and we can make other bugs for app-specific usage.
Bug 426739 implemented this for Camino.
I have read several Hendrix comments such as "Firefox on the Mac should really have support for the new multi-touch pads present in the MacBook Pro and MacBook Air. For someone used to Safari, it makes it more difficult to use Firefox with these features missing." It would be good to get this in one of the dot releases.
 
Flags: wanted1.9.0.x?
We haven't even decided how we should make use of multitouch support in Firefox. This bug is about supporting it in our platform.
Flags: wanted1.9.0.x?
Flags: wanted1.9.0.x-
Flags: wanted-next+
The following patch is a hack but it does work.  It injects Cmd-Left when the user swipes left and Cmd-Right when the user swipes right.  Is there a better way to do this?
Ignore last patch.  I saw that there is a better patch in Camino.  Why no support in 3.1a1?
There are two separate issues:
- Creating an event that XUL front-ends can detect and handle as they see fit--which means it really needs to be a new event, not an existing event, otherwise individual apps lose all control over the behavior.
- Mapping that to something in Firefox.

Given that, I'd suggest that someone file a new bug blocked by this for Firefox as Josh suggested in comment 8, and that bugs requesting *Firefox* behavior be duped there.
Tweaking the summary and whiteboard to help clear up the confusion between backend and frontend changes.
Summary: multi-touch pad support → widget/event-detection support for multi-touch trackpad gestures
Whiteboard: [app behaviors should be filed against those products and depend on this bug]
I am new to the Mozilla code base so can someone confirm whether the following general outline seems viable:  The existing patch to nsChildView.mm should be modified to detect a swipe and then send a generic (to be defined) "gesture event."  My question is:  Where should this gesture event be defined?  I was thinking somewhere in dom/public/idl/events (like a new nsIDOMGestureEvent.idl).  However, this would extend the DOM.  Is that allowed?  I would then modify a file under browser/base/content to respond correctly to any "gesture event" raised by the widget code.
Adding a new dom event type is ok, but just make sure that you can't reuse any
existing event type. (In this case I think adding a gesture event is perhaps the
right solution)

Doug, would it make sense to have gesture event also in mobile environments?
What kind of attributes would be needed/useful there?
I'd like comments on the attached "take2" patch.  How do I wire up "gesture" events so that Javascript code in browser.js can see them?
Note:  The gesture event itself is not the final proposal.  It just contains the elements I need to learn how to wire up the event.  Comments on what it should contain for final patch are needed.
Attachment #333193 - Attachment mime type: application/octet-stream → text/plain
Comment on attachment 333193 [details]
Separate platform-specific from platform-independent code

You need to add the implementation of the new dom event (nsDOMGestureEvent.[h|cpp]) to content/events/src .
Also, to make it possible to create those events, nsEventDispatcher::CreateEvent must be modified
Attached file Swipe Support Take #3 (obsolete) —
Here is my third attempt.  The gesture event does not seem to get dispatched to the javascript side of things.  Is there any documentation for the event dispatching?
Attachment #332495 - Attachment is obsolete: true
Attachment #333193 - Attachment is obsolete: true
Comment on attachment 333659 [details]
Swipe Support Take #3

Looks pretty OK, a few comments though.
When generating patches, please use at least 8 lines of context (-U8) and -p.

>@@ -1364,6 +1378,8 @@
>     return sEventNames[eDOMEvents_cut];
>   case NS_PASTE:
>     return sEventNames[eDOMEvents_paste];
>+  case NS_GESTURE_EVENT:
>+    return sEventNames[eDOMEvents_gesture];
You should also modify ::SetEventType, not only ::GetEventType.

>--- ./content/events/src/nsEventDispatcher.cpp.orig	2008-08-11 18:30:52.000000000 -0400
>+++ ./content/events/src/nsEventDispatcher.cpp	2008-08-11 18:44:23.000000000 -0400
>@@ -602,6 +602,10 @@
>     case NS_COMMAND_EVENT:
>       return NS_NewDOMCommandEvent(aDOMEvent, aPresContext,
>                                    static_cast<nsCommandEvent*>(aEvent));
>+
>+    case NS_GESTURE_EVENT:
>+      return NS_NewDOMGestureEvent(aDOMEvent, aPresContext,
>+                                   static_cast<nsGestureEvent*>(aEvent));
>     }
You should also add a way to create events using document.createEvent("GestureEvent");
so scroll down a bit in ::CreateEvent to see how it is done for other event types.
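For example, once that's wired up, script should be able to do roughly this (a sketch based on the interface quoted below; the target and type string are just illustrative):

  var evt = document.createEvent("GestureEvent");
  // typeArg, canBubble, cancelable, swipeLeft, swipeRight, swipeUp, swipeDown
  evt.initGestureEvent("gesture", true, true, false, true, false, false);
  document.dispatchEvent(evt);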

>+[scriptable, uuid(20AB8E74-BF9B-4D1D-8A18-B5EE0F3C0A36)]
>+interface nsIDOMGestureEvent : nsIDOMUIEvent
>+{
>+  readonly attribute boolean swipeLeft;
>+  readonly attribute boolean swipeRight;
>+  readonly attribute boolean swipeUp;
>+  readonly attribute boolean swipeDown;
>+
>+  void initGestureEvent(in DOMString typeArg,
>+                        in boolean canBubbleArg,
>+                        in boolean cancelableArg,
>+                        in boolean swipeLeftArg,
>+                        in boolean swipeRightArg,
>+                        in boolean swipeUpArg,
>+                        in boolean swipeDownArg);
If you really want to extend nsIDOMUIEvent, then the initGestureEvent method should take
all the same parameters as initUIEvent, plus your new parameters.

>+++ ./dom/public/nsDOMClassInfoID.h	2008-08-10 23:23:43.000000000 -0400
>@@ -85,6 +85,7 @@
>   eDOMClassInfo_MouseEvent_id,
>   eDOMClassInfo_KeyboardEvent_id,
>   eDOMClassInfo_PopupBlockedEvent_id,
>+  eDOMClassInfo_GestureEvent_id,
It is recommended to add to the end of this list.

>+++ ./widget/public/nsGUIEvent.h	2008-08-10 23:56:46.000000000 -0400
>@@ -108,6 +108,7 @@
> #ifdef MOZ_MEDIA
> #define NS_MEDIA_EVENT                    34
> #endif // MOZ_MEDIA
>+#define NS_GESTURE_EVENT                  35
You're adding a new event struct type, but not actually a new event type, which is why nsDOMEvent::GetEventType 
fails. (I think - I'm just quickly browsing through this patch)
Assigning this to Tom since he's generating patches....
Assignee: joshmoz → tdyas
Attached file Swipe Support Take #4 (obsolete) —
Here is take #4.  I still am trying to figure out how to get the events sent to JavaScript.  Where should I be looking?
Attachment #333659 - Attachment is obsolete: true
Comment on attachment 333807 [details]
Swipe Support Take #4

>@@ -1417,16 +1439,18 @@ const char* nsDOMEvent::GetEventName(PRU
...
>+  case NS_GESTURE_EVENT:
>+    return sEventNames[eDOMEvents_gesture];
>   default:
This is wrong. There isn't any event whose "message" is NS_GESTURE_EVENT.

>+++ ./dom/public/coreEvents/nsIDOMGestureListener.h	2008-08-13 23:19:47.000000000 -0400
Don't add this. Pretty much all the interfaces in this directory are obsolete:
https://wiki.mozilla.org/Gecko:Obsolete_API

>+++ ./dom/public/idl/events/nsIDOMGestureEvent.idl	2008-08-13 21:45:43.000000000 -0400
...
>+[scriptable, uuid(20AB8E74-BF9B-4D1D-8A18-B5EE0F3C0A36)]
>+interface nsIDOMGestureEvent : nsIDOMUIEvent
>+{
>+  readonly attribute boolean swipeLeft;
>+  readonly attribute boolean swipeRight;
>+  readonly attribute boolean swipeUp;
>+  readonly attribute boolean swipeDown;
Btw, is there a reason why these are booleans and not longs or floats? OS X dispatches those
values as floats (or maybe longs, I don't have a Mac to test), right?

>+class nsGestureEvent : public nsGUIEvent
>+{
>+public:
>+  nsGestureEvent(PRBool isTrusted, nsIWidget* w,
>+                 PRBool swipeLeftArg, PRBool swipeRightArg,
>+                 PRBool swipeUpArg, PRBool swipeDownArg)
>+    : nsGUIEvent(isTrusted, NS_GESTURE, w, NS_GESTURE_EVENT),
>+      swipeLeft(swipeLeftArg), swipeRight(swipeRightArg),
>+      swipeUp(swipeUpArg), swipeDown(swipeDownArg)
>+  {
>+  }
>+
>+  PRBool swipeLeft;
>+  PRBool swipeRight;
>+  PRBool swipeUp;
>+  PRBool swipeDown;
Same thing here. Why booleans?
1.  Ah got it.  The "NS_GESTURE" message I added should be referred to there instead of NS_GESTURE_EVENT.

2.  I'll remove nsIDOMGestureListener.  I was trying to see if an interface similar to the other listeners was necessary to dispatch events to the JavaScript side.  I guess not. 

3.  nsIDOMGestureEvent.idl:  Yes, OS X returns floats called "deltaX" and "deltaY."  The bools are a temporary hack while I wire up the general notion of a gesture event.  Once I have learned how to do that, I plan on coming up with a better representation for gestures in general so that the "pinch" and "rotate" gestures are represented.

4.  nsGestureEvent:  Same.

I ran firefox 3.1a1 with this patch under gdb with "./firefox --debug".  After setting a breakpoint in nsChildView.mm in the swipeWithEvent method, I stepped ahead and seem to end up in nsView and nsViewManager.  The event does not seem to be getting from there to the JavaScript side.  Am I hooking the code in browser.js up to the wrong JavaScript objects?
The event should go from view/viewmanager to presshell and then to nsEventStateManager and nsEventDispatcher. nsEventDispatcher is the one which
actually dispatches the event to DOM. If you add a listener to the topmost 
|window| and use capturing listener, it should catch the event.
Do you set the right widget for the event?
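For example, roughly this in chrome script (the event name here is just a placeholder for whatever the widget code ends up dispatching):

  window.addEventListener("gesture", function (evt) {
    dump("gesture event reached the DOM, target: " + evt.target + "\n");
  }, true);  // true = capturing-phase listener on the topmost window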
Success!  The key was to update the table in nsContentUtils::InitializeEventTable and also to call the helper method in nsChildView.mm that fills in the "widget" member of nsGestureEvent.  (I also modified the patch so nsGestureEvent derives from nsInputEvent like the mouse and key event classes.)  I can now swipe backward and forward to my heart's content.

The next step is to determine what the gesture events should look like as opposed to the current hack that only handles three-fingered swipes (and not pinch or rotate gestures).  Who are the stakeholders for that discussion?
Attachment #333807 - Attachment is obsolete: true
Well, you could remove the extra printfs and #if 0 stuff from the patch and
split it into two parts.

First part could be the actual event dispatch. I don't know what kind of
information you can get from OSX's NSEvent, but probably that all should
go to the geckoevent/DOMEvent. So as I said before, change those booleans
in the interface to floats or something. If the same attributes in the 
interface could be used for other gesture event types (rotation and such) too,
then great (no need to add yet some more attributes) - but that depends on the 
information you can get from NSEvent.
This all has dependencies on OS X's event handling (where Josh could perhaps
help) and on Gecko's event handling, where I can help if any help is still needed.

And then the second part would be adding something to the UI to handle the 
events. (/browser/base/content/browser.js part of the current patch)
For that you probably want comments from Firefox UI developers (mconnor etc)
I'll definitely clean up the patch along the lines you suggested.

Regarding the event dispatch, OS X does provide float values with the swipe event called "deltaX" and "deltaY," but these values are only ever -1.0, 0.0, or 1.0.  In the interest of platform independence, I think it makes sense to just have a "direction" value that will be one of up, down, left, or right plus the 4 diagonals (in case some system ever supports them).  Including the floats in the DOMEvent does not add anything useful, IMHO.

I was also thinking about the naming of the events.  Since these events are generated by trackpad gestures and not by a true multi-touch display, I think it makes sense to future proof the names so they are differentiated from any multi-touch support mozilla may gain in the future.  To that end, I am going to name the swipe gesture event, onTrackpadSwipeGesture.  The other two events will be named in a similar manner (onTrackpadMagnifyGesture and onTrackpadRotateGesture).  (If you can think of better names, let me know!)

I still need to play with the magnify and rotate gestures because it may or may not make sense to have these events be similar to the mousedown and mouseup events by having a start and stop event to allow for some level of interactive zooming and/or rotation.

Comments?
Attachment #334143 - Attachment is patch: true
Here is my proposed patch for the platform independent side of the trackpad gestures support.  I added additional attributes to the trackpad gesture event for the modifier keys that may have been pressed when the gesture occurred.  I also added support for the magnify (pinch) and rotate gestures.

The patch coalesces the multiple zoom and rotation events that are generated during a gesture into a single event to the DOM.  Does it make sense to generate a "start" event, "update" event, and "end" event for such events so the application can provide some feedback during a gesture?  (Similar in spirit to "mousedown" and "mouseup".)
Attachment #334143 - Attachment is obsolete: true
Attachment #334622 - Attachment is patch: true
Comment on attachment 334622 [details] [diff] [review]
Trackpad Gestures Support - Take #1

>+++ ./dom/public/idl/events/nsIDOMTrackpadGestureEvent.idl	2008-08-19 15:37:42.000000000 -0400
>@@ -0,0 +1,88 @@
>+/* -*- Mode: IDL; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
>+/* ***** BEGIN LICENSE BLOCK *****
>+ * Version: MPL 1.1/GPL 2.0/LGPL 2.1
>+ *
>+ * The contents of this file are subject to the Mozilla Public License Version
>+ * 1.1 (the "License"); you may not use this file except in compliance with
>+ * the License. You may obtain a copy of the License at
>+ * http://www.mozilla.org/MPL/
>+ *
>+ * Software distributed under the License is distributed on an "AS IS" basis,
>+ * WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License
>+ * for the specific language governing rights and limitations under the
>+ * License.
>+ *
>+ * The Initial Developer is:
>+ *   Thomas K. Dyas <tdyas@zecador.org>
You're missing some parts from the license (also in other files).
see http://www.mozilla.org/MPL/boilerplate-1.1/mpl-tri-license-c

>+interface nsIDOMTrackpadGestureEvent : nsIDOMUIEvent
Why not just nsIDOMGestureEvent? Some other devices may generate similar events using something
other than a trackpad. You could remove "trackpad" elsewhere as well.

>+{
>+  /* Directions. Swipes may be any of these.  Rotation will be either
>+   * LEFT or RIGHT (i.e., counter-clockwise and clockwise).
>+   */
>+  const unsigned long DOM_TG_DIRECTION_UP = 1;
>+  const unsigned long DOM_TG_DIRECTION_DOWN = 2;
>+  const unsigned long DOM_TG_DIRECTION_LEFT = 3;
>+  const unsigned long DOM_TG_DIRECTION_RIGHT = 4;
>+  const unsigned long DOM_TG_DIRECTION_UP_AND_LEFT = 5;
>+  const unsigned long DOM_TG_DIRECTION_UP_AND_RIGHT = 6;
>+  const unsigned long DOM_TG_DIRECTION_DOWN_AND_LEFT = 7;
>+  const unsigned long DOM_TG_DIRECTION_DOWN_AND_RIGHT = 8;
Maybe you could use bits as values and then UP_AND_LEFT, UP_AND_RIGHT etc. could be OR'ed.
  const unsigned long DOM_TG_DIRECTION_UP = 1;
  const unsigned long DOM_TG_DIRECTION_DOWN = 2;
  const unsigned long DOM_TG_DIRECTION_LEFT = 4;
  const unsigned long DOM_TG_DIRECTION_RIGHT = 8;
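Then a diagonal is just a combination, e.g. (sketch; constants written bare for brevity):

  var upAndLeft = DOM_TG_DIRECTION_UP | DOM_TG_DIRECTION_LEFT;  // == 5
  // and a handler can test individual components:
  if (evt.direction & DOM_TG_DIRECTION_LEFT) { /* ... */ }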



>+- (void)endGestureWithEvent:(NSEvent *)anEvent
...
>+  if (mTrackpadGestureIsRotate) {
>+    nsAutoRetainCocoaObject kungFuDeathGrip(self);
...
>+  }
>+
>+  mTrackpadGestureIsMagnify = NO;
>+  mTrackpadGestureIsRotate = NO;
>+  mTrackpadGestureDeltaZ = 0.0;
>+  mTrackpadGestureRotation = 0.0;
The object is possibly dead here and you're setting its member variables...
Move kungFuDeathGrip out of the |if|.

I think having some kind of start/update/end events might be useful.
So maybe a simple startGesture, then the actual gesture events (swipe/rotate/etc.), and finally endGesture?
I included "trackpad" in the event names because I was concerned that these new events might interfere with any names needed to support true multi-touch interfaces.  I don't want to lock in any specific notion of a general gesture at this early stage.  On the other hand, true multi-touch interfaces will probably allow "magnify" and "rotate" gestures similar to the gestures supported by Mac OS X so those events may end up being the same.  I'll need to think more about this.

Also, I need to check whether the trackpad gestures include information on the mouse's location.  If so, it may be useful to include that information in the event.

As for start/update/end events, I will include events along those lines for magnify and rotate (Mac OS X does not send start/end for swipes).  I will also keep the current patch's roll-up of such events into a single event (similar to converting mousedown and mouseup into a click event) for applications that don't need to interactively do a zoom or rotate.
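As a sketch of what the interactive case could look like from script (using the event and attribute names the later patches settle on; zoomBy is a hypothetical helper):

  window.addEventListener("MozMagnifyGestureStart",
                          function (evt) { zoomBy(evt.delta); }, true);
  window.addEventListener("MozMagnifyGestureUpdate",
                          function (evt) { zoomBy(evt.delta); }, true);
  // applications that don't need live feedback can listen only for the
  // roll-up event, which carries the cumulative delta:
  window.addEventListener("MozMagnifyGesture",
                          function (evt) { zoomBy(evt.delta); }, true);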
If you are concerned about the interface name, perhaps it could be
nsIDOMSimpleGestureEvent. That at least doesn't have a name which binds it to
some specific type of device.

Perhaps you could also rename the events to something like DOMRotate, 
DOMGesture, DOMMagnify. That would be more in line with DOMMouseScroll, DOMMousePixelScroll, DOMContentLoaded and other non-standard DOM events in Gecko. (I know, it is a bit of a bad habit to have the DOM- prefix, but that is how
things have been for a long time.)
The simple gesture, magnify and rotate names could be reserved for some 
specification.

If and when there is a need for multi-touch and also multimodal events
(these are possibly pretty similar events, multimodal is just more generic),
new kinds of interfaces will be needed. And handling those events may
in fact bring in a multimodal integrator or some such, which integrates event
streams - otherwise handling all the events manually may become too difficult.
But that is all in the future; having support even for simpler gestures would be great.
I revised the patch along the lines you suggested.  I also renamed the "magnitude" attribute to "delta" since magnitudes are usually positive and the attribute in question can be positive or negative.

The patch should be ready for inclusion in the tree, unless somebody believes wider distribution for comments is warranted first.
Attachment #334622 - Attachment is obsolete: true
(In reply to comment #37)
> The patch should be ready for inclusion in the tree, unless somebody believes
> wider distribution for comments is warranted first.
You need to ask and get reviews first. And write tests (mochitests).
Who should I contact to obtain those reviews?

I see the dom/tests/mochitest directory, but didn't see any tests of mouse or key events after doing a "find . -type f | xargs grep addEventListener".  Can you point me to where those tests would be?
I can review, but someone else should superreview (maybe roc).
Add the review?<reviewer> flag to the attachment you want to be reviewed.

I think you need to add some method to nsIDOMWindowUtils, so
that you can dispatch the right kinds of events from JavaScript;
that is needed in mochitests.
You can add the new tests to content/events/test

Tom, you can look at the patch for bug 378028 for an example of DOM event mochitests and the required methods in nsIDOMWindowUtils.

See http://developer.mozilla.org/en/Getting_your_patch_in_the_tree for the review process.
When looking for a reviewer, it can be helpful to look at the changelog of some of the files your patch affects, e.g. http://hg.mozilla.org/mozilla-central/index.cgi/log/384e2211e3b1/content/events/src/nsDOMEvent.cpp - look for r=* and sr=*.

Patches should be marked as "patch" in the attachment's details; it gives you the yellow background and, more importantly, the "Diff" link in the attachments table.
Attachment #335312 - Attachment is patch: true
Here is the next iteration of the patch. I added a basic mochitest.  A few issues before I submit for review:

1. The gestures interface on Mac OS X is currently a private interface.  (The header files don't define the methods called by Cocoa.)  Apple could conceivably change the interface in the future and the code in widget/src/cocoa would no longer function.  Should this patch wait until Apple finalizes the interface (whenever that happens, if ever)?  (AdiumX, a GAIM-based IM client, already uses this interface in version 1.3.)  It might depend on another Xcode release.

2.  Because the interface is currently private, there is no way for me to instantiate an NSEvent for the gesture events in order to inject those events into the code in nsChildView.mm.  Thus, the mochitest ended up being pretty basic.  See the funline.xul attachment for a separate test of the magnify and rotate events and dynamic updating.

3.  Finally, the units used for the magnify gesture are unspecified.  Further digging will be needed to determine what those units are.
Attachment #335312 - Attachment is obsolete: true
Attachment #335312 - Attachment description: Simple Gestures Support - Platform Independent Portion Take #2 → Simple Gestures Support Take #2
Attachment #334622 - Attachment description: Trackpad Gestures Support - Platform Independent Portion → Trackpad Gestures Support - Take #1
Attachment #335312 - Attachment description: Simple Gestures Support Take #2 → Simple Gestures Support - Take #2
(In reply to comment #42)
> 1. The gestures interface on Mac OS X is currently a private interface.  (The
> header files don't define the methods called by Cocoa.)  Apple could
> conceivably change the interface in the future and the code in widget/src/cocoa
> would no longer function.  Should this patch wait until Apple finalizes the
> interface (whenever that happens, if ever)?  (AdiumX, a GAIM-based IM client,
> already uses this interface in version 1.3.)  It might depend on another Xcode
> release.
Some mac developer should answer to this.

> 2.  Because the interface is currently private, there is no way for me to
> instantiate an NSEvent for the gesture events in order to inject those events
> into the code in nsChildView.mm.  Thus, the mochitest ended up being pretty
> basic.  See the funline.xul attachment for a separate test of the magnify and
> rotate events and dynamic updating.
But you can create gecko events (which nsDOMWindowUtils::SendSimpleGestureEvent does do), so that should be enough.
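For example, a test could end up looking roughly like this (the getInterface dance is the usual way to reach nsIDOMWindowUtils from privileged code; the gesture-sending method and its exact argument list are whatever the patch defines, so the last line is only a sketch):

  var utils = window.QueryInterface(Components.interfaces.nsIInterfaceRequestor)
                    .getInterface(Components.interfaces.nsIDOMWindowUtils);
  window.addEventListener("MozMagnifyGesture", function (evt) {
    is(evt.delta, 30.0, "delta should round-trip through the gecko event");
  }, true);
  // argument order is illustrative only -- see the patch for the real signature
  utils.sendSimpleGestureEvent("MozMagnifyGesture", 10, 10, 0, 30.0, 0);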
 
> 3.  Finally, the units used for the magnify gesture are unspecified.  Further
> digging will be needed to determine what those units are.
Very unfortunate if Apple hasn't documented things properly :/
(In reply to comment #44)
> Some mac developer should answer to this.

The most likely problem to result here would be that gestures would simply stop working in a future major update; it's very unlikely that the methods would change in a way that would make the current implementation dangerous (especially since Apple is undoubtedly aware that it's been fairly widely reverse-engineered and used). So the question is primarily whether a short-term regression in a future OS update would be acceptable from a product standpoint.

> Very unfortunate if Apple hasn't documented things properly :/

Erm... why would Apple have documented a private API that they didn't want anyone using yet and was reverse-engineered by an outside developer?


Tom, have you done any tests to figure out the event propagation model for these events? Will the presence of these methods in ChildView prevent enclosing responders (like the window or window controller) from having those methods called? If so, I'm curious how an embedder would get and handle these events; is there a straightforward way to get Gecko-level events at the embedding layer?
(In reply to comment #45)
> Tom, have you done any tests to figure out the event propagation model for
> these events? Will the presence of these methods in ChildView prevent enclosing
> responders (like the window or window controller) from having those method
> called? If so, I'm curious how an embedder would get and handle these events;
> is there a straightforward way to get Gecko-level events at the embedding
> layer?
Couldn't the embedder use nsIDocShell::chromeEventHandler? If it is set, events
from the |window| propagate to it. Or if not that, nsIDOMWindow2::windowRoot?
Rebased the patch so it builds with the 3.1a2 build candidate.
Attachment #336806 - Attachment is obsolete: true
In case anyone wants to have gestures support in Firefox.
(In reply to comment #45)
> The most likely problem to result here would be that gestures would simply stop
> working in a future major update; it's very unlikely that the methods would
> change in a way that would make the current implementation dangerous
> (especially since Apple is undoubtedly aware that it's been fairly widely
> reverse-engineered and used). So the question is primarily whether a short-term
> regression in a future OS update would be acceptable from a product standpoint.

Someone may need to make a call on this point then.  Looking at an overview of the iPhone SDK, there is a "Cocoa Touch" layer supporting the iPhone's touchscreen gestures.  It is unclear whether Apple intends to eventually adapt that SDK to regular Macs (which it could do if it ever makes a touchscreen MacBook) and whether it would supplement or replace support for trackpad gestures.

> Tom, have you done any tests to figure out the event propagation model for
> these events? Will the presence of these methods in ChildView prevent enclosing
> responders (like the window or window controller) from having those method
> called? If so, I'm curious how an embedder would get and handle these events;
> is there a straightforward way to get Gecko-level events at the embedding
> layer?

I haven't.  While I've done some Linux kernel hacking in the past, this is my first foray into the Mozilla codebase and Mac programming as well.  I'll need to get a better understanding about how the various parts interact (especially at the embedding level).

Side note:  I noticed that the code in nsChildView.mm creates EventRecords for plugins and the like.  I created some dummy EventRecords (using "nullEvent"), which could probably come out.  Is there anything that needs to be done for plugins?
(In reply to comment #45)
> Tom, have you done any tests to figure out the event propagation model for
> these events? Will the presence of these methods in ChildView prevent enclosing
> responders (like the window or window controller) from having those method
> called? If so, I'm curious how an embedder would get and handle these events;
> is there a straightforward way to get Gecko-level events at the embedding
> layer?

The gesture methods have the same prototype as the prototype used by the other NSResponder methods (return type: void and one NSEvent parameter).  My educated guess is that the default implementation for each gesture method just passes the gesture on to the next responder in the responder chain (like the other NSResponder methods).  The override of these methods in nsChildView.mm would consequently prevent the embedder from seeing the gestures in Cocoa (at least when the mouse is over the embedded browser instance). I haven't tested this belief, however.
The patch now applies cleanly to the actual 3.1a2 source tree. I removed the creation of EventRecords for plugins from nsChildView.mm since I didn't find any Carbon equivalent for the trackpad gestures.  I'll submit this patch for review once I figure out who should review it.
Attachment #336861 - Attachment is obsolete: true
Patch to allow Firefox to support gestures.  Swipes handled in a similar manner to Safari.
Attachment #336862 - Attachment is obsolete: true
Attachment #337392 - Flags: review?(stuart.morgan+bugzilla)
Attachment #337392 - Flags: review?(Olli.Pettay)
Comment on attachment 337392 [details] [diff] [review]
Simple Gestures Support - Take #5

Josh should do this review, since he knows a whole lot more about the event system (and since I don't even have a 1.9.1 tree).

As a drive-by though, you need
if (!mGeckoChild)
  return;
in every one of your methods that sends to mGeckoChild.
Attachment #337392 - Flags: review?(stuart.morgan+bugzilla) → review?(joshmoz)
Comment on attachment 337392 [details] [diff] [review]
Simple Gestures Support - Take #5

In nsChildView.mm there should be two newlines between methods.

+- (void)beginGestureWithEvent:(NSEvent *)anEvent
+{
+  NS_OBJC_BEGIN_TRY_ABORT_BLOCK;
+
+  mGestureState = eGestureState_StartGesture;
+  mLastDeltaZ = 0;
+  mCumulativeRotation = 0.0;
+
+  NS_OBJC_END_TRY_ABORT_BLOCK;
+}

No need for the NS_OBJC... exception handler macros here because you're not calling into any obj-c code.

+- (void)swipeWithEvent:(NSEvent *)anEvent

Might want to null check anEvent here and in other methods, especially since they are private.

+  (void) mGeckoChild->DispatchWindowEvent(geckoEvent);

Get rid of the "(void) " here, not necessary. Happens a few times.

-
 @interface ChildView(Private)

Don't kill that newline.
Attachment #337392 - Flags: review?(joshmoz)
Attachment #337392 - Flags: review?(Olli.Pettay)
Modified nsChildView.mm as requested.
Attachment #337392 - Attachment is obsolete: true
Attachment #337913 - Flags: review?(joshmoz)
Attachment #337913 - Flags: review?(Olli.Pettay)
Comment on attachment 337913 [details] [diff] [review]
Simple Gestures Support - Take #6

+  // Simple gestures support
+  enum {
+    eGestureState_None,
+    eGestureState_StartGesture,
+    eGestureState_MagnifyGesture,
+    eGestureState_RotateGesture
+  } mGestureState;

Please note in comments that these are reverse-engineered constants for private Cocoa API functionality. Same goes for the method declarations. This should be noted by the declarations and the implementations of the methods.

+  float mLastDeltaZ;
+  float mCumulativeRotation;

Best to explicitly initialize these (in "initWithFrame:geckoChild:") rather than assume they'll first be referenced from "beginGestureWithEvent:".

+ * Found prototypes for gesture events from:

"Got ... from" or "Found ... at", grammatical

Fix this stuff and then r+, thanks
Attachment #337913 - Flags: review?(Olli.Pettay) → review-
Comment on attachment 337913 [details] [diff] [review]
Simple Gestures Support - Take #6

>+    // Simple gesture events
>+   ,{ &nsGkAtoms::onDOMSwipeGesture,             { NS_SIMPLE_GESTURE_SWIPE, EventNameType_None } },
>+    { &nsGkAtoms::onDOMMagnifyGestureStart,      { NS_SIMPLE_GESTURE_MAGNIFY_START, EventNameType_None } },
>+    { &nsGkAtoms::onDOMMagnifyGestureUpdate,     { NS_SIMPLE_GESTURE_MAGNIFY_UPDATE, EventNameType_None } },
>+    { &nsGkAtoms::onDOMMagnifyGesture,           { NS_SIMPLE_GESTURE_MAGNIFY, EventNameType_None } },
>+    { &nsGkAtoms::onDOMRotateGestureStart,       { NS_SIMPLE_GESTURE_ROTATE_START, EventNameType_None } },
>+    { &nsGkAtoms::onDOMRotateGestureUpdate,      { NS_SIMPLE_GESTURE_ROTATE_UPDATE, EventNameType_None } },
>+    { &nsGkAtoms::onDOMRotateGesture,            { NS_SIMPLE_GESTURE_ROTATE, EventNameType_None } }
Sorry! This is a very recent change, but we want to have the Moz prefix on Mozilla-specific event names, not DOM anymore. We won't change the existing events, but new ones should use Moz.
So onMozSwipeGesture etc.

>+  ,
>+  "DOMSwipeGesture",
>+  "DOMMagnifyGestureStart",
>+  "DOMMagnifyGestureUpdate",
>+  "DOMMagnifyGesture",
>+  "DOMRotateGestureStart",
>+  "DOMRotateGestureUpdate",
>+  "DOMRotateGesture"
Same here s/DOM/Moz/ and also elsewhere.

>+NS_IMETHODIMP
>+nsDOMSimpleGestureEvent::GetAltKey(PRBool* aIsDown)
>+{
>+  NS_ENSURE_ARG_POINTER(aIsDown);
>+  *aIsDown = ((nsInputEvent*)mEvent)->isAlt;
Use static_cast<> here and elsewhere.

>+/**
>+ * The nsIDOMSimpleGestureEvent interface is the datatype for all
>+ * Mozila-specific simple gesture events in the Document Object Model.
Mozilla

>+ *
>+ * The following events are generated:
>+ *
>+ * DOMSwipeGesture - Generated when the user swipes their fingers
>+ * across the input device.
>+ *
>+ * DOMMagnifyGestureStart - Generated when the user begins the magnify
>+ * ("pinch") gesture.  The "delta" value represents the initial
>+ * movement.
>+ *
>+ * DOMMagnifyGestureUpdate - Generated periodically while the user is
>+ * continuing the magnify ("pinch") gesture.  The "delta" value
>+ * represents the movement since the last DOMMagnifyGestureStart or
>+ * DOMMagnifyGestureUpdate event.
>+ *
>+ * DOMMagnifyGesture - Generated when the user has completed the
>+ * magnify ("pinch") gesture.  The "delta" value is the cumulative
>+ * amount represented by the user's gesture.
>+
>+ * DOMRotateGestureStart - Generated when the user begins the rotation
>+ * gesture.  The "delta" value represents the initial rotation.
>+ *
>+ * DOMRotateGestureUpdate - Generated periodically while the user is
>+ * continuing the rotation gesture.  The "delta" value represents the
>+ * rotation since the last DOMRotateGestureStart or
>+ * DOMRotateGestureUpdate event.
>+ *
>+ * DOMRotateGesture - Generated when the user has completed the
>+ * rotation gesture.  The "delta" value is the cumulative amount of
>+ * rotation represented by the user's gesture.
>+ */
>+
>+[scriptable, uuid(20AB8E74-BF9B-4D1D-8A18-B5EE0F3C0A36)]
>+interface nsIDOMSimpleGestureEvent : nsIDOMUIEvent
Note, no need to change the name of this interface. Just the event names.

>+{
>+  /* Direction constants */
>+  const unsigned long DOM_SG_DIRECTION_UP = 1;
>+  const unsigned long DOM_SG_DIRECTION_DOWN = 2;
>+  const unsigned long DOM_SG_DIRECTION_LEFT = 4;
>+  const unsigned long DOM_SG_DIRECTION_RIGHT = 8;
Do you actually need DOM_SG_ prefix?

>+   * XXX - The units for measuring magnification are currently
>+   * unspecified because the units used by Mac OS X are currently
>+   * undocumented.
Perhaps you could still mention what typical values are. I mean, are they
something like 110 or 1.1, or what?

>+// Simple gesture events
>+#define NS_SIMPLE_GESTURE_EVENT_START    3400
Just FYI, there is another events patch somewhere in Bugzilla which will use 3400, IIRC.
So either this patch or that one must be changed before pushing the changes to the
repository, whichever gets pushed later.

Didn't review the OSX specific part.

If you could fix the comments, I'll re-read the patch. Generally it looks good!
Status: NEW → ASSIGNED
(In reply to comment #57)

> >+{
> >+  /* Direction constants */
> >+  const unsigned long DOM_SG_DIRECTION_UP = 1;
> >+  const unsigned long DOM_SG_DIRECTION_DOWN = 2;
> >+  const unsigned long DOM_SG_DIRECTION_LEFT = 4;
> >+  const unsigned long DOM_SG_DIRECTION_RIGHT = 8;
> Do you actually need DOM_SG_ prefix?

Nope so I'll remove the prefix then. I was only trying to name them in a similar manner to the DOM_VK_ constants in nsIDOMKeyEvent.idl. The fact that the constants are contained within nsIDOMSimpleGestureEvent on the C++ side and within SimpleGestureEvent on the Javascript side will keep them from conflicting with any other constants.

> >+   * XXX - The units for measuring magnification are currently
> >+   * unspecified because the units used by Mac OS X are currently
> >+   * undocumented.
> Perhaps you could still mention what are typical values. I mean, are they
> something like 110 or 1.1 or what?

They typically range in value from 0 to 100.  I believe they are based on what the trackpad returns and so may be similar to the raw X and Y values used for mouse events, but I haven't spent tons of time researching the point.  My patch for Firefox only looks at whether the value is positive or negative, so there is currently no dependency on a specific value.
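For illustration, that sign-only handling in the Firefox-side patch is roughly this (FullZoom comes from browser chrome; the reduce() branch is assumed here, the attached patch only shows the positive case):

  if (evt.delta > 0.0)
    FullZoom.enlarge();  // fingers spreading apart
  else
    FullZoom.reduce();   // fingers pinching together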

> >+// Simple gesture events
> >+#define NS_SIMPLE_GESTURE_EVENT_START    3400
> Just FYI, there is another events patch somewhere in bugzilla which will use
> 3400, IIRC.
> So either this patch or that one must be changed before pushing the changes to
> repository -
> whichever gets pushed later.

I'll stay aware of it.  I'll also rebase the patch against a nightly so it is easier to apply.

> Didn't review the OSX specific part.

Which is why I added joshmoz as a reviewer as well.  Once you both sign off on the next patch, I'll request a super-review from roc.
Attachment #337913 - Flags: review?(joshmoz)
Made the changes requested by both reviewers. The patch has been rebased against a checkout of the source tree from this morning.
Attachment #337913 - Attachment is obsolete: true
Attachment #338434 - Flags: superreview?
Attachment #338434 - Flags: review?(joshmoz)
Attachment #338434 - Flags: review?(Olli.Pettay)
Attachment #338434 - Flags: superreview?(roc)
Attachment #338434 - Flags: superreview?
Attachment #338434 - Flags: review?(Olli.Pettay)
Attachment #338434 - Flags: review+
Comment on attachment 338434 [details] [diff] [review]
Simple Gestures Support - Take #7

>+  // swipe gesture event
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_LEFT, 0.0, 0);
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_RIGHT, 0.0, 0);
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_UP, 0.0, 0);
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_DOWN, 0.0, 0);
>+
>+  // magnify gesture events
>+  helper1("MozMagnifyGestureStart", 0, 50.0, 0);
>+  helper1("MozMagnifyGestureUpdate", 0, -25.0, 0);
>+  helper1("MozMagnifyGestureUpdate", 0, 5.0, 0);
>+  helper1("MozMagnifyGesture", 0, 30.0, 0);
>+
>+  // rotate gesture events
>+  helper1("MozRotateGestureStart", SimpleGestureEvent.DIRECTION_RIGHT, 33.0, 0);
>+  helper1("MozRotateGestureUpdate", SimpleGestureEvent.DIRECTION_LEFT, -13.0, 0);
>+  helper1("MozRotateGestureUpdate", SimpleGestureEvent.DIRECTION_RIGHT, 13.0, 0);
>+  helper1("MozRotateGesture", SimpleGestureEvent.DIRECTION_RIGHT, 33.0, 0);
>+
>+  // event.shiftKey
>+  var modifier = Components.interfaces.nsIDOMNSEvent.SHIFT_MASK;
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_RIGHT, 0, modifier);
>+
>+  // event.metaKey
>+  modifier = Components.interfaces.nsIDOMNSEvent.META_MASK;
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_RIGHT, 0, modifier);
>+
>+  // event.altKey
>+  modifier = Components.interfaces.nsIDOMNSEvent.ALT_MASK;
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_RIGHT, 0, modifier);
>+
>+  // event.ctrlKey
>+  modifier = Components.interfaces.nsIDOMNSEvent.CONTROL_MASK;
>+  helper1("MozSwipeGesture", SimpleGestureEvent.DIRECTION_RIGHT, 0, modifier);
>+
>+  // event creation
>+  helper2("MozMagnifyGesture", SimpleGestureEvent.DIRECTION_RIGHT, 20.0,
>+    true, false, true, false);
>+  helper2("MozMagnifyGesture", SimpleGestureEvent.DIRECTION_LEFT, -20.0,
>+    false, true, false, true);
Could you add some tests where you OR the DIRECTION_XXX values?

r=me
This is cool.

Does WebKit expose these events to content?

I think that the nsDOMClassInfo additions are supposed to be added at the end of nsDOMClassInfoID etc.

Should MozMagnifyGesture and MozRotateGesture be called MozMagnifyGestureEnd/MozRotateGestureEnd?

My main concern here is exposing these events to content --- perhaps creating a new de facto standard --- based on reverse engineering the first iteration of one platform's API for this feature. I'd be much happier if these events were not exposed to Web content, only to chrome, at least until we have actual documentation and maybe a chance to talk to a wider audience about what the DOM API should look like.
(In reply to comment #61)
> I'd be much happier if these events were
> not exposed to Web content, only to chrome,
That can be done in chrome, if chrome adds listeners in the capturing phase and
calls event.stopPropagation().
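Something along these lines in browser chrome (a sketch; the Firefox-specific patch attached here does essentially this):

  window.addEventListener("MozSwipeGesture", function (evt) {
    evt.stopPropagation();  // capturing-phase listener, so web content never sees it
    // ... handle the swipe in chrome ...
  }, true);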
That's not very robust. Forcing every XUL app to do that seems like the wrong way to go.
We do have the NS_EVENT_FLAG_NO_CONTENT_DISPATCH flag for events, but it predates
the event dispatch rewrite and was never fully implemented.
The current implementation works like it did in 1.8.
(In reply to comment #61)
> My main concern here is exposing these events to content --- perhaps creating a
> new de facto standard --- based on reverse engineering the first iteration of
> one platform's API for this feature. I'd be much happier if these events were
> not exposed to Web content, only to chrome, at least until we have actual
> documentation and maybe a chance to talk to a wider audience about what the DOM
> API should look like.
Btw, then we should limit MozAfterPaint etc. to chrome only too.
(In reply to comment #63)
> That's not very robust. Forcing every XUL app to do that seems like the wrong
> way to go.
We don't have to force XUL apps to do that. It might just be a recommendation to
do it. And in practice, if Firefox doesn't expose those events, I doubt they will
be used very often.
One option is to fix this bug now, with .stopPropagation(), and file a followup bug to design and implement dispatch-to-chrome-and-chrome-handler-only.
I think we still want the proper event target chain with event retargeting and all, but just in a way where non-chrome event listeners aren't called and
default handlers in non-chrome aren't called.
(I could take such bug.)
(In reply to comment #61)

> Does Webkit expose these events to content?

WebKit exposes a different API to content where the content can track individual fingers touching the display.  See http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/ for specifics.  The problem is that this API only shows up on the iPhone and not Mac OS X because the iPhone has a documented API for receiving multi-touch events (Cocoa Touch) while Mac OS X currently only has the undocumented API for trackpad-based gestures that was reverse-engineered by another blogger at http://cocoadex.com/2008/02/nsevent-modifications-swipe-ro.html

There actually was some discussion earlier in this bug about designing the events for these gestures in a way that wouldn't interfere with a "real" API in the future. See comments 35 and 36.

> I think that the nsDOMClassInfo additions are supposed to be added at the end
> of nsDOMClassInfoID etc.

OK.

> Should MozMagnifyGesture and MozRotateGesture be called
> MozMagnifyGestureEnd/MozRotateGestureEnd?

I thought about doing that. However, MozMagnifyGesture and MozRotateGesture roll up all of the magnification or rotation, respectively, that occurred during a gesture, and an application can just listen for those events if it doesn't want to support a dynamic response to a gesture as it happens.  Putting "End" as a suffix might lead someone to believe that they must also hook the "Start" event, which is not the case.  That said, I'm not wedded to the current naming scheme if you think consistency requires an "End" event.

> My main concern here is exposing these events to content --- perhaps creating a
> new de facto standard --- based on reverse engineering the first iteration of
> one platform's API for this feature. I'd be much happier if these events were
> not exposed to Web content, only to chrome, at least until we have actual
> documentation and maybe a chance to talk to a wider audience about what the DOM
> API should look like.

Agreed. See comment #45.  One question for a wider audience is whether a short-term regression in the gesture handling is acceptable if Apple changes the API (whether by improving the current one or by bringing the full Cocoa Touch layer over to Mac OS X).
(In reply to comment #60)
> >+  // event creation
> >+  helper2("MozMagnifyGesture", SimpleGestureEvent.DIRECTION_RIGHT, 20.0,
> >+    true, false, true, false);
> >+  helper2("MozMagnifyGesture", SimpleGestureEvent.DIRECTION_LEFT, -20.0,
> >+    false, true, false, true);
> Could you add some tests where you OR DIRECTION_XXX values.
> 
> r=me

I can.  Do you want the test to just check whether the DIRECTION values can be OR'ed together, or whether such OR'ed values can be passed through in an event?  If you mean the latter case, I'm not sure what such a test would get us since the parameter in question is an unsigned long and any integral value could be passed through (not just OR'ed direction constants).
(In reply to comment #68)
> I can.  Do you want the test to just check whether the DIRECTION values can be
> or'ed together or whether such or'ed values can be passed through in an event? 
> If you mean the later case, I'm not sure what such a test would get us since
> the parameter in question is an unsigned long and any integral value could be
> passed through (not just or'ed direction constants).
I mean both. Check that the value you use to dispatch the event is the same
one you get from the event.
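For example (a sketch reusing the test's helper; helper1 already asserts that event.direction matches the value passed in):

  var diagonal = SimpleGestureEvent.DIRECTION_UP | SimpleGestureEvent.DIRECTION_LEFT;
  helper1("MozSwipeGesture", diagonal, 0.0, 0);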
(In reply to comment #65)
> Btw, then we should limit MozAfterPaint etc. to chrome only too.

I don't think that situation is similar at all. Notifying about paints is a very well constrained problem, and entirely under our control with no platform dependencies. The API we have there should be enough for all needs I can think of.

This gesture stuff is highly unconstrained --- we have no idea what future gestures might exist, but we're sure there will be more. The gestures and their properties are likely to vary across platforms. And we even don't know everything we'd like to know about the one platform API we currently have.
(In reply to comment #66)
> (In reply to comment #63)
> > That's not very robust. Forcing every XUL app to do that seems like the wrong
> > way to go.
> We don't have to force XUL apps to do that. It might be just recommendation to 
> do it. And in practice if Firefox doesn't expose those events, I doubt they
> will 
> be used too often.
> One option is to fix this bug now, with .stopPropagation(), and file a followup
> bug to design and implement dispatch-to-chrome-and-chrome-handler-only.
> I think we still want to proper event target chain with event retargeting and
> all, but just in a way where non-chrome event listeners aren't called and
> default handlers in non-chrome aren't called.
> (I could take such bug.)

That sounds fine.
(In reply to comment #67)
> WebKit exposes a different API to content where the content can track
> individual fingers touching the display.  See
> http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/
> for specifics.  The problem is that this API only shows up on the iPhone and
> not Mac OS X because the iPhone has a documented API for receiving multi-touch
> events (Cocoa Touch) while Mac OS X currently only has the undocumented API for
> trackpad-based gestures that was reverse-engineered by another blogger at
> http://cocoadex.com/2008/02/nsevent-modifications-swipe-ro.html

OK. Could that API coexist with this one? I guess so.

> > Should MozMagnifyGesture and MozRotateGesture be called
> > MozMagnifyGestureEnd/MozRotateGestureEnd?
> 
> I thought about doing that. However, MozMagnifyGesture and MozRotateGesture
> roll up all of the magnification or rotation, respectively, that occurred
> during a gesture and an application can just listen for those events if it
> doesn't want to support a dynamic response to a gesture as it happens.  Putting
> "End" as a suffix might lead someone believe that they must also hook the
> "Start" event, which is not the case.  That said, I'm not wedded to the current
> naming scheme if you think consistency requires an "End" event.

That makes sense. Check that that's documented somewhere, I missed it.

> > My main concern here is exposing these events to content --- perhaps creating a
> > new de facto standard --- based on reverse engineering the first iteration of
> > one platform's API for this feature. I'd be much happier if these events were
> > not exposed to Web content, only to chrome, at least until we have actual
> > documentation and maybe a chance to talk to a wider audience about what the DOM
> > API should look like.
> 
> Agreed. See comment #45.  One question for a wider audience is whether a
> short-term regression in the gesture handling is acceptable if Apple changes
> the API (whether by improving the current one or by bringing the full Cocoa
> Touch layer over to Mac OS X).

I'm actually not worried about that so much. I'm more worried about having to carry support for several multitouch/gesture DOM APIs forever. Preventing Web content from receiving these events (for now) mitigates that.
(In reply to comment #72)
> (In reply to comment #67)
> > WebKit exposes a different API to content where the content can track
> > individual fingers touching the display.  See
> > http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/
> > for specifics.  The problem is that this API only shows up on the iPhone and
> > not Mac OS X because the iPhone has a documented API for receiving multi-touch
> > events (Cocoa Touch) while Mac OS X currently only has the undocumented API for
> > trackpad-based gestures that was reverse-engineered by another blogger at
> > http://cocoadex.com/2008/02/nsevent-modifications-swipe-ro.html
> 
> OK. Could that API coexist with this one? I guess so.

I believe so.  The APIs handle two different types of interaction.  The first type is the iPhone model where the user is literally touching the screen and interacting with the widgets using their fingers.  The second type is the model on the current MacBooks and the MacBook Air where the user makes multi-touch gestures on the trackpad and not the screen.

Side note:  I had originally included "Trackpad" in the name of the events to make this dependency explicit. Olli suggested not including the word "Trackpad" in the name so as not to be dependent on a particular type of input device,  which I agree with.  However, whatever happens in the multi-touch arena, I think there will have to be some conceptual separation between touchscreen interaction and trackpad interaction.

> 
> > > Should MozMagnifyGesture and MozRotateGesture be called
> > > MozMagnifyGestureEnd/MozRotateGestureEnd?
> > 
> > I thought about doing that. However, MozMagnifyGesture and MozRotateGesture
> > roll up all of the magnification or rotation, respectively, that occurred
> > during a gesture and an application can just listen for those events if it
> > doesn't want to support a dynamic response to a gesture as it happens.  Putting
> > "End" as a suffix might lead someone believe that they must also hook the
> > "Start" event, which is not the case.  That said, I'm not wedded to the current
> > naming scheme if you think consistency requires an "End" event.
> 
> That makes sense. Check that that's documented somewhere, I missed it.

It isn't.  I'll add some language to nsIDOMSimpleGestureEvent.idl.

> > Agreed. See comment #45.  One question for a wider audience is whether a
> > short-term regression in the gesture handling is acceptable if Apple changes
> > the API (whether by improving the current one or by bringing the full Cocoa
> > Touch layer over to Mac OS X).
> 
> I'm actually not worried about that so much. I'm more worried about having to
> carry support for several multitouch/gesture DOM APIs forever. Preventing Web
> content from receiving these events (for now) mitigates that.

Got it. Should this bug be blocked then until that support is added?
No. Let's go ahead with this but make sure that Firefox chrome blocks the events from reaching Web content.
(In reply to comment #69)
> (In reply to comment #68)
> > I can.  Do you want the test to just check whether the DIRECTION values can be
> > or'ed together or whether such or'ed values can be passed through in an event? 
> > If you mean the later case, I'm not sure what such a test would get us since
> > the parameter in question is an unsigned long and any integral value could be
> > passed through (not just or'ed direction constants).
> I mean both. Check that the value you use to dispatch the event is the same
> you get from event.

Done.  You'll see the tests in the next iteration of the patch.  (The mochitest already checks all of the fields to see if everything passes through correctly.)
Made requested changes.  Rebased patch against current pull from tree.  Mochitest passes.  However, it will fail once my patch to the Firefox chrome is in place because the chrome will prevent the mochitest from seeing the gesture events.  Any recommended solution (preference or privilege check?)?
Attachment #338434 - Attachment is obsolete: true
Attachment #339529 - Flags: review?(joshmoz)
Attachment #339529 - Flags: review?(Olli.Pettay)
Attachment #338434 - Flags: superreview?(roc)
Attachment #338434 - Flags: review?(joshmoz)
The patch for Firefox chrome.  I'll submit this patch once the base patch eventually makes it into the tree.
Attachment #337394 - Attachment is obsolete: true
(In reply to comment #76)
> However, it will fail once my patch to the Firefox chrome is
> in place because the chrome will prevent the mochitest from seeing the gesture
> events.
Ah, indeed. Does it work if you open a new window using window.openDialog
and run the test there? You need "UniversalBrowserWrite" privileges for that.
Or if that doesn't work, maybe a browser-chrome test?
(In reply to comment #78)
> (In reply to comment #76)
> > However, it will fail once my patch to the Firefox chrome is
> > in place because the chrome will prevent the mochitest from seeing the gesture
> > events.
> Ah, indeed. Does it work if you open a new window using window.openDialog
> and run the test there? You need "UniversalBrowserWrite" privileges for that.
> Or if that doesn't work, maybe a browser-chrome test?

The problem is that my chrome code uses a capturing-phase listener on the top-level element that then calls stopPropagation().  The events are stopped in their tracks before the mochitest listener (or any other listeners, for that matter) can see them.  I can add an exception that prevents stopPropagation from being called (or that disables my chrome code entirely during the test), but I need suggestions as to how to do it with minimal impact.  Do I do a privilege check of some sort, or check for a preference?
Tom,

Check out browser chrome tests (http://developer.mozilla.org/en/Browser_chrome_tests). They run in chrome with chrome privileges.

Some example browser chrome drag tests can be found in this patch:
https://bugzilla.mozilla.org/attachment.cgi?id=339513&action=edit

and probably other places in the tree.
Browser chrome tests can access the browser, so you can temporarily remove those event listeners, I think.
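For example, in a browser-chrome test, roughly (the handler name is hypothetical; the point is just that chrome-privileged tests can detach and re-attach the browser's own listener):

  window.removeEventListener("MozSwipeGesture", gGestureSupport.onSwipeGesture, true);
  // ... dispatch synthetic gesture events and assert on them ...
  window.addEventListener("MozSwipeGesture", gGestureSupport.onSwipeGesture, true);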
Comment on attachment 339529 [details] [diff] [review]
Simple Gestures Support - Take #8

If you could change the test to a browser-chrome test and add the things in nsDOMClassInfo/nsDOMClassInfoID to the end of the lists, r=me.
Attachment #339529 - Flags: review?(Olli.Pettay) → review+
Moved all tests to browser-chrome.  See Firefox-specific patch for tests.  Corrected the DOMClassInfo stuff as requested.  Fixed a bug in the magnify gesture handling uncovered while testing the Firefox zoom gesture handling.
Attachment #339529 - Attachment is obsolete: true
Attachment #339910 - Flags: review?(joshmoz)
Attachment #339910 - Flags: review?(Olli.Pettay)
Attachment #339529 - Flags: review?(joshmoz)
All regression tests moved into this patch.
Attachment #339530 - Attachment is obsolete: true
Comment on attachment 339911 [details] [diff] [review]
Firefox-Specific Gestures Support v4

>+var gGestureSupport = {
>+  init : function() {

>+    gBrowser.addEventListener("MozMagnifyGestureStart", this.onMagnifyGestureStart, true);

>+  _initiallyZoomedOut : false,
>+  _isZoomed : false,
>+
>+  onMagnifyGestureStart : function (evt) {
>+    evt.stopPropagation();
>+
>+    if (evt.delta > 0.0) {
>+      FullZoom.enlarge();
>+      this._initiallyZoomedOut = true;

"this" is not what you expect here.

Why do you add the listeners to gBrowser instead of window?
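For what it's worth, one common way to keep |this| pointing at the object is to register the object itself as the listener and implement handleEvent (a sketch, not the final patch):

  var gGestureSupport = {
    init: function () {
      window.addEventListener("MozMagnifyGestureStart", this, true);
    },
    handleEvent: function (evt) {
      evt.stopPropagation();
      this._initiallyZoomedOut = evt.delta > 0.0;  // |this| is gGestureSupport here
    },
    _initiallyZoomedOut: false
  };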
Blocks: 456520
Opened bug 456520 with Firefox-specific patch.
(In reply to comment #85)
> (From update of attachment 339911 [details] [diff] [review])
> >+  _initiallyZoomedOut : false,
> >+  _isZoomed : false,
> >+
> >+  onMagnifyGestureStart : function (evt) {
> >+    evt.stopPropagation();
> >+
> >+    if (evt.delta > 0.0) {
> >+      FullZoom.enlarge();
> >+      this._initiallyZoomedOut = true;
> 
> "this" is not what you expect here.

I wanted to have a single global variable (gGestureSupport) with all of the needed variables/methods under that.  I saw this type of usage in gPopupBlockerObserver.  Is there a better way?
 
> Why do you add the listeners to gBrowser instead of window?

Just my unfamiliarity with the Firefox code.  I revised the patch to refer to |window| and it appears to still work.

I'll post a revised patch over in bug 456520 (and obsolete the Firefox-specific patch in this bug).
Attachment #339911 - Attachment is obsolete: true
Comment on attachment 339910 [details] [diff] [review]
Simple Gestures Support - Take #9

>+NS_IMETHODIMP
>+nsDOMSimpleGestureEvent::InitSimpleGestureEvent(const nsAString & typeArg, PRBool canBubbleArg, PRBool cancelableArg, nsIDOMAbstractView *viewArg, PRInt32 detailArg, PRUint32 directionArg, PRFloat64 deltaArg, PRBool altKeyArg, PRBool ctrlKeyArg, PRBool shiftKeyArg, PRBool metaKeyArg)
Line is too long. Can be fixed when committing.
Attachment #339910 - Flags: review?(Olli.Pettay) → review+
Attachment #339910 - Flags: superreview?(roc)
Attachment #339910 - Flags: superreview?(roc) → superreview+
(In reply to comment #88)
> (From update of attachment 339910 [details] [diff] [review])
> >+NS_IMETHODIMP
> >+nsDOMSimpleGestureEvent::InitSimpleGestureEvent(const nsAString & typeArg, PRBool canBubbleArg, PRBool cancelableArg, nsIDOMAbstractView *viewArg, PRInt32 detailArg, PRUint32 directionArg, PRFloat64 deltaArg, PRBool altKeyArg, PRBool ctrlKeyArg, PRBool shiftKeyArg, PRBool metaKeyArg)
> Too long line. Can be fixed when commiting.

Understood.  What is the process for getting the patch landed in the tree once it has r+ and sr+?
I can check in the patch after Josh has reviewed it. Though I'm not sure about
the beta2 rules; I'll find that out.
Does this patch need to wait until the Firefox-specific patch is reviewed? The Firefox patch is what keeps the gesture events away from web content currently. So they should probably be landed together.
Right. I think they should land at the same time.
Attachment #339910 - Flags: review?(joshmoz) → review+
I got a question on my blog asking about multi-touch trackpad MacBooks running Windows on Boot Camp:

"Is it possible to make this work for Windows users that use Boot Camp? I ask because I have a MBP and I’d like to use this for both the Mac and Windows partitions of my laptop.

Apple already has drivers that enable capability for the trackpad on Windows, so given those drivers, it seems to me that it wouldn’t be impossible to make a windows version of FireFox with Multi-Touch capability."

I suppose if Apple provides the Windows drivers to make multi-touch gestures available, it should be doable, but probably through a completely different interface.

Additionally, I wonder what would happen with Parallels...
That depends on whether Windows delivers gesture events to the widget layer.
If it does, and someone has links to documentation about it, the
support could be added, possibly pretty easily.
One option is to break the part that prevents events from reaching content out of the Firefox-specific patch and move it back to this bug, so this can land while the UI wizards figure out how to use it in bug 456520.
I'll push the patch once the tree is a bit greener.
author	tdyas@zecador.org
date	Thu Oct 23 23:15:20 2008 +0300
changeset 20793	f39ecb5b05b7
parent 20792	c79936775c2c
child 20794	b7bd85f1dd75
Bug 412486 - widget/event-detection support for multi-touch trackpad gestures, r=smaug,josh, sr=roc
Status: ASSIGNED → RESOLVED
Closed: 16 years ago
Resolution: --- → FIXED
Depends on: 463168
Verified with Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1b3pre) Gecko/20090214 Shiretoko/3.1b3pre Ubiquity/0.1.5 ID:20090214020701

Can we get basic testing for our test harness?
Status: RESOLVED → VERIFIED
Flags: in-testsuite?
Hardware: x86 → All
Target Milestone: --- → mozilla1.9.1b1
Version: unspecified → 1.9.1 Branch
Tests were added in bug 456520.
Flags: in-testsuite? → in-testsuite+
No longer blocks: 478604
Blocks: 479901
I'm looking into bugs 868648 and 868659, and this bug's patch seems like a good starting point for adding the wanted behavior, then adding a listener in the window to display/hide the scrollbars on gesture-begin/gesture-end events. Any thoughts on that?
Flags: needinfo?(joshmoz)
Flags: needinfo?(joshmoz) → needinfo?(mstange)
I've answered in bug 868648.
Flags: needinfo?(mstange)