Support for a testing input driver

RESOLVED FIXED in B2G C1 (to 19nov)

Status


Product: Core
Component: Widget: Gonk
Status: RESOLVED FIXED
Reported: 6 years ago
Modified: 5 years ago

People

(Reporter: mdas, Assigned: tzimmermann)

Tracking

Version: Trunk
Target Milestone: B2G C1 (to 19nov)
Hardware: x86 All
Points: ---

Firefox Tracking Flags

(Not tracked)


Attachments

(3 attachments, 12 obsolete attachments)

 - 53 bytes, text/plain
 - 807 bytes, text/plain
 - 165.45 KB, application/octet-stream
For test automation (namely eideticker) on b2g pandaboards, we'd like to send events to particular pixels to emulate taps/swipes/etc. 

Using the patch in Bug 781039, we can use a mouse for swipes and clicks, but we cannot simply tap at coordinate x1,y1 and then at coordinate x2,y2. Instead we have to drag the mouse to the location and then click, and dragging only works as relative motion from wherever the mouse was first initialized. You can't do something like "drag from x1,y1 to x2,y2"; right now, you can only do things like "move 1 pixel down". We are also restricted by needing a mouse physically plugged into the device just to send events.
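As an illustration of this limitation, emulating an absolute "drag from x1,y1 to x2,y2" on top of a relative-motion mouse means decomposing it into single-pixel steps. A minimal sketch; the `emit` callback is hypothetical, standing in for whatever actually writes EV_REL events to the input device:

```c
#include <assert.h>
#include <stdlib.h>

/* Decompose an absolute drag from (x1,y1) to (x2,y2) into the one-pixel
 * relative steps a mouse-based driver accepts.  Returns the number of
 * steps that would be emitted.  The emit callback is a stand-in for the
 * code that writes REL_X/REL_Y events to /dev/input. */
typedef void (*step_fn)(int dx, int dy);

int drag_as_relative_steps(int x1, int y1, int x2, int y2, step_fn emit)
{
    int steps = 0;
    while (x1 != x2 || y1 != y2) {
        int dx = (x2 > x1) - (x2 < x1);  /* -1, 0 or +1 */
        int dy = (y2 > y1) - (y2 < y1);
        if (emit)
            emit(dx, dy);
        x1 += dx;
        y1 += dy;
        steps++;
    }
    return steps;
}
```

Note that the step count is max(|x2-x1|, |y2-y1|), so a long drag takes many events where a real touchscreen would need a single tap at the target coordinate.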

We can record and playback mouse events for the time being, but an input driver just for testing at the /dev/input layer would be really useful in the future. This way, we can tap at particular locations, and hopefully this can be extrapolated to doing pinch/zoom actions, and we won't be restricted by having to use hardware.
We'll need this sooner rather than later; it seems that a double-click mouse event doesn't trigger a zoom the way a double-tap touch event would. This means we can't get the NYTimes or Nightly zooming tests (http://wrla.ch/eideticker/dashboard/#/taskjs-scrolling/checkerboard) running on the b2g pandaboard until we have this driver.

Comment 2

6 years ago
That sounds like a bug... We should be sending largely the same touch events with the mouse as we do with a real touchscreen.
I'll start working on this as soon as I've got my PandaBoard set up.
Assignee: nobody → tdz
Status: NEW → ASSIGNED
Malini, I'm currently figuring out how to offer a flexible, yet simple interface to the input driver.

The driver would be a loadable kernel module that receives a number of parameters (bus, vendor, product, version) that describe the emulated device. Emulating arbitrary devices should be possible this way.

Input events would be generated by writing simple commands to a device file; for example

  echo "touch 50 100 down" > /dev/<devfile>

to simulate pressing the pixel at 50,100. The driver would convert this to the respective input event. I'm currently investigating which commands are necessary or useful.

Would this fit your needs? Do you only need support for touchscreens, or other devices as well? Do you have any requirements for integrating the driver into the test framework?
s/comments/commands/
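The proposed "touch &lt;x&gt; &lt;y&gt; &lt;down|up&gt;" syntax could be parsed along these lines. This is a userspace sketch of the grammar from the comment above, not the kernel module's actual write() handler; the `touch_cmd` struct and `parse_touch_cmd` helper are hypothetical names:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Sketch of parsing the proposed command syntax that would be echoed to
 * the driver's device file.  A real kernel module would do this in its
 * write() handler; this is a plain userspace model. */
struct touch_cmd {
    int x, y;
    int down;   /* 1 = press, 0 = release */
};

int parse_touch_cmd(const char *line, struct touch_cmd *out)
{
    char action[8];
    if (sscanf(line, "touch %d %d %7s", &out->x, &out->y, action) != 3)
        return -1;  /* not a touch command */
    if (strcmp(action, "down") == 0)
        out->down = 1;
    else if (strcmp(action, "up") == 0)
        out->down = 0;
    else
        return -1;  /* unknown action word */
    return 0;
}
```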
(In reply to Thomas Zimmermann from comment #4)
> Malini, I'm currently figuring out how to offer a flexible, yet simple
> interface to the input driver.
> 
> The driver would be a loadable kernel module that receives a number of
> parameters (bus, vendor, product, version) that describe the emulated
> device. Emulating arbitrary devices should be possible this way.
> 
> Input events would be generated by writing simple commands to a device file;
> for example
> 
>   echo "touch 50 100 down" > /dev/<devfile>
> 
> to simulate pressing the pixel at 50,100. The driver would convert this to
> the respective input event. I'm currently investigating which comments are
> necessary or useful.
> 
> Would this fit your needs? Do you only need support for touchscreens, or
> other devices as well? 

This sounds like a great approach to me. For our immediate purposes with Eideticker we only need touchscreen support, but in the future we could use drivers for light detection, etc., for other kinds of tests, so having the module be extensible to other device types would be great.

> Do you have any requirements for integrating the driver into the test framework?

No, I think that if we can load the module with something like 'insmod' from an adb shell, it should be fine.

Wlach, jgriffin: any additional thoughts?
(In reply to Malini Das [:mdas] from comment #6)
> (In reply to Thomas Zimmermann from comment #4)
> > Malini, I'm currently figuring out how to offer a flexible, yet simple
> > interface to the input driver.
> > 
> > The driver would be a loadable kernel module that receives a number of
> > parameters (bus, vendor, product, version) that describe the emulated
> > device. Emulating arbitrary devices should be possible this way.
> > 
> > Input events would be generated by writing simple commands to a device file;
> > for example
> > 
> >   echo "touch 50 100 down" > /dev/<devfile>
> > 
> > to simulate pressing the pixel at 50,100. The driver would convert this to
> > the respective input event. I'm currently investigating which comments are
> > necessary or useful.
> > 
> > Would this fit your needs? Do you only need support for touchscreens, or
> > other devices as well? 
> 
> This sounds like a great approach to me. For our immediate purposes with
> Eideticker, we'll only need support for touchscreens at the moment, but I
> think in the future we can use drivers for light detection, etc, for other
> kinds of tests, so having the module be extensible for other drivers would
> be great.

Yup, sounds about right, though I'm worried that there's a bit of misunderstanding about our requirements here. To give some context, the immediate goal would be to integrate this tool with the orangutan software used by Eideticker, the source to which you can find here:

https://github.com/wlach/orangutan/blob/master/orng.c

As you can see, we already support sending input events to two different types of devices (the LG-P999 and Galaxy Nexus). We translate high-level commands like "tap" or "drag" into lower level things on the client (similar to what you're proposing to do at a lower level).

I would prefer if this interface were kept relatively simple, and ideally be mostly the same as the existing touch-based device drivers for other phones. I.e. being able to echo human-readable strings to the device is less important to me -- I'd rather it provide something more consistent with the other platforms Orangutan supports.
Created attachment 662558 [details]
Preliminary Linux kernel module for input testing on PandaBoard

The attached kernel module can simulate arbitrary touch screens on the PandaBoard. This is not the final version, just a preview to show where things are going; source code will follow soon. To test it:

 - copy the binary to the PandaBoard, and
 - load it via 'insmod ./evgen.ko'.

The file /dev/input/event2 should show up and b2g should reconfigure itself to use it ('Generic touch screen'). Then run

 'orng /dev/input/event2 <testscript>'

EventHub should report a number of input events over logcat, like the ones below.

> V/EventHub(  100): /dev/input/event2 got: t0=1019, t1=28472, type=3, code=53, value=560
> V/EventHub(  100): /dev/input/event2 got: t0=1019, t1=30609, type=3, code=54, value=200
> V/EventHub(  100): /dev/input/event2 got: t0=1019, t1=32318, type=0, code=0, value=0

The kernel module supports the parameters 'bustype', 'vendor', 'product', and 'version'. Setting them to specific (decimal) values allows selecting specific configurations.
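For illustration, selecting a configuration from these four parameters could look like the following userspace sketch. The table entries and the `find_devspec` helper are hypothetical, not the module's real data; a later comment in this bug also notes that many real devices report zeros for these fields, which is why name-based selection ended up mattering more:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Illustrative device table keyed on the four module parameters.
 * The concrete numbers here are made up for the example. */
struct devspec {
    const char *name;
    unsigned short bustype, vendor, product, version;
};

static const struct devspec specs[] = {
    { "Generic touch screen", 0x0018, 0x0000, 0x0000, 0x0000 },
    { "atmel-touchscreen",    0x0018, 0x0001, 0x0002, 0x0100 },
};

/* Return the matching configuration, or NULL so the caller can fall
 * back to a default. */
const struct devspec *find_devspec(unsigned short bustype,
                                   unsigned short vendor,
                                   unsigned short product,
                                   unsigned short version)
{
    size_t i;
    for (i = 0; i < sizeof(specs) / sizeof(specs[0]); i++) {
        const struct devspec *s = &specs[i];
        if (s->bustype == bustype && s->vendor == vendor &&
            s->product == product && s->version == version)
            return s;
    }
    return NULL;
}
```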
Created attachment 662559 [details]
Testscript from orangutan README
William, I have the source code for the kernel module, and I will add a utility for reading touch-screen properties from a kernel device file. Would you accept both for inclusion in orangutan? Otherwise I'd put all this into a new repository. I'd prefer the first option, but I'm fine with whatever you want.
(In reply to Thomas Zimmermann from comment #10)
> William, I have the source code for the kernel module and I will add an
> utility for reading touch-screen properties from a kernel device file. Would
> you accept both for inclusion into orangutan? Otherwise I'd put all this
> into a new repository. I'd prefer the first option, but I'm fine with
> whatever you want.

Patches to orangutan would be enthusiastically accepted. :) You can just do a pull request if you like (bugzilla patches would also be fine).
Created attachment 663497 [details]
Preliminary Linux kernel module for input testing on PandaBoard
Attachment #662558 - Attachment is obsolete: true
The latest versions of the kernel module and the tool for extracting the properties of an event device are available at

  https://github.com/tdz/orangutan

For your convenience, I attached a build of the module to this bug report.

When I added support for the 5 event devices of my Otoro phone, I noticed that most of them have their vendor, product, etc. set to 0. Selecting configurations from these values will probably not work. :(

Comment 14

5 years ago
The only thing that really matters afaik, is handling EVIOCGNAME correctly. The primary thing used on our devices for selecting the right configuration is the name returned by EVIOCGNAME.
(In reply to Michael Wu [:mwu] from comment #14)
> The only thing that really matters afaik, is handling EVIOCGNAME correctly.
> The primary thing used on our devices for selecting the right configuration
> is the name returned by EVIOCGNAME.

Names I've seen look good. But there is also code for selecting the configuration by vendor, product, etc. And there are at least a few files that follow this scheme.

Anyway, I'll keep it in mind and add support for selecting the device by name to the kernel module. Thanks!
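Selecting the configuration by the name the device reports via EVIOCGNAME could look like the sketch below. The table contents (resolutions, the `720p_touchscreen` entry) are illustrative placeholders, not the module's actual data:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Name-based configuration selection: match the string an emulated
 * device reports via EVIOCGNAME against a table of known setups.
 * The entries and their axis ranges are made up for the example. */
struct named_config {
    const char *name;
    int max_x, max_y;   /* reported ABS_MT_POSITION_X/Y ranges */
};

static const struct named_config configs[] = {
    { "atmel-touchscreen",  540,  960 },
    { "720p_touchscreen",  1280,  720 },
};

const struct named_config *config_by_name(const char *name)
{
    size_t i;
    for (i = 0; i < sizeof(configs) / sizeof(configs[0]); i++)
        if (strcmp(configs[i].name, name) == 0)
            return &configs[i];
    return NULL;  /* unknown device name */
}
```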
Ok, so there are still some things to address here:

1. It seems like some percentage of the time, installing the binary driver linked in this bug on a pandaboard will cause the system to freeze (requiring a power cycle).
2. It would be really nice if we could install both the orng executable and this kernel module by default in pandaboard builds. Yes, I can patch eideticker to do this somehow, but I think we will detect problems (especially kernel problems like (1)) faster if these are installed by default.
Hi

> 1. It seems like some percentage of the time, installing the binary driver
> linked in this bug on a pandaboard will cause the system to freeze
> (requiring a power cycle).

The old module was compiled against kernel sources which did not exactly match the kernel binary. This might explain the problems. I rebuilt the module against the current kernel that landed with [1]. I also pushed a change to github to install the kernel module automatically in /system/lib/modules/ when building for the PandaBoard. Have a look at [2] for the commit with the new module. Let me know if the system freezes persist.

> 2. It would be really nice if we could install both the orng executable and
> this kernel module by default in pandaboard builds. Yes, I can patch
> eideticker to do this somehow, but I think we will detect problems
> (especially kernel problems like (1)) faster if these are installed by
> default.

I added an Android.mk to my fork of the orangutan repository. Simply clone it into B2G's external/ directory and do an 'eng' build. orng and mkdevinfo should show up in /system/xbin/. Once you've pulled all those changes into your repository, and if no one objects, we could also pull and install the tools automatically in B2G.

Regards
Thomas

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=778248
[2] https://github.com/tdz/android-device-panda/commit/b71a149053b736208fcfa1bfe262d0a675dc5526
The new kernel module does not seem to be crashing for me, so that's good. Thanks!

Unfortunately I've still been having quite a bit of trouble getting the pandaboard to register gestures properly. Gecko is definitely receiving the events, but I'm not seeing anything happen on screen as a result. I suspect there's some kind of issue recognizing the device properly.

I added some debugging information to the gonk inputreader, and noticed some suspicious differences between how touch information was being processed between the orangutan module and the native unagi touchscreen driver.

Here's a dump of some inputreader output on unagi:

D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x0039 keycode=0x0000 value=0x00000000 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x0030 keycode=0x0000 value=0x00000015 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x0032 keycode=0x0000 value=0x00000001 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x0035 keycode=0x0000 value=0x00000073 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x0036 keycode=0x0000 value=0x00000108 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0003 scancode=0x003a keycode=0x0000 value=0x00000015 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0000 scancode=0x0002 keycode=0x0000 value=0x00000000 flags=0x00000000
D/InputReader( 1399): Input event: device=7 type=0x0000 scancode=0x0000 keycode=0x0000 value=0x00000000 flags=0x00000000
D/InputReader( 1399): syncTouch: pointerCount 1 -> 1, touching ids 0x80000000 -> 0x80000000, hovering ids 0x00000000 -> 0x00000000
D/InputReader( 1399): BatchSize: 2 Count: 2

Here's some on the orng input reader on the panda:

D/InputReader( 1266): BatchSize: 6 Count: 6
D/InputReader( 1266): Input event: device=3 type=0x0003 scancode=0x0030 keycode=0x0000 value=0x00000020 flags=0x00000000
D/InputReader( 1266): Input event: device=3 type=0x0003 scancode=0x0032 keycode=0x0000 value=0x00000004 flags=0x00000000
D/InputReader( 1266): Input event: device=3 type=0x0003 scancode=0x0035 keycode=0x0000 value=0x00000082 flags=0x00000000
D/InputReader( 1266): Input event: device=3 type=0x0003 scancode=0x0036 keycode=0x0000 value=0x00000064 flags=0x00000000
D/InputReader( 1266): Input event: device=3 type=0x0003 scancode=0x003a keycode=0x0000 value=0x0000005a flags=0x00000000
D/InputReader( 1266): Input event: device=3 type=0x0000 scancode=0x0000 keycode=0x0000 value=0x00000000 flags=0x00000000
D/InputReader( 1266): syncTouch: pointerCount 1 -> 1, no pointer ids
D/InputReader( 1266): Gestures: HOVER
D/InputReader( 1266): Gestures: finishPreviousGesture=false, cancelPreviousGesture=false, currentGestureMode=4, currentGestureIdBits=0x80000000, lastGestureMode=4, lastGestureIdBits=0x00000000
D/InputReader( 1266):   currentGesture[0]: index=0, toolType=1, x=446.072, y=0.000, pressure=0.000

So there's various suspicious things here:

1. It looks like we're explicitly registering multitouch gestures from orangutan input, but not from the unagi input (?).
2. It looks like the x/y coordinates of the gesture on the pandaboard are very wrong. I'm sending a sequence of swipe gestures between 100 and 400 on the x and y axes, but the pointer values it's getting are way off that (see http://people.mozilla.com/~wlachance/inputreader-log and http://people.mozilla.com/~wlachance/orng-script.txt for the orangutan script).
3. Looking at the source of the orangutan kernel module, it looks like the name of the virtual input device driver should be "Generic touch screen". However, when I run getevent, I see that it is recognized as an "atmel-touchscreen":

(testcase)wlach@popsicle:~/tmp/testcase$ adb shell getevent
add device 1: /dev/input/event2
  name:     "atmel-touchscreen"
could not get driver version for /dev/input/mice, Not a typewriter
add device 2: /dev/input/event1
  name:     "Panda Headset Jack"
add device 3: /dev/input/event0
  name:     "gpio-keys"
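To make the two InputReader dumps above easier to compare, the type/scancode pairs can be decoded with the standard evdev constants (values as in linux/input.h). A small lookup sketch; note that the unagi stream carries ABS_MT_TRACKING_ID (0x39) and SYN_MT_REPORT (type 0, code 2) while the orng stream does not, which lines up with the "no pointer ids" line in the panda log:

```c
#include <assert.h>
#include <string.h>

/* Map the type/scancode pairs from the InputReader dumps to their
 * evdev names.  Constant values taken from linux/input.h. */
const char *decode_event(int type, int code)
{
    if (type == 0x00) {                 /* EV_SYN */
        switch (code) {
        case 0x00: return "SYN_REPORT";
        case 0x02: return "SYN_MT_REPORT";
        }
    } else if (type == 0x03) {          /* EV_ABS */
        switch (code) {
        case 0x30: return "ABS_MT_TOUCH_MAJOR";
        case 0x32: return "ABS_MT_WIDTH_MAJOR";
        case 0x35: return "ABS_MT_POSITION_X";
        case 0x36: return "ABS_MT_POSITION_Y";
        case 0x39: return "ABS_MT_TRACKING_ID";
        case 0x3a: return "ABS_MT_PRESSURE";
        }
    }
    return "unknown";
}
```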

If you want to try testing input yourself on the pandaboard, you can do the following:

1. Clone a copy of git://github.com/wlach/mozbase.git
2. Checkout my mozb2g-hacked branch (git checkout remotes/origin/mozb2g-hacked)
3. Setup a virtualenv (mkdir mozb2g && cd mozb2g && virtualenv . && source bin/activate)
4. Setup my copy of mozbase inside the virtualenv (cd ../mozb2g-hacked && python setup_development.py)
5. Run this script with an activated pandaboard connected to your computer via adb: http://people.mozilla.com/~wlachance/orng-testcase.py
Flags: needinfo?(tzimmermann)
So after talking to :cjones on irc, it looks like we shouldn't be interpreting gestures at all. I am beginning to wonder if we're not setting the device type correctly. It looks like the right thing for B2G is a touchscreen (INPUT_PROP_DIRECT). If we're setting the virtual driver to be a pointer (INPUT_PROP_POINTER) we're bound to run into unexpected behaviour.
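The distinction comes from the device's property bitmask (readable with the EVIOCGPROP ioctl). A sketch using the constant values from linux/input.h; the classifier functions are a simplification of what the input stack does, not its actual code:

```c
#include <assert.h>

/* Property values as defined in linux/input.h */
#define INPUT_PROP_POINTER 0x00  /* needs an on-screen pointer/cursor */
#define INPUT_PROP_DIRECT  0x01  /* direct input device (touchscreen) */

/* Simplified classification of a device's property bitmask: a device
 * advertising INPUT_PROP_DIRECT is treated as a touchscreen, while
 * INPUT_PROP_POINTER marks a touchpad-like pointer that goes through
 * gesture interpretation. */
int is_touchscreen(unsigned long propbits)
{
    return (propbits >> INPUT_PROP_DIRECT) & 1;
}

int is_pointer(unsigned long propbits)
{
    return (propbits >> INPUT_PROP_POINTER) & 1;
}
```

On the kernel side, the fix would presumably be along the lines of setting `__set_bit(INPUT_PROP_DIRECT, dev->propbit);` before registering the emulated device, assuming a kernel recent enough to support input device properties.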
The supported input devices are stored in the file 'devspec.h'. The 'Generic touch screen' is the fall-back, which is used if no emulated device is specified. The only real input devices supported by the kernel module are currently those of the Otoro and the PandaBoard. I don't have an Unagi.

Do you have tests for the Otoro? A better test would be to load the kernel module with

  insmod orng.ko names=atmel-touchscreen

and compare the behaviour to a real Otoro's touch screen.

We should probably add Unagi support as well. Could you run mkdevinfo on all /dev/input/event files and send me the output for each?
Flags: needinfo?(tzimmermann)
Created attachment 678538 [details]
Orng kernel module with 720p touch screen

Run

  insmod orng.ko names=720p\ touchscreen

to create a 720p touch screen.
Attachment #663497 - Attachment is obsolete: true
Created attachment 678545 [details]
Orng kernel module with 720p touch screen
Attachment #678538 - Attachment is obsolete: true
Created attachment 678553 [details]
Orng kernel module with 720p touch screen
Attachment #678545 - Attachment is obsolete: true
Created attachment 678557 [details]
Orng kernel module with 720p touch screen
Attachment #678553 - Attachment is obsolete: true
Created attachment 678568 [details]
Orng kernel module with 720p touch screen

Load the latest kernel module with

  insmod ./orng.ko names=720p_touchscreen

to get a generic 720p-compatible touchscreen.
Attachment #678557 - Attachment is obsolete: true
Created attachment 678572 [details]
Input properties of galaxy nexus (output of getevent -i)

Here are the input properties of the galaxy nexus device. It would be a good candidate to emulate, as it has the same resolution that we want for the pandaboard (1280x720, aka 720p).
Created attachment 678587 [details]
Orng kernel module with 720p touch screen
Attachment #678568 - Attachment is obsolete: true
Created attachment 678816 [details]
Orng kernel module with 720p touch screen
Attachment #678587 - Attachment is obsolete: true
Created attachment 678863 [details]
Orng kernel module with 720p touch screen
Attachment #678816 - Attachment is obsolete: true
Created attachment 678871 [details]
Orng kernel module with 720p touch screen
Attachment #678863 - Attachment is obsolete: true
Created attachment 678893 [details]
Orng kernel module with 720p touch screen
Attachment #678871 - Attachment is obsolete: true
Created attachment 678909 [details]
Orng kernel module with 720p touch screen
Attachment #678893 - Attachment is obsolete: true
Target Milestone: --- → B2G C1 (to 19nov)
Since I'm already getting deadline reminders for this bug, I just wanted to mention that the work here has been done. I'll clean up the code and add the tools to the external packages on the PandaBoard. The kernel driver will also be added to the repo.
The driver is available at

  https://github.com/mozilla-b2g/orangutan

It emulates the input devices of the Otoro, PandaBoard and Unagi, and provides a generic touch-screen device that is compatible with the PandaBoard's output.

The binary and repository are available on the PandaBoard with the commits

  https://github.com/mozilla-b2g/b2g-manifest/commit/9899ba6bc18cc1f91eb1c04750487fb20b51532e
  https://github.com/mozilla-b2g/android-device-panda/commit/599e736cb24dff10b949edcac3a6e5a76bce2144
Status: ASSIGNED → RESOLVED
Last Resolved: 5 years ago
Resolution: --- → FIXED