It is really hard to debug JSMarionette tests on try-server. One of the main reasons is that we can't see how the test runs there, because it runs on a remote xvfb. So I think we could provide a debugging tool to record a specified test.
Here are the steps to use the debugging tool:

1. Request a b2g_ubuntu64_vm loaner machine and log in to the server.
2. Clone gaia and go to its root folder.
3. Run `make test-integration-test TEST_FILES=path/to/the/test_test.js RECORD=1`.
4. The video will be stored at path/to/gaia/video/20140910175633.mp4.
5. Download the video from the server and save it locally to start debugging.

What do you all think about it?
For what it's worth, when you get a loaner, releng gives you VNC access to the X session the tests run on (AIUI).
This does seem like it would be very useful. We'd probably need to make the setup easier, though, or very few people would use it; possibly an in-tree JSON config file that just points to a list of tests you want videos of.
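A minimal sketch of what such an in-tree config could look like. The filename, the key names, and the test paths here are all made up for illustration; nothing in-tree defines this format yet:

```json
{
  "record": [
    "apps/email/test/marionette/message_reading_test.js",
    "apps/calendar/test/marionette/day_view_test.js"
  ],
  "fps": 30,
  "keepOnSuccess": false
}
```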
Some extra context: I've got work in place with the email app that records a synchronized video and logs. Evan and I were talking about making it more generic and supporting all apps. There's an example where you can click on timestamps up at: https://clicky.visophyte.org/live/loggest-viewer/www/?url=/examples/loggest-viewer/20140824/message_reading.jsons

The core part of that right now is driven by an email-specific helper at https://github.com/asutherland/gaia/blob/email-happy-tests/apps/email/test/marionette/lib/recorder_client_helper.js

Mocha makes it hard for anyone but the reporter to get detailed failure data, but it's not impossible to tell that there was a failure, so we can detect failures. Right now anything that's registered with the recorder gets a chance to fill in details, in a JSON-y object, about the state of the system. The email app records all the cards that are visible and their display states. One could obviously get much fancier.

My general game plan (parts in progress, parts finished) is to:

- Put the logs and videos in the artifacts subtree for extraction by our build infrastructure, or upload via our blobber infrastructure, or however that works.
- By default, nuke the video and logs if there was no failure, *unless a specific flag like RECORD=1 has been set* to keep the successful result around. Successful results are potentially quite interesting, but why burn disk space if no one cares?

An important note about what I discovered: we really need to record the video with no compression and only later go back and transcode to webm. When I had avconv record directly to webm, it dropped all kinds of frames (even on a quad-core hyperthreaded desktop with 32GB of RAM), and the webm file's timecodes got messed up. webm seems capable of noting the lost frames, but that didn't happen; the data just doesn't go into the file. The file gets much shorter and the video sync gets trashed.
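To make the record-raw-then-transcode idea concrete, here's a rough Node sketch of how the two avconv invocations could be assembled. The display number, frame size, and file names are placeholders, and while `-f x11grab`, `rawvideo`, and `libvpx` are standard avconv options, treat the exact flag set as an assumption rather than what the harness actually runs:

```javascript
'use strict';

// Sketch: capture the X session uncompressed while the test runs, then
// transcode to webm afterwards, so avconv never drops frames mid-test.

// Phase 1: grab the xvfb display with no compression (rawvideo).
function captureArgs(display, size, fps, rawFile) {
  return [
    '-f', 'x11grab',        // read frames from an X display
    '-r', String(fps),      // capture frame rate
    '-s', size,             // frame size, e.g. '1280x1024'
    '-i', display,          // e.g. ':99.0' for the xvfb session
    '-codec:v', 'rawvideo', // no compression at record time
    rawFile
  ];
}

// Phase 2: transcode the raw capture once the test has finished.
function transcodeArgs(rawFile, webmFile) {
  return [
    '-i', rawFile,
    '-codec:v', 'libvpx',   // webm's VP8 encoder
    webmFile
  ];
}

// In a real harness these arrays would go to child_process.spawn('avconv', ...).
console.log('avconv ' + captureArgs(':99.0', '1280x1024', 30, 'capture.nut').join(' '));
console.log('avconv ' + transcodeArgs('capture.nut', 'artifacts/test.webm').join(' '));
```

Building the argument lists separately also makes the "skip the transcode entirely when we nuke the recording" optimization a one-line check in the harness.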
That's another reason to delete the files when we don't want the videos: we can skip the transcode entirely and save a ton of time. With the transcode, I was able to reliably get 30fps, and it's possible we might even be able to crank it up to 60fps, but it's unclear to me how sucky our builders may or may not be.

The log viewer repo is at https://github.com/asutherland/loggest-viewer and should not be confused with the log viewer used by gaia-email-libs-and-more; that one is https://github.com/asutherland/arbitrarypushlog. You can tell the difference because loggest-viewer uses React. Woo! React!

I'm planning to get back to the email integration tests and the videos and stuff shortly, now that email's XOAuth2 foray is hopefully over.
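The keep-or-nuke policy described earlier (keep recordings for failures, keep successes only when RECORD=1, and only transcode what we keep) boils down to a tiny decision function. This is an illustrative sketch only, not code from the harness:

```javascript
'use strict';

// Sketch of the artifact policy: failures always keep their video and
// logs; successes keep them only when RECORD=1 was set explicitly.
function shouldKeepRecording(testFailed, env) {
  return testFailed || env.RECORD === '1';
}

// The expensive raw-to-webm transcode only runs for recordings we keep;
// nuked runs skip it entirely, which is where the big time win is.
function shouldTranscode(testFailed, env) {
  return shouldKeepRecording(testFailed, env);
}

console.log(shouldKeepRecording(true, {}));               // failure: keep
console.log(shouldKeepRecording(false, { RECORD: '1' })); // success, RECORD=1: keep
console.log(shouldKeepRecording(false, {}));              // success: nuke, no transcode
```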