Perma Windows 10 AArch TEST-UNEXPECTED-TIMEOUT | automation.py | application timed out after 370 seconds with no output on attempt to run dom\media\mediasession\test\mochitest.ini
Categories
(Core :: Audio/Video, defect, P3)
Tracking
Tracking | Status
---|---
firefox-esr68 | unaffected
firefox76 | unaffected
firefox77 | unaffected
firefox78 | fixed
People
(Reporter: aryx, Assigned: away)
References
(Regression)
Details
(Keywords: intermittent-failure, regression)
EDIT: Also applies to normal central tasks.
Today, 4/4 runs failed to run the mochitest media tasks on Windows 10 AArch opt for the central-as-beta simulations:
central-as-early-beta simulation
central-as-late-beta simulation
Log example: https://treeherder.mozilla.org/logviewer.html#/jobs?job_id=302453360&repo=try&lineNumber=1707
[task 2020-05-15T13:26:29.403Z] 13:26:29 INFO - TEST-SKIP | dom/media/webspeech/recognition/test/test_success_without_recognition_service.html | took 0ms
[task 2020-05-15T13:26:29.403Z] 13:26:29 INFO - Running manifest: dom\media\mediasession\test\mochitest.ini
[task 2020-05-15T13:26:30.054Z] 13:26:30 INFO - C:\tasks\task_1589474106\build\tests\bin\pk12util.exe: PKCS12 IMPORT SUCCESSFUL
[task 2020-05-15T13:26:30.285Z] 13:26:30 INFO - Increasing default timeout to 240.0 seconds
...
[task 2020-05-15T13:26:36.762Z] 13:26:36 INFO - GECKO(12992) | 1589549196758 Marionette DEBUG 3 -> [0,8,"WebDriver:DeleteSession",{}]
[task 2020-05-15T13:26:36.767Z] 13:26:36 INFO - runtests.py | Waiting for browser...
[task 2020-05-15T13:26:36.767Z] 13:26:36 INFO - GECKO(12992) | 1589549196760 Marionette DEBUG 3 <- [1,8,null,{"value":null}]
[task 2020-05-15T13:26:36.767Z] 13:26:36 INFO - GECKO(12992) | 1589549196763 Marionette DEBUG Closed connection 3
[task 2020-05-15T13:32:46.773Z] 13:32:46 INFO - Buffered messages finished
[task 2020-05-15T13:32:46.774Z] 13:32:46 ERROR - TEST-UNEXPECTED-TIMEOUT | automation.py | application timed out after 370 seconds with no output
[task 2020-05-15T13:32:46.776Z] 13:32:46 ERROR - Force-terminating active process(es).
For yesterday's successful run, this is followed by:
[task 2020-05-14T14:57:58.563Z] 14:57:58 INFO - SimpleTest START
[task 2020-05-14T14:57:58.568Z] 14:57:58 INFO - TEST-START | dom/media/mediasession/test/test_active_mediasession_within_page.html
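The failure mode in the log above is the harness's no-output watchdog: automation.py force-terminates the browser when no new output arrives within the timeout window (here 370 seconds), which is separate from the per-test timeout ("Increasing default timeout to 240.0 seconds"). A minimal sketch of such a watchdog, using a hypothetical helper name and not the actual automation.py code:

```python
import subprocess
import threading
import time

def run_with_output_timeout(cmd, output_timeout=370.0):
    """Run cmd and kill it if it emits no output for output_timeout seconds.

    Hypothetical helper sketching the harness's no-output watchdog;
    returns (captured_lines, timed_out).
    """
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True)
    lines = []
    last_output = [time.monotonic()]  # list so the reader thread can update it

    def reader():
        for line in proc.stdout:
            last_output[0] = time.monotonic()
            lines.append(line)

    t = threading.Thread(target=reader, daemon=True)
    t.start()

    timed_out = False
    while proc.poll() is None:
        if time.monotonic() - last_output[0] > output_timeout:
            timed_out = True  # "TEST-UNEXPECTED-TIMEOUT ... with no output"
            proc.kill()       # "Force-terminating active process(es)."
            break
        time.sleep(0.1)
    t.join(timeout=5)
    return lines, timed_out
```

In the failing runs above, the browser process stays alive but produces nothing after Marionette closes its connection, so it is this no-output path, not the per-test timeout, that fires.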
Reporter
Comment 1•4 years ago
Pushlog between yesterday's and today's simulation is https://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=8af03d77567d3104c00efc5e6161e4910d835504&tochange=5a0a960a8d555795d4d1db432090fec007850716
Alastor, are you aware of a change that causes these media tests (for mediasession) to fail on Windows 10 AArch?
Comment 2•4 years ago
In [1], I saw multiple timeouts when running mochitest.ini for different folders, so they are not related to a specific test file. I guess it's probably a testing framework issue, but I have no idea how to identify that.
[1] https://firefoxci.taskcluster-artifacts.net/DjEwJJfVQK24gQfpm41QMw/0/public/logs/live_backing.log
Bryce, do you have any thought about this issue?
Thank you.
Comment 3•4 years ago
I wonder if it's related to bug 1557741. It's in the range of the simulation, and it seems related to the testing framework.
Chris, is it possible that disabling geckodriver when automation is false affects the process of running mochitest.ini?
Thank you.
Comment 4•4 years ago
(In reply to Alastor Wu [:alwu] from comment #3)
> I wonder if it's related to bug 1557741. It's in the range of the simulation, and it seems related to the testing framework.
> Chris, is it possible that disabling geckodriver when automation is false affects the process of running mochitest.ini?
> Thank you.
Hmmm, I don't see how. I didn't think geckodriver was used during mochitests, and the patch to disable building geckodriver hasn't landed yet. I also don't see any references to geckodriver in the test log.
Reporter
Comment 5•4 years ago
Sorry, this is actually a regression on central from the upgrade to clang10 in bug 1616692.
Updated•4 years ago
Reporter
Updated•4 years ago
Comment 6•4 years ago
Was this test ever intermittent in this way in the past? In other words, do we believe that bug 1616692 turned an existing intermittent into a perma fail? Or is this a brand new perma failure?
Reporter
Comment 7•4 years ago
The current type of failure is new. At the end of March and early April, it looked like this.
In the short term, this was addressed by backing out bug 1616692. The root cause is being investigated in bug 1639318.
Comment hidden (Intermittent Failures Robot)
Updated•4 years ago
Assignee
Comment 10•4 years ago
Resolving fixed based on comment 8.
Updated•4 years ago
Comment hidden (Intermittent Failures Robot)