Closed Bug 1351615 Opened 7 years ago Closed 7 years ago

2.51 - 3.53% sessionrestore_no_auto_restore / tsvgr_opacity (linux64) regression on push e8a7790bd2685ca77713d2bf8b165e7e162a443d (Mon Mar 27 2017)

Categories

(Firefox for Android Graveyard :: Audio/Video, defect)

Version: 55 Branch
Hardware: x86_64
OS: Linux
Priority: Not set
Severity: normal

Tracking

Status: RESOLVED WONTFIX
Tracking Status: firefox55 --- affected

People

(Reporter: igoldan, Assigned: alwu)

References

Details

(Keywords: perf, regression, talos-regression)

Talos has detected a Firefox performance regression from push e8a7790bd2685ca77713d2bf8b165e7e162a443d. Since you authored one of the patches included in that push, we need your help to address this regression.

Regressions:

  3%  tsvgr_opacity summary linux64 pgo      368.79 -> 378.04


You can find links to graphs and comparison views for each of the above tests at: https://treeherder.mozilla.org/perf.html#/alerts?id=5683

On the page above you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a treeherder page showing the Talos jobs in a pushlog format.

To learn more about the regressing test(s), please see: https://wiki.mozilla.org/Buildbot/Talos/Tests

For information on reproducing and debugging the regression, either on try or locally, see: https://wiki.mozilla.org/Buildbot/Talos/Running
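
As a rough local-reproduction sketch (this assumes the mach interface of that era rather than a confirmed recipe; check ./mach talos-test --help for the exact flags), the affected suite can be run against a local build with something like:

  ./mach build
  ./mach talos-test --suite svgr

The svgr suite name matches the -t svgr used in the try syntax later in this bug.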

*** Please let us know your plans within 3 business days, or the offending patch(es) will be backed out! ***

Our wiki page outlines the common responses and expectations: https://wiki.mozilla.org/Buildbot/Talos/RegressionBugsHandling
Component: Untriaged → Audio/Video
Product: Firefox → Firefox for Android
Blocks: 1347648
Blocks: 1346783
:alwu, is there any update here? Should we verify a win by backing out this patch on the tip of trunk via a try push?
Flags: needinfo?(alwu)
Hi Joel,
I have no idea why this would cause a performance regression. How can I verify whether performance improves if I push a backout patch to the try server?
And how do you usually handle this kind of regression?
Thanks!
Flags: needinfo?(jmaher)
Thanks, :alwu. If you push to try, I recommend the steps below (a fuller sketch follows after the explanation):
# update to recent code
# ./mach try -b o -p linux64 -u none -t svgr -rebuild 5
# <backout code>
# ./mach try -b o -p linux64 -u none -t svgr -rebuild 5

When the results are finished, you will see the 's' (Talos) jobs in Treeherder. Click on the second push (the backout); when looking at an 's' job, a Performance tab will show up by default. There you will see a 'compare this to...' link; click that and fill in the branch/revision for the baseline (in this case, your first try push).

Once there, you can see the changes between the two pushes; anything with high confidence (shown in red) is worth looking at.
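
As a concrete sketch of that two-push workflow (the hg steps are a sketch assuming a typical mercurial checkout; the try syntax is copied from the steps above, and the changeset hash is the one from this bug):

  # 1) baseline: push the current tip, rebuilding the svgr suite 5 times
  hg pull -u
  ./mach try -b o -p linux64 -u none -t svgr -rebuild 5

  # 2) back out the suspected changeset locally (hg backout commits the backout)
  hg backout -r e8a7790bd268 -m "Backed out changeset e8a7790bd268 for Talos comparison"

  # 3) push the backout with the same try syntax
  ./mach try -b o -p linux64 -u none -t svgr -rebuild 5

Then, in the Perfherder compare view, set the baseline to the first try push and the new revision to the backout push; the delta on tsvgr_opacity shows whether the backout actually wins.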

Looking back on this bug and examining the results, I would recommend closing this as WONTFIX for a few reasons:
1) This is PGO-only, a single test, and a small regression. If it showed up on opt builds as well, it would be easier to debug; PGO-only regressions are in many cases artifacts of the PGO process itself, unrelated to your code.
2) Looking at the graph, the value did go up, but it has gone down and up a few times; this could be related to the PGO process and the specific builds generated. The movement stays within the range of noise, just at the higher/lower bounds of values we had seen over a larger time window.

If there are reasons not to mark this as WONTFIX, please reopen the bug! I apologize for the randomization; I should have looked in more detail before asking for an update.
Status: NEW → RESOLVED
Closed: 7 years ago
Flags: needinfo?(jmaher)
Flags: needinfo?(alwu)
Resolution: --- → WONTFIX
Product: Firefox for Android → Firefox for Android Graveyard