Closed Bug 1572709 Opened 5 years ago Closed 5 years ago

3.86 - 13.21% tsvg_static (windows10-64-shippable) regression on push 5a8a2d0bc8590ded8b604b86bf1633def0669f6e (Wed August 7 2019)

Categories

(Core :: Layout: Block and Inline, defect)

Type: defect
Priority: Not set
Severity: normal

Tracking

Status: RESOLVED INVALID
Target Milestone: mozilla70

People

(Reporter: marauder, Unassigned)

References

(Regression)

Details

(Keywords: perf, regression, talos-regression)

Talos has detected a Firefox performance regression from push:

https://hg.mozilla.org/integration/autoland/pushloghtml?changeset=5a8a2d0bc8590ded8b604b86bf1633def0669f6e

As you are the author of one of the patches included in that push, we need your help to address this regression.

Regressions:

13% tsvg_static windows10-64-shippable opt e10s stylo 33.71 -> 38.16
4% tsvg_static windows10-64-shippable opt e10s stylo 35.09 -> 36.45
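
(For reference, the percentages are just the relative change between the before and after values. The list above shows rounded numbers, so the quick check below reproduces the 3.86 - 13.21% from the summary only approximately:)

    # Sanity check of the regression percentages from the rounded values above.
    for before, after in [(33.71, 38.16), (35.09, 36.45)]:
        print(f"{(after - before) / before * 100:.1f}%  ({before} -> {after})")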

You can find links to graphs and comparison views for each of the above tests at: https://treeherder.mozilla.org/perf.html#/alerts?id=22373

On the page above you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a treeherder page showing the Talos jobs in a pushlog format.

To learn more about the regressing test(s), please see: https://wiki.mozilla.org/TestEngineering/Performance/Talos

For information on reproducing and debugging the regression, either on try or locally, see: https://wiki.mozilla.org/TestEngineering/Performance/Talos/Running

*** Please let us know your plans within 3 business days, or the offending patch(es) will be backed out! ***

Our wiki page outlines the common responses and expectations: https://wiki.mozilla.org/TestEngineering/Performance/Talos/RegressionBugsHandling

Blocks: 1562138
Component: Performance → Layout: Block and Inline
Flags: needinfo?(svoisen)
Flags: needinfo?(charles.w.marlow)
Product: Testing → Core
Regressed by: 1411922
Target Milestone: --- → mozilla70
Version: Version 3 → Trunk

Looking at the graph, it's not clear to me whether there is a definite regression here, and certainly not clear that bug 1411922 is responsible.

Prior to 5a8a2d0bc8590ded8b604b86bf1633def0669f6e, the graph seems to have been somewhat bi-modal for a long time, with a low mode around 32 and a high mode around 38 (and occasional outliers much larger). It's true that around the time of this landing, it seems to have settled more often at the higher end of the range, though not exclusively; there are still some low values (e.g. 3290214a8054 shows 31.73).

Assuming there is in fact a persistent regression whereby the higher values are now significantly more frequent than before, the culprit could be any patch that landed around that time; it's not necessarily the first push of the current series of "high" values, as such values were already occurring on perhaps 50% or so of test runs.
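
One way to test that assumption would be to pull the per-run tsvg_static values from the Perfherder graph for a window of pushes before and after the landing and compare how often runs land in the high mode. A minimal sketch in Python (the sample values and the 35.0 threshold are made-up placeholders, not actual Perfherder data):

    # Compare how often runs land in the "high" mode (~38) before vs. after
    # the push. All values below are hypothetical placeholders.
    THRESHOLD = 35.0  # rough midpoint between the ~32 and ~38 modes

    before = [32.1, 37.9, 31.8, 38.2, 33.0, 38.0, 32.5, 37.7]  # hypothetical
    after = [38.1, 37.8, 32.9, 38.3, 37.6, 38.0, 33.1, 38.2]   # hypothetical

    def high_fraction(runs, threshold=THRESHOLD):
        """Fraction of runs that fall in the high mode."""
        return sum(value > threshold for value in runs) / len(runs)

    print(f"high-mode fraction before: {high_fraction(before):.2f}")
    print(f"high-mode fraction after:  {high_fraction(after):.2f}")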

I've retriggered some test jobs to try to get an idea of how consistent this is.

I retriggered a bunch of test runs on 5a8a2d0bc859, and also on an earlier push (b7f5d743d08b) and a later one (87890c29a8ea), and it looks to me like the spread of timings is pretty similar both before and after the bug 1411922 landing.
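
For what it's worth, one way to check that impression more formally would be to run a nonparametric test on the retriggered timings from the earlier and later pushes; Mann-Whitney U makes no normality assumption, which matters for bi-modal data like this. A sketch, with placeholder values standing in for the real per-run results:

    # Test whether the timing distribution actually shifted between an earlier
    # push and the suspect push. The samples are hypothetical; real values
    # would come from the retriggered Talos jobs on b7f5d743d08b,
    # 5a8a2d0bc859, and 87890c29a8ea.
    from scipy.stats import mannwhitneyu

    earlier_push = [32.0, 37.8, 31.9, 38.1, 32.6, 37.5, 33.2, 38.0]  # placeholder
    suspect_push = [37.9, 32.4, 38.2, 37.6, 33.0, 38.1, 32.8, 37.7]  # placeholder

    # A large p-value means the two samples are consistent with coming from
    # the same distribution, i.e. no evidence of a shift.
    stat, p_value = mannwhitneyu(earlier_push, suspect_push, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p_value:.3f}")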

Flags: needinfo?(charles.w.marlow)
No longer blocks: 1572800

Looking at the graph for the past 60 days, it looks basically flat (there's a constant solid band of measurements in the 32-38 range, with sporadic much-higher measurements sprinkled throughout).

August 8th seems to have received several of the sporadic higher-than-usual measurements on a single day, but that doesn't appear to be a persistent issue -- just bad luck on that one day.

So, I think this alert was just an artifact from randomness in the data, and there's nothing to be concerned about or act on here.

Status: NEW → RESOLVED
Closed: 5 years ago
Resolution: --- → INVALID
Flags: needinfo?(svoisen)
Has Regression Range: --- → yes