Bug 1574449 (Closed) — Opened 2 years ago, Closed 2 years ago

3.27 - 5.73% tp5o responsiveness / tp5o_webext responsiveness (linux64-shippable) regression on push d685ba705bc2d2560308f9aa8ef272d03c602fad (Thu August 8 2019)

Categories

(Core :: JavaScript: GC, defect, P3)


Tracking

RESOLVED FIXED
mozilla70
Tracking Status
firefox-esr60 --- unaffected
firefox-esr68 --- unaffected
firefox68 --- unaffected
firefox69 --- unaffected
firefox70 --- fixed

People

(Reporter: marauder, Assigned: allstars.chh)

References

(Regression)

Details

(Keywords: perf, regression, talos-regression)

Attachments

(1 file)

Talos has detected a Firefox performance regression from push:

https://hg.mozilla.org/integration/autoland/pushloghtml?changeset=d685ba705bc2d2560308f9aa8ef272d03c602fad

Since you authored one of the patches included in that push, we need your help to address this regression.

Regressions:

6% tp5o_webext responsiveness linux64-shippable opt e10s stylo 1.68 -> 1.77
6% tp5o responsiveness linux64-shippable opt e10s stylo 1.39 -> 1.47
3% tp5o_webext responsiveness linux64-shippable opt e10s stylo 1.67 -> 1.73

You can find links to graphs and comparison views for each of the above tests at: https://treeherder.mozilla.org/perf.html#/alerts?id=22492

On the page above you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a treeherder page showing the Talos jobs in a pushlog format.

To learn more about the regressing test(s), please see: https://wiki.mozilla.org/TestEngineering/Performance/Talos

For information on reproducing and debugging the regression, either on try or locally, see: https://wiki.mozilla.org/TestEngineering/Performance/Talos/Running

*** Please let us know your plans within 3 business days, or the offending patch(es) will be backed out! ***

Our wiki page outlines the common responses and expectations: https://wiki.mozilla.org/TestEngineering/Performance/Talos/RegressionBugsHandling

Component: Performance → JavaScript: GC
Flags: needinfo?(jcoppeard)
Flags: needinfo?(allstars.chh)
Product: Testing → Core
Assignee: nobody → allstars.chh
Status: NEW → ASSIGNED
Flags: needinfo?(allstars.chh)

My guess is that this is caused by the change to incremental sweeping of the atoms table (i.e. AtomsTable::sweepIncrementally). I didn't expect this to make such a difference, but the atoms table can get very large, so in hindsight it's not that surprising.

We should try reverting this part of the patch; hopefully that will fix it.

If it does, we can look at optimising this part further: we should be able to check whether the atom is permanent and, if not, call IsAboutToBeFinalizedDuringSweep, which should be more efficient.
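The proposed optimisation can be sketched roughly as below. This is a minimal, self-contained illustration, not SpiderMonkey code: `Atom`, its `permanent`/`marked` flags, and the table layout are hypothetical stand-ins (a real sweep would consult GC mark bits via IsAboutToBeFinalizedDuringSweep, not a plain boolean). The point it demonstrates is the cheap permanent-atom early-out plus a per-slice budget so sweeping stays incremental.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for an atoms-table entry.
struct Atom {
    bool permanent;  // permanent atoms are never swept, so skip the liveness check
    bool marked;     // stand-in for "not about to be finalized"
};

// Sweep up to `budget` entries per slice, resuming from `cursor`.
// Returns true once the whole table has been swept.
bool sweepIncrementally(std::vector<Atom*>& table, size_t& cursor, size_t budget) {
    size_t steps = 0;
    while (cursor < table.size() && steps < budget) {
        Atom* a = table[cursor];
        // Cheap early-out for permanent atoms; only non-permanent,
        // unmarked atoms are finalized and removed.
        if (a && !a->permanent && !a->marked) {
            delete a;
            table[cursor] = nullptr;
        }
        ++cursor;
        ++steps;
    }
    return cursor >= table.size();
}
```

The budget is what keeps each GC slice short: a large table is swept across several slices rather than in one long pause, which is the responsiveness property the original patch was after.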

Flags: needinfo?(jcoppeard)
Attachment #9086581 - Attachment description: Bug 1574449 - revert sweekAtomIncrementally. → Bug 1574449 - revert sweepAtomIncrementally.
Pushed by allstars.chh@gmail.com:
https://hg.mozilla.org/integration/autoland/rev/79ab941f4f41
revert sweepAtomIncrementally. r=jonco
Status: ASSIGNED → RESOLVED
Closed: 2 years ago
Resolution: --- → FIXED

== Change summary for alert #22602 (as of Wed, 21 Aug 2019 06:09:46 GMT) ==

Improvements:

6% tp5o responsiveness linux64-shippable opt e10s stylo 1.50 -> 1.41
5% tabswitch windows10-64-shippable-qr opt e10s stylo 7.75 -> 7.32

For up to date results, see: https://treeherder.mozilla.org/perf.html#/alerts?id=22602

Duplicate of this bug: 1574981