14.62% office loadtime (Linux) regression on Mon December 19 2022
Categories
(Firefox :: Messaging System, defect, P1)
Tracking
| Release | Tracking | Status |
|---|---|---|
| firefox-esr102 | --- | unaffected |
| firefox108 | --- | unaffected |
| firefox109 | --- | unaffected |
| firefox110 | --- | affected |
People
(Reporter: afinder, Assigned: dmosedale)
References
(Regression)
Details
(Keywords: perf, perf-alert, regression)
Perfherder has detected a browsertime performance regression from push cad1670a2076f68543bb876715548b2a653f8c16. Because you authored one of the patches included in that push, we need your help to address this regression.
Regressions:
| Ratio | Test | Platform | Options | Absolute values (old vs new) | Performance Profiles |
|---|---|---|---|---|---|
| 15% | office loadtime | linux1804-64-shippable-qr | cold fission webrender | 620.58 -> 711.33 | Before/After |
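For reference, the ratio matches the absolute values above: (711.33 - 620.58) / 620.58 ≈ 0.146, i.e. the roughly 14.6% loadtime increase reported in the bug summary.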
Details of the alert can be found in the alert summary, including links to graphs and comparisons for each of the affected tests. Please follow our guide to handling regression bugs and let us know your plans within 3 business days, or the offending patch(es) may be backed out in accordance with our regression policy.
If you need the profiling jobs, you can trigger them yourself from the Treeherder job view or ask a sheriff to do that for you.
For more information on performance sheriffing please see our FAQ.
Comment 1•2 years ago
Set release status flags based on info from the regressing bug 1805470
Assignee
Comment 2•2 years ago
Looking at https://treeherder.mozilla.org/perfherder/graphs?highlightAlerts=1&highlightChangelogData=1&highlightCommonAlerts=0&series=autoland,3776855,1,13&timerange=1209600&zoom=1671318116219,1671557825128,465.1410327384501,886.6295167318092 there is certainly a cluster that starts on the commit in question (19:21) and goes for a while. That said, looking just a bit earlier that day (15:40 - 17:xx) there is a whole bunch of clustering there too. This makes me wonder if this isn't really just network traffic or latency that lasts for a certain period of time. Looking at the bigger picture, it's not clear to me that there's a trend outside of the normal variance (which is a pretty wide band) that is ongoing.
Part of this is that it's not at all obvious to me how the code changes in the commit in question could have made a difference like this.
There's a bunch more debugging one could do, but I sort of suspect that the simplest diagnostic next steps would be simply to back it out, see if the timing gets better, and if not, re-land it and see what happens then...
What do you think?
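For reference, a minimal sketch (not taken from this bug; the sample values are placeholders, not actual Perfherder exports) of how one could check whether the before/after replicates differ beyond the usual variance band, e.g. with a Mann-Whitney U test over the per-run loadtime values:

```python
# Hypothetical sketch: compare per-run "office loadtime" replicates from before
# and after the suspect push, to see whether the shift exceeds run-to-run noise.
# The values below are placeholders, NOT data pulled from Perfherder.
import numpy as np
from scipy.stats import mannwhitneyu

before = np.array([612.0, 598.4, 631.2, 620.9, 607.5])  # runs prior to cad1670a2076
after = np.array([705.1, 689.8, 718.6, 711.3, 724.0])   # runs after the push

stat, p_value = mannwhitneyu(after, before, alternative="greater")
print(f"median before: {np.median(before):.1f} ms")
print(f"median after:  {np.median(after):.1f} ms")
print(f"Mann-Whitney U p-value (after > before): {p_value:.4f}")
# A small p-value suggests the slowdown is unlikely to be just noise;
# a large one is consistent with the "wide variance band" interpretation above.
```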
Reporter
Comment 3•2 years ago
(In reply to Dan Mosedale (:dmosedale, :dmose) from comment #2)
> Looking at https://treeherder.mozilla.org/perfherder/graphs?highlightAlerts=1&highlightChangelogData=1&highlightCommonAlerts=0&series=autoland,3776855,1,13&timerange=1209600&zoom=1671318116219,1671557825128,465.1410327384501,886.6295167318092 there is certainly a cluster that starts on the commit in question (19:21) and goes for a while. That said, looking just a bit earlier that day (15:40 - 17:xx) there is a whole bunch of clustering there too. This makes me wonder if this isn't really just network traffic or latency that lasts for a certain period of time. Looking at the bigger picture, it's not clear to me that there's a trend outside of the normal variance (which is a pretty wide band) that is ongoing.
> Part of this is that it's not at all obvious to me how the code changes in the commit in question could have made a difference like this.
> There's a bunch more debugging one could do, but I sort of suspect that the simplest diagnostic next steps would be simply to back it out, see if the timing gets better, and if not, re-land it and see what happens then...
> What do you think?
Retriggered on the following interval (dd4482632694 - cad1670a2076) to see if there are other noticeable changes in the trend, although it might just be noise in the graph. If that's the case, we can close the bug and the alert as won't fix and invalid, respectively.
Reporter
Comment 4•2 years ago
Closing the alert and the defect, since the graph is showing a noisy pattern after backfilling.