2.15 - 4.02% raptor-tp6-amazon-firefox-cold / raptor-tp6-amazon-firefox-cold fcp (linux64-shippable-qr) regression on push 6b672d4b8a21db6714825415b0c4614aceb85892 (Thu July 30 2020)
Categories
(Core :: Widget, defect, P2)
Tracking
Release | Tracking | Status
---|---|---
firefox-esr68 | --- | unaffected
firefox-esr78 | --- | unaffected
firefox79 | --- | unaffected
firefox80 | --- | unaffected
firefox81 | --- | wontfix
firefox82 | --- | wontfix
People
(Reporter: Bebe, Assigned: spohl)
References
(Regression)
Details
(Keywords: perf, perf-alert, regression)
Attachments
(1 obsolete file)
Perfherder has detected a raptor performance regression from push 6b672d4b8a21db6714825415b0c4614aceb85892. Since you authored one of the patches included in that push, we need your help to address this regression.
Regressions:
4% raptor-tp6-amazon-firefox-cold fcp linux64-shippable-qr opt webrender 818.46 -> 851.33
2% raptor-tp6-amazon-firefox-cold linux64-shippable-qr opt webrender 921.77 -> 941.60
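For reference, the percentages in the alert summary (and in the bug title) are plain relative deltas over the before/after values; a quick way to recompute them:

```python
# Recompute the alert percentages from the before/after values above.
def regression_pct(before: float, after: float) -> float:
    """Percent change relative to the baseline value."""
    return (after - before) / before * 100.0

print(round(regression_pct(818.46, 851.33), 2))  # fcp subtest: 4.02
print(round(regression_pct(921.77, 941.60), 2))  # overall score: 2.15
```

These match the "2.15 - 4.02%" range in the bug title.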
Details of the alert can be found in the alert summary, including links to graphs and comparisons for each of the affected tests. Please follow our guide to handling regression bugs and let us know your plans within 3 business days, or the offending patch(es) will be backed out in accordance with our regression policy.
For more information on performance sheriffing please see our FAQ.
Reporter
Updated•5 years ago
Reporter
Updated•5 years ago
Comment 1•5 years ago
Set release status flags based on info from the regressing bug 1640195
Assignee
Comment 2•5 years ago
Looking into this. From the documentation it sounds like up to 3% regression is acceptable? There is nothing in this patch that should have caused this regression, especially on Linux. Will try to isolate this with try runs.
Assignee
Comment 3•5 years ago
Assignee
Comment 4•5 years ago
I have pushed the attached patch to try to start isolating this regression, but I can't seem to find the proper performance test suite to run. I have also tried adding the appropriate jobs to the try run after launching it, but I don't believe I have found the correct test suites there either. Could you tell me how to properly start these performance runs so that I can compare against the baseline?
Try run: https://treeherder.mozilla.org/#/jobs?repo=try&revision=2401bc1d1ab6381d89c9156e88f3bb3800053a7e
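Performance jobs like these are normally selected at push time rather than added to an existing try push. A sketch of how that selection might look with try fuzzy (the query terms here are illustrative assumptions; running the command without `-q` opens an interactive picker where the real task names can be found):

```shell
# From a mozilla-central checkout: fuzzy-select the Raptor jobs for this
# platform. Query terms below are examples, not verified task names.
./mach try fuzzy -q "'linux64-shippable-qr 'raptor-tp6 'amazon"
```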
Assignee
Updated•5 years ago
Comment 5•5 years ago
If I look at the try push (https://treeherder.mozilla.org/#/jobs?repo=try&revision=2401bc1d1ab6381d89c9156e88f3bb3800053a7e&selectedTaskRun=fEJ53AYHRCGMxOFxTDM0Tw.0), I see for Linux x64 WebRender Shippable opt:
raptor-tp6-amazon-firefox-cold confidence opt webrender: 84
raptor-tp6-amazon-firefox-cold dcf opt webrender: 777
raptor-tp6-amazon-firefox-cold fcp opt webrender: 849
raptor-tp6-amazon-firefox-cold fnbpaint opt webrender: 789
raptor-tp6-amazon-firefox-cold loadtime opt webrender: 1460
raptor-tp6-amazon-firefox-cold not-replayed opt webrender: 379
raptor-tp6-amazon-firefox-cold opt webrender: 933.69
raptor-tp6-amazon-firefox-cold replayed opt webrender: 2045
But in Perfherder, raptor-tp6-amazon-firefox opt webrender linux64-shippable-qr claims "No results", and I don't understand why. It's also why the comparison won't work.
Comment 6•5 years ago
Looking at the alert graph, zooming in: https://treeherder.mozilla.org/perf.html#/graphs?highlightAlerts=1&series=autoland,2526144,1,10&timerange=1209600&zoom=1596091311716,1596152341847,742.7842265341017,895.544569056711
Right before this push, something caused the results to go all over the place. It's not clear that the identified push actually caused the regression.
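The concern above, that a shift this size may be indistinguishable from noise, can be made concrete. The following is a minimal illustrative noise gate, not Perfherder's actual detection logic (which analyzes windowed series with statistical tests):

```python
import statistics

def exceeds_noise(baseline, candidate, factor=2.0):
    """Return True if the shift in medians is larger than `factor`
    times the baseline's standard deviation. A crude noise gate,
    for illustration only."""
    delta = abs(statistics.median(candidate) - statistics.median(baseline))
    return delta > factor * statistics.stdev(baseline)

# A quiet baseline flags a ~4% shift...
print(exceeds_noise([818, 820, 817, 819, 821], [851, 849, 852, 850, 853]))  # True
# ...but a baseline swinging by ~10% swallows the same shift.
print(exceeds_noise([760, 890, 800, 870, 780], [851, 849, 852, 850, 853]))  # False
```

This is why a noisy series right before the push undermines confidence in the alert: the same absolute delta can fall well inside the noise band.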
Comment 7•5 years ago
Hi Jim, can you please triage this bug (or have it triaged)? Thank you!
Comment 8•5 years ago
Hi Steven, is that patch ready for review?
Updated•5 years ago
Assignee
Comment 9•5 years ago
No, it isn't. There are several open questions (see comment 4 to comment 6) and we are awaiting feedback from :Bebe.
Reporter
Comment 10•5 years ago
Sorry, I was on PTO.
If I understand correctly, you can find the results here:
And the subtest view here:
https://treeherder.mozilla.org/perf.html#/comparesubtest?originalProject=mozilla-central&newProject=try&newRevision=2401bc1d1ab6381d89c9156e88f3bb3800053a7e&originalSignature=2536477&newSignature=2541348&framework=10&selectedTimeRange=172800
Comment 11•5 years ago
The severity field is not set for this bug.
:jimm, could you have a look please?
For more information, please see the auto_nag documentation.
Assignee
Updated•5 years ago
Assignee
Updated•5 years ago
Assignee
Comment 12•5 years ago
Florin, more generally: could you take another look at the reported performance regression and confirm for us one more time that it was a legitimate regression, and that it was truly caused by 6b672d4b8a21db6714825415b0c4614aceb85892?
The more we've tried to confirm this on our end, the more we've lost confidence that this is a legitimate regression, or that it was caused by this change.
The try push linked in comment 10 seems to show an additional 0.88% performance regression for raptor-tp6-amazon-firefox-cold opt webrender fcp, which is also hard to believe.
Updated•5 years ago
Updated•5 years ago
Reporter
Comment 13•5 years ago
As the noise level is very high, the alert is close to the 2% minimum alerting threshold, and we can't prove 100% that this push is the culprit, I would mark this as wontfix.
Because of the high noise level, I can't say for sure, or back it up with data, that this commit or any other caused this regression.
Updated•5 years ago
Updated•5 years ago
Updated•5 years ago