0.02% Base Content JS (Linux) regression on Fri August 9 2024
Categories
(Remote Protocol :: WebDriver BiDi, defect)
Tracking
| | Tracking | Status |
| --- | --- | --- |
| firefox-esr115 | --- | unaffected |
| firefox-esr128 | --- | unaffected |
| firefox129 | --- | unaffected |
| firefox130 | --- | unaffected |
| firefox131 | --- | affected |
People
(Reporter: intermittent-bug-filer, Unassigned)
References
(Regression)
Details
(Keywords: perf, perf-alert, regression)
Perfherder has detected an awsy performance regression from push fbb6ff60d0addfe0f1c5e713b457118e3ab95c07. As the author of one of the patches included in that push, we need your help to address this regression.
Improvements:
Ratio | Test | Platform | Options | Absolute values (old vs new) |
---|---|---|---|---|
0.02% | Base Content JS | linux1804-64-shippable-qr | fission | 1,508,323.33 -> 1,507,984.00 |
Details of the alert can be found in the alert summary, including links to graphs and comparisons for each of the affected tests. Please follow our guide to handling regression bugs and let us know your plans within 3 business days, or the patch(es) may be backed out in accordance with our regression policy.
If you need the profiling jobs, you can trigger them yourself from the Treeherder job view or ask a sheriff to do that for you.
You can run these tests on try with `./mach try perf --alert 1730`
For more information on performance sheriffing please see our FAQ.
Comment 1•2 months ago
Just to clarify: comment 0 of this bug incorrectly states that this is an improvement, when it's actually a 0.02% regression.
Comment 2•2 months ago
Set release status flags based on info from the regressing bug 1911382
Comment 3•2 months ago
Interesting, the only thing that changed is that we now access a getter. Looking at it more closely, I think this getter can be optimized a bit, because it should normally trigger the same code paths as before, but right now it uses slightly different APIs for no reason.
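For illustration only, here is a minimal sketch of the kind of getter pattern being discussed: a lazily computed, cached property, so that reading it follows the same code path as before and does no extra work on later reads. The actual module, getter, and APIs involved are not shown in this bug, so every name below is hypothetical.

```js
// Hypothetical sketch only; "lazy.thing" and buildExpensiveThing() are
// placeholders, not real Firefox/WebDriver BiDi APIs.
const lazy = {};

function buildExpensiveThing() {
  // Stand-in for whatever the real getter computes.
  return { ready: true };
}

Object.defineProperty(lazy, "thing", {
  configurable: true,
  get() {
    // Compute once, then replace the accessor with a plain data property so
    // subsequent reads are a simple property lookup rather than re-entering
    // the getter through a different code path.
    const value = buildExpensiveThing();
    Object.defineProperty(lazy, "thing", { value, configurable: true });
    return value;
  },
});
```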
Comment 4•2 months ago
Let's see if improving the getter helps here: https://perf.compare/compare-results?baseRev=bcc0a029287b0f946d38da235e8e646f94240aee&baseRepo=try&newRev=5e2c4667d627a48c30abe9f7529ac3264b5d06d5&newRepo=try&framework=4
Comment 5•2 months ago
That seems to help slightly, but it's hard to say with such a small regression...
I will try to back out my patch to see if that makes a clearer impact.
Same compare with a backout: https://perf.compare/compare-results?baseRev=bcc0a029287b0f946d38da235e8e646f94240aee&baseRepo=try&newRev=d18b981727bf4068875511f39e4d994c375487f3&newRepo=try&framework=4
Comment 6•2 months ago
Is it really from your patch? I see that the numbers went down again on August 19th. Also, regarding the regressor, I see some other changesets that might have caused it.
Comment 7•2 months ago
(In reply to Henrik Skupin [:whimboo][⌚️UTC+2] from comment #6)
> Is it really from your patch? I see that the numbers went down again on August 19th. Also, regarding the regressor, I see some other changesets that might have caused it.
I was confused too, but this bug is for a much older alert from Aug 9th: https://treeherder.mozilla.org/perfherder/graphs?highlightAlerts=1&highlightChangelogData=1&highlightCommonAlerts=0&replicates=0&series=autoland,3806445,1,4&timerange=1209600&zoom=1723216503396,1723217841440,1507355.649088604,1511154.2185003683
The difference is very small (especially compared to that bigger recent regression), but it still made me look at our getter a bit closer. Hard to tell if my patches actually fix anything though, given the delta...
Comment 8•2 months ago
Hm, ok. There is quite some noise as well. After talking to :sparky, it's most likely not worth further investigation, given that this is a difference of just ~250 bytes, which could just come from file size changes of the JS module that we are loading here.
Comment 9•2 months ago
Sgtm, I can still push my cleanup patch; I think it's worth doing anyway.
Alex: I guess we can accept the regression here? If it's just a module size bump, we can't do much about it.
Comment 10•2 months ago
It looks like this alert is now classified as only an improvement, with no regressions (which is misleading).
A 0.02% change is also below the 0.25% alert threshold set up for this test. If I take the largest and the smallest values in those ranges, I'm able to hit the 0.25% regression threshold; if I take the smallest values, I see the 0.02% value, so it's likely that noise triggered the alert.
It seems like we did catch a very small regression (~250 bytes or smaller) from this patch, but it isn't concerning, and an alert should not have been produced for a 0.02% change either.
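To make the threshold comparison concrete, here is a quick back-of-the-envelope check using the absolute values from the alert table in comment 0; this is plain arithmetic, not Perfherder output, and it only looks at the magnitude of the change.

```js
// Values copied from the alert table in comment 0.
const oldBytes = 1508323.33;
const newBytes = 1507984.0;

const deltaBytes = Math.abs(oldBytes - newBytes);    // ≈ 339 bytes
const deltaPercent = (deltaBytes / oldBytes) * 100;  // ≈ 0.022%, reported as 0.02%

// For comparison, the 0.25% alert threshold corresponds to a much larger change:
const thresholdBytes = oldBytes * 0.0025;            // ≈ 3,770 bytes

console.log(deltaBytes.toFixed(2), deltaPercent.toFixed(3), thresholdBytes.toFixed(0));
```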
:afinder, I agree with :jdescottes on accepting the regression, and closing the bug as WONTFIX.
Comment 11•2 months ago
Agreed with sparky and jdescottes; we can close it as WONTFIX. I do think we need to investigate the "false improvement" issue here, though. I'll open a bug for that.
Comment 12•2 months ago
Thank you all for the feedback. Closing the bug.