Closed Bug 1812303 Opened 1 year ago Closed 1 year ago

90.86 - 17.66% sina LastVisualChange / sina loadtime + 7 more (Android) regression on Fri January 20 2023


(Core :: DOM: Networking, defect, P2)

Firefox 111



Tracking Status
firefox-esr102 --- unaffected
firefox109 --- unaffected
firefox110 --- unaffected
firefox111 - disabled
firefox112 --- affected


(Reporter: alexandrui, Assigned: edenchuang)




(Keywords: perf, perf-alert, regression, Whiteboard: [necko-triaged])

Perfherder has detected a browsertime performance regression from push 0c260a33541a67b1c4f0ab83a1d1380f466fcf9b. As the author of one of the patches included in that push, we need your help to address this regression.


Regressions:

Ratio Test Platform Options Absolute values (old vs new)
91% sina LastVisualChange android-hw-a51-11-0-aarch64-shippable-qr cold webrender 3,254.25 -> 6,210.92
90% sina PerceptualSpeedIndex android-hw-a51-11-0-aarch64-shippable-qr cold webrender 2,735.58 -> 5,207.17
89% sina FirstVisualChange android-hw-a51-11-0-aarch64-shippable-qr cold webrender 2,256.56 -> 4,256.92
83% sina fcp android-hw-a51-11-0-aarch64-shippable-qr cold webrender 2,295.94 -> 4,211.29
75% sina SpeedIndex android-hw-a51-11-0-aarch64-shippable-qr cold webrender 3,045.00 -> 5,330.42
71% sina ContentfulSpeedIndex android-hw-a51-11-0-aarch64-shippable-qr cold webrender 2,816.50 -> 4,829.25
54% sina loadtime android-hw-a51-11-0-aarch64-shippable-qr warm webrender 1,140.41 -> 1,754.54
20% sina ContentfulSpeedIndex android-hw-a51-11-0-aarch64-shippable-qr warm webrender 287.92 -> 346.17
18% sina loadtime android-hw-a51-11-0-aarch64-shippable-qr cold webrender 7,982.31 -> 9,391.75


Improvements:

Ratio Test Platform Options Absolute values (old vs new)
44% sina FirstVisualChange android-hw-a51-11-0-aarch64-shippable-qr cold webrender 4,346.08 -> 2,413.92
34% sina loadtime android-hw-a51-11-0-aarch64-shippable-qr warm webrender 1,738.40 -> 1,153.69
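For reference, the Ratio column in the tables above is just the relative change between the old and new median values. A minimal sketch of that computation, using the first regression row and the first improvement row as inputs (function name is illustrative, not part of Perfherder):

```python
def pct_change(old: float, new: float) -> float:
    """Relative change from old to new, as a percentage.

    Positive values mean the metric got larger (a regression for
    timing metrics); negative values mean it got smaller.
    """
    return (new - old) / old * 100.0

# sina LastVisualChange, cold: 3,254.25 -> 6,210.92 (reported as ~91%)
print(round(pct_change(3254.25, 6210.92), 2))

# sina FirstVisualChange, cold: 4,346.08 -> 2,413.92 (reported as ~44% improvement)
print(round(pct_change(4346.08, 2413.92), 2))
```

Note that the 90.86 figure in the bug title is exactly this computation on the LastVisualChange row before rounding to the table's 91%.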

Details of the alert can be found in the alert summary, including links to graphs and comparisons for each of the affected tests. Please follow our guide to handling regression bugs and let us know your plans within 3 business days, or the offending patch(es) may be backed out in accordance with our regression policy.

If you need the profiling jobs, you can trigger them yourself from the Treeherder job view or ask a sheriff to do that for you.

For more information on performance sheriffing please see our FAQ.

Flags: needinfo?(echuang)

Set release status flags based on info from the regressing bug 1351231

The bug is marked as tracked for firefox111 (nightly). We have limited time to fix this: the soft freeze is in 14 days. However, the bug still isn't assigned.

:ghess, could you please find an assignee for this tracked bug? Given that it is a regression and we know the cause, we could also simply back out the regressor. If you disagree with the tracking decision, please talk with the release managers.

For more information, please visit auto_nag documentation.

Flags: needinfo?(ghess)

PFetch (dom.workers.pFetch.enabled) is currently enabled in Nightly builds only.

According to the bug description, it seems that PFetch causes some performance issues on the Android platform.
I will take a look in the next few days to figure out the root cause.

Since PFetch will not be turned on in Beta or Release, we still want to keep it enabled in Nightly.
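For anyone reproducing this locally, the feature is controlled by the pref named in this comment. A sketch of the corresponding user.js line (the pref name is taken from this bug; the `true` default shown is an assumption for Nightly at the time):

```js
// Toggle PFetch locally. Pref name is from this bug; set to false to
// disable PFetch when running A/B comparison builds.
user_pref("dom.workers.pFetch.enabled", true);
```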

Assignee: nobody → echuang
Flags: needinfo?(ghess)
Flags: needinfo?(echuang)
Severity: -- → S3
Priority: -- → P2
Whiteboard: [necko-triaged]

I am wondering if this performance regression is really related to bug 1351231.

The new time values with the patches in the bug description are almost the same as the results from before Jan 19.

The interesting part is that most of the regression test cases only got better results on Jan 19 and Jan 20.

To confirm whether this is really related to bug 1351231, I pushed a try run that force-disables PFetch to see if we get the same results.

(In reply to Eden Chuang[:edenchuang] from comment #4)

To confirm whether this is really related to bug 1351231, I pushed a try run that force-disables PFetch to see if we get the same results.

One piece of guidance I've received is that comparing metrics from a try branch push against m-c can be deceptive. It's better to use mach try perf, which launches try performance runs both with and without your try stack and should be more representative.
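For context, a typical invocation looks roughly like the following, run from a mozilla-central checkout with the patches under test applied. Treat the flag as an assumption from memory, not exact syntax:

```shell
# Sketch: `mach try perf` pushes two try runs (base vs. your stack) and
# links a Perfherder comparison when they finish. The --android flag is
# an assumption here; check `./mach try perf --help` for the exact
# options available on your tree.
./mach try perf --android
```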

:edenchuang this is the final week of 111 nightly; 111 goes to beta next week.
Bug 1351231 only impacts nightly builds, but there was some doubt in comment 4 that it was the regressor.
Any updates on the investigation?

Flags: needinfo?(echuang)

Hi Donal,

After running some perf try runs with and without PFetch enabled (bug 1351231), the results show no big difference (mostly less than 5%).
So I think this regression might not be caused by bug 1351231.
To be sure, I am running perf try runs again, this time not only disabling PFetch but also removing the patches of bug 1351231 entirely.
I will update the bug with the results later.

Flags: needinfo?(echuang) → needinfo?(dmeehan)
Flags: needinfo?(dmeehan)

Set release status flags based on info from the regressing bug 1351231

:edenchuang 111 is now in beta; any updates on comment 7?
Since bug 1351231 only impacts nightly, I would set firefox111 to disabled and remove tracking, though I would need to confirm your investigation first.

After investigating many try perf runs, I think this is not a performance regression caused by PFetch.

The following is one of the try perf runs comparing PFetch enabled (old) and PFetch disabled (new).

Focusing on the regression case, sina, there is no significant difference between enabled and disabled.
In the cases that do show significant differences, I consistently found that enabling (old) gives better results than disabling (new).

So I think we can set firefox111 to disabled and remove tracking.

Flags: needinfo?(dmeehan)

I'm closing this as invalid, thanks :edenchuang.

Closed: 1 year ago
Resolution: --- → INVALID
Flags: needinfo?(dmeehan)