Closed Bug 1622667 Opened 4 years ago Closed 4 years ago

69.1 - 80.8% build times (android-4-0-armv7-api16, linux64-aarch64, osx-cross) regression on push 9f1416f188e7b3b811cb9071c73afefccc6358f8 (Mon March 9 2020)

Categories

(Core :: DOM: Events, defect)

Type: defect
Priority: Not set
Severity: normal

Tracking


RESOLVED WONTFIX
Tracking Status
firefox75 --- unaffected

People

(Reporter: Bebe, Unassigned)

References

(Regression)

Details

(Keywords: perf-alert, regression)

We have detected a build metrics regression from push:

https://hg.mozilla.org/integration/autoland/pushloghtml?changeset=9f1416f188e7b3b811cb9071c73afefccc6358f8

Since you authored one of the patches included in that push, we need your help to address this regression.

Regressions:

81% build times linux64-aarch64 opt taskcluster-c5.4xlarge 715.86 -> 1,294.29
75% build times android-4-0-armv7-api16 debug taskcluster-m5.4xlarge 848.08 -> 1,487.06
70% build times android-4-0-armv7-api16 opt taskcluster-c5.4xlarge 751.38 -> 1,278.25
69% build times osx-cross debug taskcluster-m5.4xlarge 1,036.83 -> 1,753.22
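
(Editorial aside, not part of the original alert: each percentage is simply the relative increase between the two build-time values. A minimal sketch of that arithmetic in Python, using the numbers from the list above; the helper name is illustrative only.)

# Minimal sketch: how the regression percentages above relate to the
# before/after build times (seconds). The helper name is illustrative only.
def regression_pct(before: float, after: float) -> float:
    return (after - before) / before * 100

alerts = {
    "linux64-aarch64 opt":           (715.86, 1294.29),
    "android-4-0-armv7-api16 debug": (848.08, 1487.06),
    "android-4-0-armv7-api16 opt":   (751.38, 1278.25),
    "osx-cross debug":               (1036.83, 1753.22),
}

for platform, (before, after) in alerts.items():
    print(f"{platform}: {regression_pct(before, after):.0f}%")
# Prints 81%, 75%, 70% and 69%, matching the alert summary.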

You can find links to graphs and comparison views for each of the above tests at: https://treeherder.mozilla.org/perf.html#/alerts?id=25277

On the page above, you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a Treeherder page showing the jobs in a pushlog format.

To learn more about the regressing test(s), please see: https://developer.mozilla.org/en-US/docs/Mozilla/Performance/Automated_Performance_Testing_and_Sheriffing/Build_Metrics

*** Please let us know your plans within 3 business days, or the offending patch(es) will be backed out! ***

Component: Performance → DOM: Events
Flags: needinfo?(masayuki)
Product: Testing → Core
Version: Version 3 → unspecified

:dmajor, do you think this could be caused by some other commit from last week?

Flags: needinfo?(dmajor)

I think this was a one-time sccache invalidation from bug 1619461. If we look at the Mac chart, there is a high blip from my patch even before Perfherder decided to mark the alert -- so I think it's safe to say that Masayuki is not to blame.
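
(Editorial aside, not from the bug: one way a developer could confirm locally that sccache is getting cache hits again after such an invalidation is to inspect its statistics. sccache's --show-stats flag is real; the snippet and its interpretation are only a sketch.)

# Illustrative sketch only: query the local sccache server for its hit/miss
# counters. Assumes sccache is installed and on PATH; the exact stats output
# format can vary between sccache versions.
import subprocess

result = subprocess.run(["sccache", "--show-stats"],
                        capture_output=True, text=True, check=True)
print(result.stdout)  # look for "Cache hits" climbing relative to "Cache misses"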

Looking at all four charts up to today, it looks like the caches are now repopulated and builds are back to normal. Both the ideal time (lowest value) and the overall ranges are back to previous values.
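
(Editorial aside, purely to illustrate the check described above, not Perfherder's actual logic: both the lowest build time and the overall spread of recent builds are compared against a pre-regression baseline; the 5% tolerance is an arbitrary assumption.)

# Rough sketch of the "back to normal" check described above: both the best
# (lowest) build time and the overall spread of recent builds should be close
# to the pre-regression baseline. The 5% tolerance is an arbitrary assumption.
def back_to_baseline(baseline, recent, tolerance=0.05):
    best_ok = min(recent) <= min(baseline) * (1 + tolerance)
    spread_ok = (max(recent) - min(recent)) <= (max(baseline) - min(baseline)) * (1 + tolerance)
    return best_ok and spread_ok

# e.g. back_to_baseline(pre_regression_times, last_few_days_times)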

Flags: needinfo?(masayuki)
Flags: needinfo?(dmajor)
Component: DOM: Events → Performance

Based on comment 2, it sounds like this bug can be closed. To be safe, though, I'm moving this back to DOM: Events from Performance so it's on the DOM team's radar; I'm hesitant to close another team's bug prematurely if there's still work to be done here. :)

Component: Performance → DOM: Events

Masayuki, I'll let you determine the priority of this.

Flags: needinfo?(masayuki)

In my understanding, this is not a DOM bug (according to comment 2 and comment 3), but I'm not sure. Is the score permanently damaged? (The fix was risky, but it's a simple change from the point of view of the C++ code.)

Flags: needinfo?(masayuki) → needinfo?(fstrugariu)

Looking at the graphs, the only platform that raises any concern is OSX.

https://treeherder.mozilla.org/perf.html#/graphs?highlightAlerts=1&series=autoland,1621267,1,2&timerange=7776000&zoom=1577539865000,1585281414000,0,6527.3940855132205

But that also returned to normal in the last 2 days... I'm not sure if it's related to this issue.
Based on comment 3, I think we can close this as WONTFIX.

Flags: needinfo?(fstrugariu)
Status: NEW → RESOLVED
Closed: 4 years ago
Resolution: --- → WONTFIX
Has Regression Range: --- → yes