Closed Bug 1139019 Opened 5 years ago Closed 5 years ago

2.11% MacOS 10.8 kraken regression on Mozilla-Inbound (v.39) on February 25, 2015 from push 65e87c0b6aef


(Testing :: Talos, defect)






(Reporter: vaibhav1994, Unassigned)



(Keywords: perf, regression, Whiteboard: [talos_regression])

Talos has detected a Firefox performance regression from your commit 65e87c0b6aef.  We need you to address this regression.

This is a list of all known regressions and improvements related to your bug:

On the page above you can see a Talos alert for each affected platform, as well as a link to a graph showing the history of scores for this test. There is also a link to a Treeherder page showing the Talos jobs in a pushlog format.

To learn more about the regressing test, please see:

Reproducing and debugging the regression:
If you would like to re-run this Talos test on a potential fix, use try with the following syntax:
try: -b o -p macosx64 -u none -t dromaeojs  # add "mozharness: --spsProfile" to generate profile data

To run the test locally and do a more in-depth investigation, first set up a local Talos environment:

Then run the following command from the directory where you set up Talos:
talos --develop -e <path>/firefox -a kraken
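For reference, a minimal sketch of the setup steps might look like the following (the repository URL, environment name, and Firefox path are illustrative assumptions, not the authoritative instructions; consult the Talos wiki page for the current steps):

```shell
# Sketch of a local Talos setup circa 2015; paths and URLs are illustrative.
virtualenv talos-env                          # isolate Talos and its dependencies
source talos-env/bin/activate

hg clone https://hg.mozilla.org/build/talos   # fetch the Talos harness (assumed repo)
cd talos
python setup.py develop                       # install Talos in development mode

# Run the kraken suite against a local Firefox build (path is a placeholder)
talos --develop -e /path/to/firefox -a kraken
```

The `--develop` flag runs the test without posting results to a server, which is what you want for local investigation.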

Making a decision:
As the patch author, you are best placed to help us handle this regression, so we need your feedback.

Our wiki page outlines the common responses and expectations:
Summary: 2.11% MacOS 10.8 kraken regression on Mozilla-Inbound on February 25, 2015 from push 65e87c0b6aef → 2.11% MacOS 10.8 kraken regression on Mozilla-Inbound (v.39) on February 25, 2015 from push 65e87c0b6aef
The build was broken for the set of changes here:

I think bhackett is probably responsible, but dvander could help debug this, as his change caused the confusion.

We need to determine which patch is responsible for the change and whether there is anything we should do.  This is a small regression on a single platform, so we shouldn't invest tens of hours into fixing it.
Flags: needinfo?(dvander)
Flags: needinfo?(bhackett1024)
My checkin wouldn't have caused this.
Flags: needinfo?(dvander)
I highly doubt Bug 897031 could have caused this. I'm fine with that patch being backed out to exonerate it.
Same for bug 1135912 and bug 961887 - very unlikely.
:bhackett might have a better idea :)
This is probably bug 826741 (changing our default register allocator); it looks like the new allocator is a bit slower than LSRA on one of the kraken tests.  This isn't cause to back out bug 826741, as that bug unblocks a fair amount of other work, but I'll try to look at what the problem is in the next few weeks.
Flags: needinfo?(bhackett1024)
Thanks Brian.  I agree there is no need to back this out or make the fix a P1.  If you need help running kraken or Talos locally, or on the try server, do let me know.
Keywords: perf, regression
Whiteboard: [talos_regression]
No longer blocks: 961887
Brian, can you comment on the status of this?  We are uplifting to Aurora today; if this is not something you can work on or fix without a lot of work, please close this as WONTFIX.
Flags: needinfo?(bhackett1024)
This regression isn't going to be fixed in this release.  There are some backtracking allocator improvements coming, mainly bug 1067610, but work on this allocator is currently blocked on removal of our old allocator, LSRA.
Closed: 5 years ago
Flags: needinfo?(bhackett1024)
Resolution: --- → WONTFIX