We have detected a build metrics regression from push 480febb613cdc8d68abd4d85d199f1a0ae26a08f. As author of one of the patches included in that push, we need your help to address this regression.

Regressions:

 158%  build times summary android-4-0-armv7-api15 opt taskcluster-c4.2xlarge    637.46 -> 1644.32
 149%  build times summary android-api-15-gradle opt taskcluster-c4.2xlarge      662.48 -> 1649.41
 133%  build times summary android-4-0-armv7-api15 debug taskcluster-c4.2xlarge  810.74 -> 1889.98
 127%  build times summary android-4-0-armv7-api15 opt taskcluster-m4.2xlarge    790.64 -> 1791.25
 126%  build times summary android-4-2-x86 opt taskcluster-c4.2xlarge            749.36 -> 1694.69
 121%  build times summary android-4-0-armv7-api15 debug taskcluster-m4.2xlarge  947.72 -> 2093.19
 118%  build times summary android-4-2-x86 opt taskcluster-m4.2xlarge            847.1  -> 1845.21
  73%  build times summary linux64-stylo debug taskcluster-m4.4xlarge            841.6  -> 1451.82
  65%  build times summary linux32 debug taskcluster-m4.4xlarge                  907.72 -> 1500.52
  62%  build times summary linux32 opt taskcluster-m4.4xlarge                    798.07 -> 1294.45
  58%  build times summary linux32 debug taskcluster-c4.4xlarge                  835.55 -> 1322.55
  55%  build times summary linux64 opt taskcluster-c4.4xlarge                    721.94 -> 1120.86
  52%  build times summary linux64 opt taskcluster-m4.4xlarge                    818.67 -> 1248.28
  52%  build times summary linux64 debug taskcluster-c4.4xlarge                  823.12 -> 1252.94
  51%  build times summary linux64-stylo debug taskcluster-c4.4xlarge            863.07 -> 1305.05
  51%  build times summary linux32 opt taskcluster-c4.4xlarge                    768.18 -> 1158.87
  28%  build times summary linux64-stylo opt taskcluster-m4.4xlarge             1327.95 -> 1694.14
  21%  build times summary linux64-stylo opt taskcluster-c4.4xlarge             1211.16 -> 1470.91

You can find links to graphs and comparison views for each of the above tests at:
https://treeherder.mozilla.org/perf.html#/alerts?id=5588

On the page above you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a treeherder page showing the jobs in a pushlog format.

To learn more about the regressing test(s), please see:
https://developer.mozilla.org/en-US/docs/Mozilla/Performance/Automated_Performance_Testing_and_Sheriffing/Build_Metrics
Component: Untriaged → Build Config
Product: Firefox → Core
...? Enabling RUST_BACKTRACE is so expensive?! If that is really the reason, the fix is easy: just back it out. But I doubt that it is really the cause. Manish, what do you think?
(In reply to Xidorn Quan [:xidorn] UTC+10 (less responsive 15/Apr-3/May) from comment #1)
> ...? Enabling RUST_BACKTRACE is so expensive?!

Yeah, I do not believe that at all; RUST_BACKTRACE doesn't even get examined in rustc unless something goes wrong. Maybe that pushes the environment over some size limit, or some other trickiness?

The graph is pretty convincing that something is going wrong with that push, though. This one is for opt x86-64 Linux:

https://treeherder.mozilla.org/perf.html#/graphs?timerange=2592000&series=%5Bmozilla-inbound,4044b74c437dfc672f4615a746ea01f6e4c0312d,1,2%5D&zoom=1490052803570.8274,1490266955366.0588,600,1251.3011152416357&selected=%5Bmozilla-inbound,4044b74c437dfc672f4615a746ea01f6e4c0312d,183956,85450267,2%5D

Whatever happened, it's definitely persistent.
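(For context on the point above: RUST_BACKTRACE is only consulted by the Rust panic runtime when a panic actually fires, never on the happy path, so merely exporting it in the build environment cannot slow a successful compile. A minimal sketch illustrating this, using only the standard library; the arithmetic work here is a placeholder, not anything from the Firefox build:)

```rust
use std::env;

fn main() {
    // The panic runtime reads RUST_BACKTRACE lazily, only when a panic
    // occurs. On a normal (non-panicking) run, whether the variable is
    // set or not makes no difference to execution.
    match env::var("RUST_BACKTRACE") {
        Ok(v) => println!("RUST_BACKTRACE set to {}", v),
        Err(_) => println!("RUST_BACKTRACE not set"),
    }

    // Placeholder workload representing the happy path: no panic,
    // so the variable above is never consulted by the runtime.
    let sum: u64 = (1..=100).sum();
    println!("sum = {}", sum);
}
```

Running this with and without `RUST_BACKTRACE=1` produces the same work and the same timing; the variable only changes what is printed if the program panics.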
No, this is just a dupe of bug 1350093. The actual change that broke this was a docker-worker change in taskcluster, which unfortunately doesn't show up in the pushlog. :-(
Status: NEW → RESOLVED
Closed: 2 years ago
Resolution: --- → DUPLICATE
Duplicate of bug: 1350093
Thanks, Ted, for figuring this out.
The fixes have now started to show up for the alerts on https://treeherder.mozilla.org/perf.html#/alerts?id=5588
Component: Build Config → Build Config & IDE Support
Product: Core → Firefox for Android