Closed Bug 1356652 Opened 7 years ago Closed 7 years ago

Consider updating our PGO training set

Categories

(Firefox Build System :: General, enhancement)


Tracking

(Performance Impact: high, firefox57 fixed)

Status: RESOLVED FIXED
Target Milestone: mozilla57

People

(Reporter: away, Assigned: away)

References

(Blocks 1 open bug)

Details

Attachments

(5 files, 12 obsolete files)

59 bytes, text/x-review-board-request (glandium: review+)
59 bytes, text/x-review-board-request (ted: review+)
59 bytes, text/x-review-board-request
59 bytes, text/x-review-board-request (ted: review+, ehsan.akhgari: review+)
59 bytes, text/x-review-board-request (ted: review+)
Currently our PGO training consists of little more than a few SunSpider pages. According to the log at https://hg.mozilla.org/mozilla-central/log/tip/build/pgo/index.html, our PGO training list hasn't seen a substantial update since 2009, although there was a short-lived attempt in bug 1096280 that was backed out.

Web usage patterns change over time. Should we look into updating our training set?  Should (can?) we include offline copies of 2017 web content, particularly pages that we care about for Quantum?
This has come up various times over the years. Every time, Ted has some useful context to add, so I'm needinfo'ing him to record it here.

My opinion is we should experiment with feeding more activities into the PGO profiler. It's certainly easy enough to experiment with Try pushes these days. Perhaps we could see what Chrome does and use that to seed a new training run?
Flags: needinfo?(ted)
The only way I can think of where we could pull offline copies of recent content would be to pull the tp5 archive (or something like it but fresher; btw, maybe it's time to refresh that too) at PGO time.
We have a static copy of Facebook, FWIW.  :-)
Whiteboard: [qf] → [qf:p3]
We should probably do https://bugzilla.mozilla.org/show_bug.cgi?id=1356654 first. Less risk to move to VS2017 than playing with PGO.
We've tried it in the past; bug 1096280 is the most notable example. You are certainly welcome to stuff whatever you want in there and see what it does! It's possible there are things that are not being well-optimized that could be. However--it's always a balancing act. The whole point of PGO is to optimize hot functions as much as possible, and get cold functions out of the way. You can't have *everything* be hot.

When I last looked at it, the MSVC PGO optimizer took extreme measures to optimize the code it decided was hot: multiple levels of inlining, partial function inlining, speculative virtual call inlining, etc. It will even segregate functions into hot and cold blocks and place them on separate pages so the hot code can stay in cache. I think most of the things in the codebase that benefit from these kinds of optimizations are the things that are widely used, like refcounting and low-level XPCOM methods, and simply starting the browser is enough to hit all of those codepaths enough times that they get optimized really well.

All that being said, there may be things that are important for real-world use cases that are not being covered in our profiling scenario. It is possible to inspect the profiling data in the PGD files. Microsoft ships a `pgomgr` tool for this:
https://msdn.microsoft.com/en-us/library/2kw46d8w.aspx

You could easily make our PGO builds upload the PGD files they produce and look at what functions are being heavily optimized. Before we had a 64-bit link.exe available, I used pgomgr and some Python scripts to figure out what bits of code we could disable PGO on to keep the linker's memory usage low enough to succeed:
https://hg.mozilla.org/users/tmielczarek_mozilla.com/pgo-datamangling/file/tip/pgosummary.py
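
For reference, a minimal sketch of that kind of inspection; the .pgc/.pgd names below are placeholders (they vary by build), but the pgomgr switches used are the standard /merge and /summary ones:

$ pgomgr /merge xul!1.pgc xul.pgd
$ pgomgr /summary xul.pgd > pgd-summary.txt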
Flags: needinfo?(ted)
On my machine, adding Speedometer to our PGO training increases our score by 3-4% (64-bit) and 5-6% (32-bit). I'll start some try builds for more reliable numbers.
Whiteboard: [qf:p3] → [qf]
overholt, could you try these on the reference hardware please?

The training is based on SM trunk from yesterday, and that tree has seen some large updates to its frameworks lately, so for the best apples-to-apples comparison it would be great if you could test on a very recent copy of trunk.

Before (32-bit): https://queue.taskcluster.net/v1/task/JDrzVVV8QQ2CyhAxq5v-Dg/runs/0/artifacts/public/build/target.zip
After (32-bit): https://queue.taskcluster.net/v1/task/UtbduQBdQy24EHJpNZC9KQ/runs/0/artifacts/public/build/target.zip

Before (64-bit): https://queue.taskcluster.net/v1/task/dxi-O6CUTpW2dETvoZJz6A/runs/0/artifacts/public/build/target.zip
After (64-bit): https://queue.taskcluster.net/v1/task/ISzOHBVTTgSyWUZWpWNTPQ/runs/0/artifacts/public/build/target.zip
Flags: needinfo?(overholt)
I did 2 runs each using https://mozilla.github.io/arewefastyet-speedometer/2.0/ and got:

(In reply to David Major [:dmajor] from comment #8)
> 
> Before (32-bit):
> https://queue.taskcluster.net/v1/task/JDrzVVV8QQ2CyhAxq5v-Dg/runs/0/
> artifacts/public/build/target.zip

65.34
80.13

> After (32-bit):
> https://queue.taskcluster.net/v1/task/UtbduQBdQy24EHJpNZC9KQ/runs/0/
> artifacts/public/build/target.zip

83.30
82.43

> Before (64-bit):
> https://queue.taskcluster.net/v1/task/dxi-O6CUTpW2dETvoZJz6A/runs/0/
> artifacts/public/build/target.zip

65.01
75.50

> After (64-bit):
> https://queue.taskcluster.net/v1/task/ISzOHBVTTgSyWUZWpWNTPQ/runs/0/
> artifacts/public/build/target.zip

77.96
77.58

Looks like a 2.87% improvement on 32-bit and a 2.75% improvement on 64-bit!
Flags: needinfo?(overholt)
(In reply to Andrew Overholt [:overholt] from comment #9)
> I did 2 runs each using
> https://mozilla.github.io/arewefastyet-speedometer/2.0/ and got:

Thanks! FTR that site currently says revision 6049c7c, which I believe matches the test suite that I used: https://hg.mozilla.org/try/rev/2e713de4a091 (no changes in PerformanceTests/Speedometer/ between the two revisions)
(In reply to Andrew Overholt [:overholt] from comment #9)
> I did 2 runs each using
> https://mozilla.github.io/arewefastyet-speedometer/2.0/ and got:
> 
> (In reply to David Major [:dmajor] from comment #8) 
> > After (64-bit):
> > https://queue.taskcluster.net/v1/task/ISzOHBVTTgSyWUZWpWNTPQ/runs/0/
> > artifacts/public/build/target.zip
> 
> 77.96
> 77.58

I ran this "After (64-bit)" twice more and got:

78.53
78.06
A few more runs confirm the "After (64-bit)" build consistently gets just over 78 on this Acer laptop.
https://treeherder.mozilla.org/perf.html#/compare?originalProject=try&originalRevision=dd9f4fcb127e&newProject=try&newRevision=1010393bfc43&framework=1&showOnlyImportant=0&showOnlyConfident=1

Talos looks good, with some likely improvements, and one potential regression on "tp6_youtube opt e10s stylo" -- but the history for that test shows it to be pretty noisy.

Note that if you're looking at the history graphs, for a fair comparison you'll want to add "pgo" series and ignore the "opt" ones. Out of old habit, I had pushed these as "opt" jobs but with PGO enabled via mozconfig, so both the baseline and test builds will seem out of place compared to most opt pushes.
At this point the big question is whether we should try to get this into 57.

Ehsan expressed some concerns on IRC about the risk of training changes uncovering PGO compiler bugs late in the cycle. Flagging for comment in case he wants to elaborate further.
Flags: needinfo?(ehsan)
IIRC PGO training data was described as made of very old websites that do not represent the current state of the Web.
If that's the case, may there also be an additional value in removing obsolete items from the training data?
(In reply to Zibi Braniecki [:gandalf][:zibi] from comment #15)
> IIRC PGO training data was described as made of very old websites that do
> not represent the current state of the Web.

It's currently SunSpider plus a couple of other pages.

> If that's the case, may there also be an additional value in removing
> obsolete items from the training data?

My gut feeling, without evidence, is that we are nowhere near the point where extraneous training is harmful.
(In reply to David Major [:dmajor] from comment #14)
> Ehsan expressed some concerns on IRC about the risk of training changes
> uncovering PGO compiler bugs late in the cycle. Flagging for comment in case
> he wants to elaborate further.

From my perspective, we have 4 weeks left in the Nightly cycle plus all of Beta for any issues to flush out. Given that we're spending loads of engineering time at the moment working on micro-optimizations trying to eke out fractions of a percent in our SM2 scores, a 3% gimme win sounds pretty hard to pass up over what's basically a hypothetical concern at the moment.

Also, I looked at your Try pushes and didn't see anything obviously broken in the pushes that ran Windows tests either.
We just had a discussion with dmajor and Naveed on #flow about what to do about this. Here is a summary.

The way I see it from a technical standpoint we have two options in front of us:
 1. Defer doing this until early in the 58 cycle.
 2. Try to get this into the 57 Nightly as early as possible and deal with any fallout.

Backing out this change should be relatively simple, even on 57 Beta, so both options are viable. This is really a choice of where and when we spend engineering resources, and I deferred that choice to Naveed. Given that we still have around 4 weeks left in the 57 Nightly cycle and we don't have a lot of other work planned that can give us this much benefit on Speedometer, Naveed's recommendation was to see if we can land this by the end of this week and give it a shot for 57; if not, defer to 58. If we do manage to get it landed in time for 57, we should probably notify Marcia's team to watch for changes in the Nightly crash rate on Windows as potential fallout of this.
Flags: needinfo?(ehsan)
Assignee: nobody → dmajor
Please consider this needinfo to be my review request for "Part 1 - Import Speedometer tree", since the patch is too large to attach in Bugzilla.

Here's the changeset on tryserver: https://hg.mozilla.org/try/rev/2e713de4a091

This is solely |hg add| of Speedometer files; Mozilla-specific changes will be in subsequent patches.

Gerv, any concerns license-wise about pulling in this subdirectory from the Webkit repo? We'll be using this code in our build process but not shipping it. Also there are a number of JS framework files in here that may be licensed differently from Webkit itself.
Flags: needinfo?(ted)
Flags: needinfo?(gerv)
Attachment #8900866 - Flags: review?(ted)
Attached patch Part 3: Start the tests onload (obsolete) — Splinter Review
Attachment #8900867 - Flags: review?(ted)
Shield your eyes: this timeout hack is horrible and needs to be rewritten more nicely. Any preference?

90 seconds ought to let each subtest run once on the super-slow instrumented build, with some breathing room.

While I was watching my build, I noticed that the existing tests were not always fully loading within the 1000ms timeout on my machine (and presumably automation is even slower) so I doubled the interval for good measure.

In total this should add about 2 minutes to our PGO builds. While that's not great, it doesn't seem terrible either.
Attachment #8900872 - Flags: review?(ted)
Attachment #8900866 - Flags: review?(ted) → review+
Comment on attachment 8900872 [details] [diff] [review]
Part 4: Add Speedometer to the training list

Review of attachment 8900872 [details] [diff] [review]:
-----------------------------------------------------------------

Does the speedometer code have any events it fires or anything like that we could use to determine when it's finished running? This code has always been hacky, but it would be nice if we didn't add unnecessary time into our already-slow builds. I wouldn't block landing this on fixing that, but if there's nothing obvious can you file a followup to look into it?
Attachment #8900872 - Flags: review?(ted) → review+
Could you add a note somewhere, maybe in README_MOZILLA, that gives the exact command you used to export these files from the webkit repo?
(In reply to David Major [:dmajor] from comment #19)
> Gerv, any concerns license-wise about pulling in this subdirectory from the
> Webkit repo? We'll be using this code in our build process but not shipping
> it. Also there are a number of JS framework files in here that may be
> licensed differently from Webkit itself.

Might you be able to be a bit more specific, rather than asking me to go spelunking for licenses in a big checkin? :-)

To make some general comments, we can take code from Webkit under their standard licenses, and if it's not being shipped then no about:license update is necessary, but we need to make sure we bring across appropriate licensing information both for the top level directory and any subdirs which are under different licenses. That probably happens for subdirs just by bringing the files because the Webkit team will have arranged that, but we need to make sure there's something in the top-level directory which makes the "default" licensing of stuff underneath it clear.

Gerv
Flags: needinfo?(gerv)
Comment on attachment 8900867 [details] [diff] [review]
Part 3: Start the tests onload

Review of attachment 8900867 [details] [diff] [review]:
-----------------------------------------------------------------

It might be worthwhile to see if you can upstream a patch to add a query string to automatically run the tests (and maybe an event or something for when they finish).
Attachment #8900867 - Flags: review?(ted) → review+
Flags: needinfo?(ted)
(In reply to Ted Mielczarek [:ted.mielczarek] from comment #23)
> Does the speedometer code have any events it fires or anything like that we
> could use to determine when it's finished running? This code has always been
> hacky, but it would be nice if we didn't add unnecessary time into our
> already-slow builds. I wouldn't block landing this on fixing that, but if
> there's nothing obvious can you file a followup to look into it?

We won't come anywhere close to finishing the test since it repeats everything 20 times. For PGO I'm willing to accept just one iteration of the outer loop.
(In reply to Gervase Markham [:gerv] from comment #25)
> (In reply to David Major [:dmajor] from comment #19)
> > Gerv, any concerns license-wise about pulling in this subdirectory from the
> > Webkit repo? We'll be using this code in our build process but not shipping
> > it. Also there are a number of JS framework files in here that may be
> > licensed differently from Webkit itself.
> 
> Might you be able to be a bit more specific, rather than asking me to go
> spelunking for licenses in a big checkin? :-)

Sorry; I haven't looked at every last file myself, so it's spelunking for me too. At a glance, I see a bunch of MIT and Apache v2 licenses, but there may be others. Also, there's a copy of React in there, if that matters.

> To make some general comments, we can take code from Webkit under their
> standard licenses, and if it's not being shipped then no about:license
> update is necessary, but we need to make sure we bring across appropriate
> licensing information both for the top level directory and any subdirs which
> are under different licenses. That probably happens for subdirs just by
> bringing the files because the Webkit team will have arranged that, but we
> need to make sure there's something in the top-level directory which makes
> the "default" licensing of stuff underneath it clear.

Would it be sufficient to save the contents of https://webkit.org/licensing-webkit/ into a LICENSE file at the top of the imported subdirectory?
Flags: needinfo?(gerv)
(In reply to David Major [:dmajor] from comment #28)
> (In reply to Gervase Markham [:gerv] from comment #25)
> > (In reply to David Major [:dmajor] from comment #19)
> > > Gerv, any concerns license-wise about pulling in this subdirectory from the
> > > Webkit repo? We'll be using this code in our build process but not shipping
> > > it. Also there are a number of JS framework files in here that may be
> > > licensed differently from Webkit itself.
> > 
> > Might you be able to be a bit more specific, rather than asking me to go
> > spelunking for licenses in a big checkin? :-)
> 
> Sorry; I haven't looked at every last file myself, so it's spelunking for me
> too. At a glance, I see a bunch of MIT and Apache v2 licenses, but there may
> be others. Also, there's a copy of React in there, if that matters.

To make it easier for you, here is what I did:

$ cd WebKit
$ git pull
$ cd PerformanceTests/Speedometer/resources
$ find . | grep -i license
./flightjs-example-app/components/jasmine-flight/LICENSE.md: MIT
./flightjs-example-app/components/es5-shim/LICENSE: MIT
./flightjs-example-app/LICENSE.md: MIT
./todomvc/dependency-examples/flight/flight/node_modules/flight/LICENSE: MIT
./todomvc/dependency-examples/flight/flight/node_modules/requirejs-text/LICENSE: MIT and CC0
./todomvc/dependency-examples/flight/flight/node_modules/es5-shim/LICENSE: MIT
./todomvc/labs/architecture-examples/react/bower_components/director/LICENSE: MIT
./todomvc/architecture-examples/jquery/node_modules/director/LICENSE: MIT
./todomvc/architecture-examples/react/node_modules/react-dom/LICENSE: BSD
./todomvc/architecture-examples/react/node_modules/classnames/LICENSE: MIT
./todomvc/architecture-examples/react/node_modules/react/LICENSE: BSD
./todomvc/architecture-examples/react/node_modules/director/LICENSE: MIT
./todomvc/architecture-examples/react/license.md: MIT
./todomvc/license.md: MIT

> > To make some general comments, we can take code from Webkit under their
> > standard licenses, and if it's not being shipped then no about:license
> > update is necessary, but we need to make sure we bring across appropriate
> > licensing information both for the top level directory and any subdirs which
> > are under different licenses. That probably happens for subdirs just by
> > bringing the files because the Webkit team will have arranged that, but we
> > need to make sure there's something in the top-level directory which makes
> > the "default" licensing of stuff underneath it clear.
> 
> Would it be sufficient to save the contents of
> https://webkit.org/licensing-webkit/ into a LICENSE file at the top of the
> imported subdirectory?

FWIW I reached out to Addy to ask about the license: https://twitter.com/ehsanakhgari/status/900838327032705024
Can we put the third party code in third_party/, please? build/ is supposed to be reserved for things core to the build system. I also like having a "firewall" of sorts between Mozilla code and non-Mozilla code. It makes things much easier to reason about.
(In reply to Gregory Szorc [:gps] from comment #30)
> Can we put the third party code in third_party/, please? build/ is supposed
> to be reserved for things core to the build system. I also like having a
> "firewall" of sorts between Mozilla code and non-Mozilla code. It makes
> things much easier to reason about.

I agree with the reasoning, but it's not straightforward, because profileserver.py does:

  httpd = MozHttpd(port=PORT,
                   docroot=os.path.join(build.topsrcdir, "build", "pgo"))

and I'm not particularly keen to rework that as part of this bug.
(In reply to :Ehsan Akhgari (needinfo please, extremely long backlog) from comment #29)
> To make it easier for you, here is what I did:
> 
> $ cd WebKit
> $ git pull
> $ cd PerformanceTests/Speedometer/resources
> $ find . | grep -i license

A bunch of files also have inline license blocks as comments.
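
For completeness, a rough one-liner that also surfaces some of those inline headers (illustrative only; the patterns are approximate and will miss variants):

$ grep -rli -e "permission is hereby granted" -e "apache license" -e "redistribution and use" .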
(In reply to David Major [:dmajor] from comment #31)
> I agree with the reasoning, but it's not straightforward, because
> profileserver.py does:
> 
>   httpd = MozHttpd(port=PORT,
>                    docroot=os.path.join(build.topsrcdir, "build", "pgo"))
> 
> and I'm not particularly keen to rework that as part of this bug.

Scope bloat sucks. So I made this easier for you. After the series I just posted, it is a one-liner to make any random directory available to the HTTP server. Hopefully this facilitates other PGO experimentation beyond this bug!
One other reminder - please ensure that any changes you make are tested with Valgrind builds on Try since they use the same pageset for their runs.
Thanks for the profileserver changes, but I wish they would have been part of a separate bug blocking this one.
Depends on: 1393657
Attachment #8900989 - Attachment is obsolete: true
Attachment #8900989 - Flags: review?(mh+mozilla)
Attachment #8900990 - Attachment is obsolete: true
Attachment #8900990 - Flags: review?(mh+mozilla)
Attachment #8900991 - Attachment is obsolete: true
Attachment #8900991 - Flags: review?(mh+mozilla)
Attachment #8900992 - Attachment is obsolete: true
Attachment #8900992 - Flags: review?(mh+mozilla)
(In reply to David Major [:dmajor] from comment #39)
> Thanks for the profileserver changes, but I wish they would have been part
> of a separate bug blocking this one.

Sorry about that. Patches are now in bug 1393657.

(Hopefully the transition to Phabricator will allow us to be more flexible about these kinds of scenarios in the future.)
FWIW I really think we shouldn't block this bug on the location of the files in the source tree.  We're on a tight deadline for landing this (see comment 18).  If bug 1393657 lands in time that's great, otherwise I suggest trying to get this landed by tomorrow and moving the code to third_party/ post facto.  Is that OK?
Flags: needinfo?(gps)
Whiteboard: [qf] → [qf:p1]
Note that Mercurial is not Git. It doesn't deduplicate data across file renames, which means the repository would, in practice, end up with two copies of the whole Speedometer test suite. So, not getting the path right at landing time can be a big deal when we're talking about something that doesn't fit in a Bugzilla attachment.
I grok the urgency.

glandium has already done the reviews for the file mapping. And it looks like the HTTP server we're using already has a mechanism for mapping paths easily (see review comments). So it is trivial to do this right from the beginning.

I'll be around tomorrow to help with any build peer reviews for this bug. (Ted is on PTO.) I can also author any patches needed for my imposed scope bloat. Ping me on IRC if you need anything. I should be around all day aside from lunch and an appointment before lunch.
Flags: needinfo?(gps)
Another problem with the Speedometer files in the Try push is that they are using CRLF. The canonical files in the WebKit Git repo are using LF. I'm guessing your Git on Windows is set to normalize line endings on checkout and commit. You probably want to disable that so your VCS tool isn't mangling bytes.
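
For reference, the usual knob for that is Git's core.autocrlf setting; something along these lines, adjusted to local preference:

$ git config core.autocrlf false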
dmajor: I've made an alternate set of commits that imports Speedometer into third_party/speedometer and adds dynamic URI mapping to the HTTP server:

https://hg.mozilla.org/users/gszorc_mozilla.com/firefox/rev/b0adaf4cea53
https://hg.mozilla.org/users/gszorc_mozilla.com/firefox/rev/65bd74740ed6
https://hg.mozilla.org/users/gszorc_mozilla.com/firefox/rev/3ffa39e3fa13

You can grab those via `hg pull -r 3ffa39e3fa13 https://hg.mozilla.org/users/gszorc_mozilla.com/firefox`.
dmajor is out today due to something unexpected, so I'll try to fill in for him on this. I need to start to retrace his steps; currently I'm trying to find out whether the licensing issue is a blocker for landing...
Assignee: dmajor → ehsan
FWIW, I did verify on Try that the commits in comment 45 don't break Linux/Windows PGO & Valgrind jobs either. And it was pushed to Try on top of the commits from bug 1393657, which is already on autoland.

https://treeherder.mozilla.org/#/jobs?repo=try&revision=b621e15bac8a9c1179ca51fb72b8179709950050
From a technical perspective everything here looks great and ready to go, but we decided that we don't want to rush this in without a careful examination from a licensing perspective, and unfortunately we realistically won't have time to do that today. So per comment 18 we decided to punt on this until early in the 58 cycle.
Assignee: ehsan → dmajor
Considering we aren't actually shipping any of this code, it feels really weird to punt due to licensing issues.  I mean, we could just store it in a separate repo, right?
(In reply to Ben Kelly [:bkelly] from comment #55)
> Considering we aren't actually shipping any of this code, it feels really
> weird to punt due to licensing issues.  I mean, we could just store it in a
> separate repo, right?

Or download a binary blob on our automation jobs, or *something*.
gps is working on that exact solution as we speak.  :-)
Attachment #8901317 - Attachment is obsolete: true
Attachment #8901318 - Attachment is obsolete: true
Attachment #8901318 - Flags: review?(ted)
Attachment #8901320 - Attachment is obsolete: true
Attachment #8901320 - Flags: review?(ted)
Comment on attachment 8901369 [details]
Bug 1356652 - Toolchain task to create a Speedometer archive;

https://reviewboard.mozilla.org/r/172822/#review178206

::: taskcluster/ci/toolchain/linux.yml:215
(Diff revision 1)
> +spedometer:
> +    description: "package speedometer benchmark suite"
> +    treeherder:
> +        kind: build
> +        platform: toolchains/opt
> +        symbol: TL(speed)

Probably better to put it in a different group than TL.

::: taskcluster/scripts/misc/build-speedometer.sh:22
(Diff revision 1)
> +
> +     <section id="home" class="selected">
> +
> +EOF
> +
> +svn co -r ${SPEEDOMETER_REVISION} https://svn.webkit.org/repository/webkit/trunk/PerformanceTests/Speedometer speedometer

Not that it matters /that/ much, but don't we have a copy on github?
Attachment #8901369 - Flags: review?(mh+mozilla) → review+
Comment on attachment 8901369 [details]
Bug 1356652 - Toolchain task to create a Speedometer archive;

https://reviewboard.mozilla.org/r/172822/#review178206

> Not that it matters /that/ much, but don't we have a copy on github?

We /could/ use Git[Hub]. However, then we have to do a full WebKit clone. I was thinking that Subversion's subdirectory checkout would be convenient. But, yeah, Subversion doesn't have SHA-1 protection like Git does. So if you want me to change it for that reason, I can.
Comment on attachment 8901370 [details]
Bug 1356652 - Configure check for speedometer/;

https://reviewboard.mozilla.org/r/172824/#review178208

::: moz.configure:292
(Diff revision 1)
> +def speedometer(build_env, pgo):
> +    p = os.path.join(build_env.topsrcdir, 'speedometer')
> +    if os.path.exists(p):
> +        return
> +
> +    raise FatalCheckError(

This will fail on non pgo builds.

::: moz.configure:294
(Diff revision 1)
> +    if os.path.exists(p):
> +        return
> +
> +    raise FatalCheckError(
> +        'cannot find Speedometer files; a copy of Speedometer must be present '
> +        'in %s; run `taskcluster/scripts/misc/build-speedometer.sh` to produce '

`mach artifact toolchain --from-build speedometer` would be a better command suggestion.
Attachment #8901370 - Flags: review?(mh+mozilla)
Comment on attachment 8901319 [details]
Bug 1356652 - Expose Speedometer to PGO HTTP server;

https://reviewboard.mozilla.org/r/172774/#review178212
Attachment #8901319 - Flags: review?(mh+mozilla) → review+
Please don't block on licensing issues, particularly given we are not shipping the code. There's no risk that we _can't_ check in this code, given that it's open source and from Webkit, we just need to make sure the paperwork is in order, and that can happen a week or a month later with no stress. 

Gerv
Comment on attachment 8901370 [details]
Bug 1356652 - Configure check for speedometer/;

https://reviewboard.mozilla.org/r/172824/#review178208

> This will fail on non pgo builds.

Huh? Doesn't that ``@depends_all()`` protect us here?
Comment on attachment 8901370 [details]
Bug 1356652 - Configure check for speedometer/;

https://reviewboard.mozilla.org/r/172824/#review178208

> Huh? Doesn't that ``@depends_all()`` protect us here?

Sorry, missed it was a depends_all.
(In reply to Gervase Markham [:gerv] from comment #70)
> Please don't block on licensing issues, particularly given we are not
> shipping the code. There's no risk that we _can't_ check in this code, given
> that it's open source and from Webkit, we just need to make sure the
> paperwork is in order, and that can happen a week or a month later with no
> stress. 

Thank you Gerv! \o/
(In reply to Gregory Szorc [:gps] from comment #67)
> Comment on attachment 8901369 [details]
> Bug 1356652 - Toolchain task to create a Speedometer archive;
> 
> https://reviewboard.mozilla.org/r/172822/#review178206
> 
> > Not that it matters /that/ much, but don't we have a copy on github?
> 
> We /could/ use Git[Hub]. However, then we have to do a full WebKit clone. I
> was thinking that Subversion's subdirectory checkout would be convenient.
> But, yeah, Subversion doesn't have SHA-1 protection like Git does. So if you
> want me to change it for that reason, I can.

I was not thinking about the webkit github mirror, but about some repo we have in the mozilla group that exposes speedometer through a github.io url (saw that in passing some time this week, don't remember where it is exactly). Not that it matters much anyways, after comment 70.
Attachment #8901369 - Attachment is obsolete: true
Attachment #8901370 - Attachment is obsolete: true
Comment on attachment 8901321 [details]
Bug 1356652 - Add Speedometer files to the PGO training list;

Reviewed by ted on dmajor's attachment.
Attachment #8901321 - Flags: review?(ted) → review+
Comment on attachment 8901377 [details]
Bug 1356652 - Add a README_MOZILLA file to the Speedometer directory;

Reviewed by Ted on dmajor's attachment.
Attachment #8901377 - Flags: review?(ted) → review+
Comment on attachment 8901378 [details]
Bug 1356652 - Start test on load;

Reviewed by Ted on dmajor's attachment.
Attachment #8901378 - Flags: review?(ted) → review+
Comment on attachment 8901319 [details]
Bug 1356652 - Expose Speedometer to PGO HTTP server;

https://reviewboard.mozilla.org/r/172774/#review178222
Attachment #8901319 - Flags: review+
Comment on attachment 8901376 [details]
Bug 1356652 - Import Speedometer; rs=glandium

https://reviewboard.mozilla.org/r/172834/#review178226

rs=me
Attachment #8901376 - Flags: review+
(In reply to Mike Hommey [:glandium] from comment #74)
> I was not thinking about the webkit github mirror, but about some repo we
> have in the mozilla group that exposes speedometer through a github.io url
> (saw that in passing some time this week, don't remember where it is
> exactly). Not that it matters much anyways, after comment 70.

AWFY maintains a copy of Speedometer in this GitHub repo:

https://github.com/mozilla/arewefastyet-speedometer

You can run that repo's copy of Speedometer here:

https://mozilla.github.io/arewefastyet-speedometer/2.0/
Pause while we get feedback from legal. Get in touch with me if you have questions.
Flags: needinfo?(gerv)
Comment on attachment 8901319 [details]
Bug 1356652 - Expose Speedometer to PGO HTTP server;

https://reviewboard.mozilla.org/r/172774/#review178238
Attachment #8901319 - Flags: review+
Attachment #8901319 - Flags: review?(ehsan)
Hi everyone, I spoke with Ehsan and provided him with some additional guidance, but this should otherwise be good to go.
Comment on attachment 8901377 [details]
Bug 1356652 - Add a README_MOZILLA file to the Speedometer directory;

https://reviewboard.mozilla.org/r/172836/#review178726

::: third_party/speedometer/README_MOZILLA:3
(Diff revision 1)
> +The source from this directory was copied from the
> +PerformanceTests/Speedometer directory of the Webkit repository
> +at: https://svn.webkit.org/repository/webkit/trunk

Based on legal feedback, can you please add some description here on how the benchmark is being used currently, i.e., for the purpose of training the PGO profiling phase?

Thanks!
Comment on attachment 8901321 [details]
Bug 1356652 - Add Speedometer files to the PGO training list;

https://reviewboard.mozilla.org/r/172778/#review178736
Attachment #8901321 - Flags: review?(ted) → review+
Comment on attachment 8901377 [details]
Bug 1356652 - Add a README_MOZILLA file to the Speedometer directory;

https://reviewboard.mozilla.org/r/172836/#review178738
Attachment #8901377 - Flags: review?(ted) → review+
Comment on attachment 8901377 [details]
Bug 1356652 - Add a README_MOZILLA file to the Speedometer directory;

https://reviewboard.mozilla.org/r/172836/#review178740
Attachment #8901377 - Flags: review+
Comment on attachment 8901378 [details]
Bug 1356652 - Start test on load;

https://reviewboard.mozilla.org/r/172838/#review178742
Attachment #8901378 - Flags: review?(ted) → review+
Pushed by eakhgari@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/61a1a4248d7e
Import Speedometer; rs=glandium r=glandium
https://hg.mozilla.org/integration/autoland/rev/b04a0d52a2dc
Add a README_MOZILLA file to the Speedometer directory; r=Ehsan,ted
https://hg.mozilla.org/integration/autoland/rev/e511bc169074
Expose Speedometer to PGO HTTP server; r=froydnj,glandium
https://hg.mozilla.org/integration/autoland/rev/1ff881f1793e
Start test on load; r=ted
https://hg.mozilla.org/integration/autoland/rev/136611faf48d
Add Speedometer files to the PGO training list; r=ted
and there is 1 improvement from all this work:
== Change summary for alert #9078 (as of August 28 2017 21:50 UTC) ==

Improvements:

  5%  tsvgr_opacity summary linux64 pgo e10s     334.64 -> 317.42

For up to date results, see: https://treeherder.mozilla.org/perf.html#/alerts?id=9078
I don't know how long it takes AWFY to produce data, but I tried the "first release with"/"last release without" links from hgweb on my machine: https://hg.mozilla.org/mozilla-central/rev/136611faf48d

Before (Win32): 132.2 134.6 133.3
After (Win32) : 139.2 141.0 141.2  (+5.3%)

Before (Win64): 129.7 129.9 128.6
After (Win64) : 134.5 136.3 134.8  (+4.5%)
I'm not sure if it is something we should do, but now that we have Speedometer vendored in, it seems like it would be a bit easier to add Speedometer to Talos. I'll let Joel and others contemplate that.
I have been considering running Speedometer in Talos; now that it is in-tree, that saves some hassle :)
(In reply to Joel Maher ( :jmaher) (UTC-5) from comment #105)
> I have been considering running Speedometer in Talos; now that it is in-tree,
> that saves some hassle :)

I don't know if we have a bug for this, but it would be helpful to catch regressions like bug 1395727.  Right now we are relying on folks eyeballing the AWFY graphs, AFAIK.
Attachment #8900866 - Attachment is obsolete: true
Attachment #8900867 - Attachment is obsolete: true
Attachment #8900872 - Attachment is obsolete: true
Depends on: 1411661
Product: Core → Firefox Build System

See also: bug 1503559, an experiment with adding full pageload to the training set.

Performance Impact: --- → P1
Whiteboard: [qf:p1]