Closed Bug 837673 Opened 11 years ago Closed 11 years ago

[contacts] "ready to use" perf measurement

Categories

(Firefox OS Graveyard :: Gaia::Contacts, defect)

ARM
Gonk (Firefox OS)
defect
Not set
normal

Tracking

(b2g18+ fixed)

RESOLVED FIXED
Tracking Status
b2g18 + fixed

People

(Reporter: julienw, Assigned: julienw)

References

Details

Attachments

(1 file, 2 obsolete files)

We need to measure when the app is usable by the user. For that we'll need to dispatch an event to |window| (the exact moment is app-specific) that the performance test will be able to receive.

The same event name, "x-moz-perf-user-ready", can be used across all apps so that the performance tests can be similar.

For the Contacts app, it should be triggered when the contact list is displayed and the user can actually manipulate it (scroll / operate the controls).

Wondering if we should do 2 measurements: one for scrolling and one for the other actions.
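The signal described above could be sketched like this (the event name follows the bug's proposal, not necessarily the final code; a plain EventTarget stands in for `window` so the sketch runs outside Gaia):

```javascript
'use strict';

// A plain EventTarget stands in for `window` so the sketch is self-contained.
const win = typeof window !== 'undefined' ? window : new EventTarget();

// Node may lack CustomEvent; a minimal stand-in keeps this runnable.
const CE = typeof CustomEvent !== 'undefined' ? CustomEvent :
  class extends Event {
    constructor(type, opts = {}) { super(type); this.detail = opts.detail; }
  };

// The performance test registers its listener first...
win.addEventListener('x-moz-perf-user-ready', () => {
  console.log('app reports it is usable');
});

// ...then the app fires the event once its UI is actually usable,
// e.g. when the contact list is rendered and scrollable.
win.dispatchEvent(new CE('x-moz-perf-user-ready'));
```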
No longer depends on: gaia-perf-measure
Summary: "ready to use" perf measurement → [contacts] "ready to use" perf measurement
Initial work will be done in Bug 837139.
Depends on: 837139
Assignee: nobody → felash
Attached patch patch v1 (obsolete) — Splinter Review
- rename ContactsRenderingPerformance to PerformanceHelperAtom
- store all received events in the PerformanceHelperAtom, in a local variable
  instead of a global variable.
- Start refactoring the performance helper to make it an instantiable object. In
  the future, it will probably handle the loops too.
---
 apps/communications/contacts/js/contacts_list.js   |    6 ++
 .../test/atoms/contacts_rendering_performance.js   |   52 --------------
 .../contacts/test/integration/app.js               |   26 -------
 .../contacts/test/performance/rendering_test.js    |   42 ++++++++----
 shared/js/performance_helper.js                    |    7 +-
 tests/js/app_integration.js                        |   30 ++++++++-
 tests/performance/performance_helper.js            |   49 +++++++++++---
 tests/performance/performance_helper_atom.js       |   71 ++++++++++++++++++++
 tests/performance/startup_test.js                  |    6 +-
 9 files changed, 181 insertions(+), 108 deletions(-)
 delete mode 100644 apps/communications/contacts/test/atoms/contacts_rendering_performance.js
 create mode 100644 tests/performance/performance_helper_atom.js


I still want to check whether the performance events should be sent inside a setTimeout, because they're synchronous. Also, the first event is never caught, and I'd like to see if I can make it caught. But all this could be done in a follow-up; I'd like to land the basic infrastructure so that other people can start working on other apps.
Attachment #717943 - Flags: review?(anthony)
Attachment #717943 - Flags: feedback?
I think it would be handy if dispatchPerfEvent included a comment along these lines:
// dispatchEvent runs synchronously, so defer its invocation so we don't artificially inflate the run-time of our caller.
// This is also reasonable for performance events since we won't be able to respond to the user until our caller yields control back to the event loop anyways.
Separate thought: is there a reason you don't use pull requests?  While I do prefer bugzilla's side-by-side diffs to github, there are a lot of advantages of being able to just pull your gaia fork to try out code you have added or to use "git difftool" for a better review experience, etc.
(In reply to Andrew Sutherland (:asuth) from comment #4)
> Separate thought: is there a reason you don't use pull requests?  While I do
> prefer bugzilla's side-by-side diffs to github, there are a lot of
> advantages of being able to just pull your gaia fork to try out code you
> have added or to use "git difftool" for a better review experience, etc.

Yep, if I had known in advance that you'd want to try the code before it lands, I would definitely have used a PR (by the time I saw your message it was too late).

However, "wget -O - <attachment url> | git am" works very well.
Comment on attachment 717943 [details] [diff] [review]
patch v1

Review of attachment 717943 [details] [diff] [review]:
-----------------------------------------------------------------

::: shared/js/performance_helper.js
@@ +4,5 @@
>  
>    function dispatchPerfEvent(name) {
> +      var evt = new CustomEvent('x-moz-perf', { detail: { name: name } });
> +      setTimeout(function() {
> +        console.log('PerformanceHelper: dispatching event', name);

I'll comment out the log here.

::: tests/performance/performance_helper_atom.js
@@ +3,5 @@
> +(function(window) {
> +  var perfEvent = 'x-moz-perf',
> +      perfMeasurements = Object.create(null);
> +
> +  var DEBUG = true;

I'll set this to "false" before landing.
Attachment #717943 - Flags: feedback?
Comment on attachment 717943 [details] [diff] [review]
patch v1

Review of attachment 717943 [details] [diff] [review]:
-----------------------------------------------------------------

Looking good overall, except for the exact time we store.

::: apps/communications/contacts/test/performance/rendering_test.js
@@ +30,5 @@
>  
> +    var name;
> +    var results = {};
> +
> +    // FIXME have all this in performanceHelper

What's the bug number? :)

::: shared/js/performance_helper.js
@@ +1,5 @@
>  'use strict';
>  
>  (function(window) {
>  
>    function dispatchPerfEvent(name) {

It would be nice to not do anything if we're not testing performance. We shouldn't execute more code than necessary on an end-user device.
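One way to honor that, sketched below: pick a no-op implementation at load time unless testing is enabled. The `mozPerfEnabled` flag name is hypothetical — the landed patch gates on whether anything listens instead — and a plain EventTarget stands in for `window` so the sketch runs in Node.

```javascript
'use strict';

// A plain EventTarget stands in for `window`; Node may lack CustomEvent.
const win = typeof window !== 'undefined' ? window : new EventTarget();
const CE = typeof CustomEvent !== 'undefined' ? CustomEvent :
  class extends Event {
    constructor(type, opts = {}) { super(type); this.detail = opts.detail; }
  };

function noop() {}

function realDispatchPerfEvent(name) {
  win.dispatchEvent(new CE('x-moz-perf', { detail: { name: name } }));
}

// Chosen once at load time: end-user devices get the free no-op.
// (`mozPerfEnabled` is a hypothetical opt-in flag a test would set.)
const dispatchPerfEvent = win.mozPerfEnabled ? realDispatchPerfEvent : noop;
```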

@@ +2,5 @@
>  
>  (function(window) {
>  
>    function dispatchPerfEvent(name) {
> +      var evt = new CustomEvent('x-moz-perf', { detail: { name: name } });

We should record the time here. Otherwise, we record when the event was actually dispatched, not when the app declared it reached this state.
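A sketch of that point: capture the timestamp when the app declares the state reached, and defer only the (synchronous) dispatch. Again a plain EventTarget stands in for `window` so this runs in Node; the `timestamp` detail field is illustrative, not the patch's exact shape.

```javascript
'use strict';

const win = typeof window !== 'undefined' ? window : new EventTarget();
const CE = typeof CustomEvent !== 'undefined' ? CustomEvent :
  class extends Event {
    constructor(type, opts = {}) { super(type); this.detail = opts.detail; }
  };

function dispatchPerfEvent(name) {
  var timestamp = Date.now();   // the state is reached *now*...
  setTimeout(function() {
    // ...but dispatching later avoids inflating the caller's run time.
    win.dispatchEvent(new CE('x-moz-perf',
      { detail: { name: name, timestamp: timestamp } }));
  });
}
```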

::: tests/performance/performance_helper.js
@@ +11,5 @@
> +      }
> +    }
> +  }
> +
> +  /* opts can have the followint keys:

typo: following

@@ +12,5 @@
> +    }
> +  }
> +
> +  /* opts can have the followint keys:
> +   * - spawnInterval (optional): defines how much seconds we must wait before

how many

@@ +14,5 @@
> +
> +  /* opts can have the followint keys:
> +   * - spawnInterval (optional): defines how much seconds we must wait before
> +   *   launching an app. Default is 6000.
> +   * - runs (optional): defines how much runs we should do. Default is 5.

ditto

@@ +34,5 @@
> +    extend(this, opts);
> +  }
> +
> +  extend(PerformanceHelper, {
> +    // FIXME encapsulate this in a nice object like ContactsRenderingPerformance

Nothing specific to Contacts here.

::: tests/performance/performance_helper_atom.js
@@ +46,5 @@
> +    marionetteScriptFinished(perfMeasurements);
> +  }
> +
> +  window.PerformanceHelperAtom = {
> +    register: function() {

This function should check if a listener has already been registered. Otherwise you'll run into bug 842616.
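A sketch of that guard: register() tracks whether its listener is already attached, so a second call is a no-op and events are not counted twice (the failure mode of bug 842616). The internal names here are illustrative, and a plain EventTarget stands in for `window` so the sketch runs in Node.

```javascript
'use strict';

const win = typeof window !== 'undefined' ? window : new EventTarget();
const CE = typeof CustomEvent !== 'undefined' ? CustomEvent :
  class extends Event {
    constructor(type, opts = {}) { super(type); this.detail = opts.detail; }
  };

const PerformanceHelperAtom = {
  _registered: false,
  _events: [],

  register: function() {
    if (this._registered) {
      return;                       // already listening: do nothing
    }
    this._registered = true;
    win.addEventListener('x-moz-perf', (evt) => {
      this._events.push(evt.detail.name);
    });
  }
};
```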
Attachment #717943 - Flags: review?(anthony)
Attached patch patch v2 (obsolete) — Splinter Review
fixed reviews from Rik.

- rename ContactsRenderingPerformance to PerformanceHelperAtom
- store all received events in the PerformanceHelperAtom, in a local variable
  instead of a global variable.
- Start refactoring the performance helper to make it an instantiable object. In
  the future, it will probably handle the loops too.
- don't send perf events if we don't listen
---
 apps/communications/contacts/js/contacts.js        |    1 +
 apps/communications/contacts/js/contacts_list.js   |    5 ++
 .../test/atoms/contacts_rendering_performance.js   |   52 -----------
 .../contacts/test/integration/app.js               |   26 ------
 .../contacts/test/performance/rendering_test.js    |   38 +++++---
 shared/js/performance_helper.js                    |   21 ++++-
 tests/js/app_integration.js                        |   30 ++++++-
 tests/performance/performance_helper.js            |   92 +++++++++++++++++---
 tests/performance/performance_helper_atom.js       |   82 +++++++++++++++++
 tests/performance/startup_test.js                  |    9 +-
 10 files changed, 246 insertions(+), 110 deletions(-)
 delete mode 100644 apps/communications/contacts/test/atoms/contacts_rendering_performance.js
 create mode 100644 tests/performance/performance_helper_atom.js
Attachment #717943 - Attachment is obsolete: true
Attachment #719533 - Flags: review?(anthony)
Comment on attachment 719533 [details] [diff] [review]
patch v2

stealing review per :julienw's request
Attachment #719533 - Flags: review?(anthony) → review?(jlal)
Depends on: 846871
No longer depends on: 846871
Comment on attachment 719533 [details] [diff] [review]
patch v2

updated the PR in https://github.com/mozilla-b2g/gaia/pull/8386

will squash and upload a new patch here when it's ready.

Also, the tests fail when there is no contact at all, I filed Bug 846871 for that.
James, I need an r+ here to land ;-)
Comment on attachment 719533 [details] [diff] [review]
patch v2

R+, but Travis is failing — it seems you had a mock for the performance helper, or something is missing/changed? Please fix the test failure first, then it's good to go.
Attachment #719533 - Flags: review?(jlal) → review+
Strange, I think I ran the tests locally and didn't see that. Thanks, Travis :)

But I know exactly what it is; I'll fix it tomorrow. Thanks!
Attached patch patch v3 — Splinter Review
carrying r=jlal
fixed tests

see also PR https://github.com/mozilla-b2g/gaia/pull/8386
Attachment #719533 - Attachment is obsolete: true
Attachment #720164 - Flags: review+
master: https://github.com/mozilla-b2g/gaia/commit/dac9b2a8a36305844651aba96a3a1904e74e648d

finally fixed it now :)
Status: NEW → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
tracking-b2g18: --- → ?
Comment on attachment 720164 [details] [diff] [review]
patch v3

Asking for approval because we also want to test performance on the v1-train branch.

There is a very small change in the Contacts app (6 lines added, 1 line changed), which sends an asynchronous event, so there is no risk. The other changes are in the test framework.
Attachment #720164 - Flags: approval-gaia-v1?(21)
Comment on attachment 720164 [details] [diff] [review]
patch v3

Approving for v1.1 - please add qawanted and provide testing details if there's any chance of a new user perf regression introduced by these changes (as ironic as that sounds).
Attachment #720164 - Flags: approval-gaia-v1?(21) → approval-gaia-v1+
I was not able to uplift this bug to v1-train.  If this bug has dependencies which are not marked in this bug, please comment on this bug.  If this bug depends on patches that aren't approved for v1-train, we need to re-evaluate the approval.  Otherwise, if this is just a merge conflict, you might be able to resolve it with:

  git checkout v1-train
  git cherry-pick -x  dac9b2a8a36305844651aba96a3a1904e74e648d
  <RESOLVE MERGE CONFLICTS>
  git commit
This probably needed Bug 837139 to be uplifted too; I asked for approval there.
Bug 838527 will need to be uplifted first too; I asked for approval there as well.
v1-train: b05ee4e8cc45b9b4400308716dd66e8fec91195e