Closed
Bug 840199
Opened 12 years ago
Closed 12 years ago
Write a wrapper for 'make test-perf' that can post the data to datazilla
Categories
(Firefox OS Graveyard :: Gaia, defect)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: jgriffin, Assigned: jgriffin)
References
Details
In order to graph the data produced by 'make test-perf', we'll need to write a wrapper to process the output JSON that is produced by these tests, and post it to datazilla.
Comment 1•12 years ago
These tests are now invoked like so:
NO_DEBUG_OUTPUT=1 APPS="sms email" make -s test-perf
and they produce output like the following (to stdout):
[
{
"stats": {
"suites": 1,
"tests": 1,
"passes": 1,
"pending": 0,
"failures": 0,
"start": "2013-02-20T21:17:05.178Z",
"end": "2013-02-20T21:17:57.015Z",
"duration": 51837,
"application": "sms"
},
"failures": [],
"passes": [
{
"title": "startup time ",
"fullTitle": "sms > startup time ",
"duration": 51420,
"mozPerfDurations": [
1562,
1562,
1496,
1496,
1508,
1508,
1505,
1505,
2474,
2474
],
"mozPerfDurationsAverage": 1709
}
]
}
,
{
"stats": {
"suites": 1,
"tests": 1,
"passes": 1,
"pending": 0,
"failures": 0,
"start": "2013-02-20T21:17:57.112Z",
"end": "2013-02-20T21:18:53.499Z",
"duration": 56387,
"application": "email"
},
"failures": [],
"passes": [
{
"title": "startup time ",
"fullTitle": "email > startup time ",
"duration": 55973,
"mozPerfDurations": [
3822,
3822,
3822,
2907,
2907,
2907,
2931,
2931,
2931,
2892,
2892,
2892,
2947,
2947,
2947
],
"mozPerfDurationsAverage": 3099.8
}
]
}
]
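The wrapper could start by parsing that stdout JSON into per-app results before posting anywhere. A minimal sketch, assuming the output shape shown above (`extract_averages` and the trimmed `sample` are hypothetical names for illustration; the actual datazilla submission, which needs OAuth credentials, is left out):

```python
import json

def extract_averages(raw_json):
    """Map application -> {test title: mozPerfDurationsAverage in ms}."""
    results = {}
    for suite in json.loads(raw_json):
        app = suite["stats"]["application"]
        for test in suite["passes"]:
            # Titles in the output carry a trailing space, so strip them.
            results.setdefault(app, {})[test["title"].strip()] = (
                test["mozPerfDurationsAverage"])
    return results

# Trimmed sample in the shape of the 'make test-perf' output above.
sample = json.dumps([
    {"stats": {"application": "sms"},
     "passes": [{"title": "startup time ",
                 "mozPerfDurations": [1562, 1496, 1508],
                 "mozPerfDurationsAverage": 1709}]},
])
print(extract_averages(sample))  # -> {'sms': {'startup time': 1709}}
```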
Comment 2•12 years ago
I added a Jenkins job for this, http://qa-selenium.mv.mozilla.com:8080/view/B2G/job/b2g.unagi.gaia.master.mozperfdurations, which I'm trying out now, sans the datazilla submission.
Comment 3•12 years ago
The Jenkins job failed with:
Error: done() invoked with non-Error: JavaScriptError: (17) TypeError: app is null
Remote Stack:
<none>
    at Runnable.prototype.run/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/test_apps/test-agent/common/vendor/mocha/mocha.js:3711)
    at overload/wrapDone/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/test_apps/test-agent/common/test/mocha_generators.js:46)
    at overload/window[type]/wrapper/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/test_apps/test-agent/common/test/mocha_generators.js:69)
    at MochaTask.nextNodeStyle (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/test_apps/test-agent/common/test/mocha_task.js:96)
    at next (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/app_integration.js:159)
    at Client.prototype._handleCallback (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:1309)
    at Client.prototype._sendCommand/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:1329)
    at _onDeviceResponse (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:2149)
    at Tcp.prototype._onClientCommand (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:2381)
    at emit/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:156)
    at emit (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:155)
    at _handleCommand (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:818)
    at _readBuffer (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:858)
    at add (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:897)
    at emit/< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:156)
    at emit (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:155)
    at SocketWrapper/</rawSocket[method]< (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/tests/js/vendor/marionette.js:2298)
    at ts_callListener (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/xulrunner-sdk/bin/components/TCPSocket.js:253)
    at ts_onDataAvailable (/var/jenkins/workspace/b2g.unagi.gaia.master.mozperfdurations/xulrunner-sdk/bin/components/TCPSocket.js:508)
I'll have to try to reproduce this using the same build Jenkins is using.
Updated•12 years ago
Comment 4•12 years ago
Just a few clarifications:
- NO_DEBUG_OUTPUT=1 is no longer used. Output is hidden by default now; to show it, use VERBOSE=1.
- if you don't use the APPS environment variable, it runs the tests for all apps. Some are failing right now.
- some durations are present several times; this is Bug 842616.
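Until Bug 842616 is fixed, the wrapper could collapse those repeated consecutive durations as a stopgap. A sketch under that assumption (`dedupe_consecutive` is a hypothetical helper, and note that a genuinely repeated measurement would also be collapsed, so this is only a workaround):

```python
from itertools import groupby

def dedupe_consecutive(durations):
    """Collapse runs of repeated values, e.g. [1562, 1562, 1496, 1496] -> [1562, 1496]."""
    return [value for value, _ in groupby(durations)]

# The email durations above repeat each value three times.
print(dedupe_consecutive([3822, 3822, 3822, 2907, 2907, 2907]))  # [3822, 2907]
```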
Updated•12 years ago
Blocks: gaia-perf-measure
Comment 5•12 years ago
The "app is null" message is Bug 841728.
It happens when you have an app in your apps/ directory that does not exist on the device, for example when an app has been removed from the apps/ directory in the git repository.
Comment 6•12 years ago
The error above was caused by a casing problem: APP=Email fails, APP=email succeeds.
Comment 7•12 years ago
Assignee: nobody → jgriffin
Comment 8•12 years ago
This is running in production now as job b2g.unagi.gaia.master.mozperftest. davehunt, can you add the datazilla OAuth credentials to that job when you get a chance? I don't know them.
Comment 9•12 years ago
I've added the credentials to the job and triggered a build. I've also forwarded these to your mozilla.com email for future reference.
Comment 10•12 years ago
The data is showing up at https://datazilla.mozilla.org/b2g/ but a lot of the applications are duplicated. The duplicates are:
contacts | communications/contacts
phone | communications/dialer
messages | sms
The b2gperf tests use the application name; however, I suspect the test-perf tests are using the entry point or folder hierarchy.
Another option is to only show apps with data for the selected 'test' in DataZilla.
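One way for the wrapper to reconcile the two naming schemes would be a small alias table built from the duplicates listed above (`APP_ALIASES` and `canonical_app_name` are hypothetical names for illustration, not anything either harness ships):

```python
# Hypothetical alias table derived from the duplicate pairs listed above.
APP_ALIASES = {
    "communications/contacts": "contacts",
    "communications/dialer": "phone",
    "sms": "messages",
}

def canonical_app_name(name):
    """Map a test-perf folder/entry-point name onto the b2gperf app name."""
    return APP_ALIASES.get(name, name)

print(canonical_app_name("communications/dialer"))  # phone
print(canonical_app_name("settings"))               # settings (unchanged)
```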
Comment 11•12 years ago
For most apps there is only one test (the startup measurement) right now but we plan to add more next week.
We may remove the first part for the "communications/*" app as this doesn't add much information. However we decided to use the folder name (which is also the first part of the manifest url) because the name is locale-dependent.
So you have seen the "passes" array for each app; each app will have multiple items in the "passes" array. The "title" entry is the test title, and the "fullTitle" entry includes the app name too.
Is it possible, in datazilla, to show all test results for a specific app?
Comment 12•12 years ago
(In reply to Julien Wajsberg [:julienw] from comment #11)
> For most apps there is only one test (the startup measurement) right now but
> we plan to add more next week.
>
> We may remove the first part for the "communications/*" app as this doesn't
> add much information. However we decided to use the folder name (which is
> also the first part of the manifest url) because the name is
> locale-dependent.
The following apps are also tested using b2gperf, which measures the time to the apploadtime event after initiating a launch: Phone, Contacts, Messages, Settings, Gallery, Music, Camera, Email, Calendar, Marketplace.
As this is performed external to the Gaia source repository, the folder names are not available. This is causing applications to appear twice in DataZilla.
> Is it possible, in datazilla, to show all test results for a specific app ?
It sounds to me like you would require another custom view to show this data. The b2gperf harness currently only takes one measurement, but is likely to be enhanced to include more.
Comment 13•12 years ago
(In reply to Dave Hunt (:davehunt) from comment #12)
> (In reply to Julien Wajsberg [:julienw] from comment #11)
> > For most apps there is only one test (the startup measurement) right now but
> > we plan to add more next week.
> >
> > We may remove the first part for the "communications/*" app as this doesn't
> > add much information. However we decided to use the folder name (which is
> > also the first part of the manifest url) because the name is
> > locale-dependent.
>
> The following apps are also tested using b2gperf, which measures the time to
> the apploadtime event after initiating a launch: Phone Contacts Messages
> Settings Gallery Music Camera Email Calendar Marketplace
>
> As this is performed external to the Gaia source repository, the folder
> names are not available. This is causing applications to appear twice in
> DataZilla.
Our current measurement does just that: it uses the apploadtime event. So maybe, once our perf measurements are stable enough, we can just remove the former?
>
> > Is it possible, in datazilla, to show all test results for a specific app ?
>
> It sounds to me like you would require another custom view to show this
> data. The b2gperf harness currently only takes one measurement, but is
> likely to be enhanced to include more.
The first thing is: I want to make sure that the additional measurements we'll add soon (next week) will show up in datazilla.
For example, in the script in [1], I believe you're not using "title" or "fullTitle" at all, so I'm afraid that all measurements will end up in the same averaged value, which would of course be wrong.
[1] https://github.com/davehunt/b2gperf/pull/1/files
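The concern above could be addressed by keying each submitted series on both the app and the test title rather than the app alone. A minimal sketch, assuming the test-perf output shape from comment 1 (`group_measurements` and the "first paint" title are hypothetical):

```python
from collections import defaultdict

def group_measurements(suites):
    """Build one series per (application, test title), so distinct
    tests are never averaged together into a single value."""
    series = defaultdict(list)
    for suite in suites:
        app = suite["stats"]["application"]
        for test in suite["passes"]:
            series[(app, test["title"].strip())].extend(test["mozPerfDurations"])
    return dict(series)

suites = [{"stats": {"application": "sms"},
           "passes": [{"title": "startup time ", "mozPerfDurations": [1562, 1496]},
                      {"title": "first paint ", "mozPerfDurations": [900, 910]}]}]
print(group_measurements(suites))
# -> {('sms', 'startup time'): [1562, 1496], ('sms', 'first paint'): [900, 910]}
```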
Comment 14•12 years ago
(In reply to Julien Wajsberg [:julienw] from comment #13)
> Our current measurement is doing just that: the apploadtime event. So maybe,
> when our perf measurements will be stable enough, we might just remove the
> former ?
Perhaps, but I'd like to better understand the differences between the two harnesses before we remove anything that's currently being used.
> > > Is it possible, in datazilla, to show all test results for a specific app ?
> >
> > It sounds to me like you would require another custom view to show this
> > data. The b2gperf harness currently only takes one measurement, but is
> > likely to be enhanced to include more.
>
> The first thing is : I want to make sure that the additional measurement
> we'll add soon (next week) will show up in datazilla.
>
> For example, in the script in [1], I believe you're not using "title" or
> "fullTitle" at all, so I'm afraid that all measurements will end up in the
> same averaged value, which is wrong of course.
>
> [1] https://github.com/davehunt/b2gperf/pull/1/files
The test name is taken from the title, as shown here: https://github.com/davehunt/b2gperf/commit/ad247ebc5243c182a96550a338f63eac78f6cfd8
Comment 15•12 years ago
Also, I've since added more apps to the b2gperf measurements, which has added the following duplicate app names:
ftu | communications/ftu
fm_radio | fm
Comment 16•12 years ago
(In reply to Dave Hunt (:davehunt) from comment #14)
> The test name is taken from the title, as shown here:
> https://github.com/davehunt/b2gperf/commit/
> ad247ebc5243c182a96550a338f63eac78f6cfd8
Perfect, thanks!
> Also, I've since added more apps to the b2gperf measurements, which has
> added the following duplicate app names:
>
> ftu | communications/ftu
> fm_radio | fm
We'll definitely remove the useless "communications/" prefix, which will already remove some duplicates.
How do you get your app names? Are they arbitrary, or do they come from the manifest?
Comment 17•12 years ago
We use the locateWithName method in the atoms to launch the apps. This matches the name against the entry points or manifest name. See https://github.com/mozilla/gaia-ui-tests/blob/master/gaiatest/atoms/gaia_apps.js#L66
Comment 18•12 years ago
While creating our perf tests, we decided to add new methods to the atom so that launching an app would not be locale-dependent: launchWithManifestURL and closeWithManifestURL.
Comment 19•12 years ago
This is in production so I'm going to close this; we can file other bugs if needed for removing the redundant "communications/" prefix.
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Comment 20•12 years ago
This is already opened as bug 844756 ;)