Created attachment 8558056 [details]
v2dot1.png

We noticed that there has been no Flame v2.1 cold launch time data since 9 Dec 2014.

@ Datazilla - Flame v2.1 link: https://datazilla.mozilla.org/?start=1416771974&stop=1422599988&product=B2G&repository=v2.1&os=Firefox OS&os_version=220.127.116.11-prerelease&test=sms&x86=false&x86_64=false&error_bars=false&compare_product=B2G&compare_repository=v2.1&project=b2g
@ Screenshot: v2dot1.png
Hi Eli, may I have your help? We have kicked off Dolphin v2.1, so stakeholders want to access the Flame v2.1 cold launch time data to do a performance comparison. However, we found there has been no Flame v2.1 cold launch time data on Datazilla since 9 Dec 2014. Do you know the root cause? Many thanks!
Related logs
============
07:48:27 + gaiaperf --sources=/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/sources.xml --dz-project=b2g --dz-branch=v2.1 --dz-device=flame-319MB --dz-key=**** --dz-secret=**** --dz-build-url=http://jenkins1.qa.scl3.mozilla.com/job/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/443/ /var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/perf.json
07:48:31 Traceback (most recent call last):
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/bin/gaiaperf", line 9, in <module>
07:48:31     load_entry_point('b2gperf==0.32', 'console_scripts', 'gaiaperf')()
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/b2gperf/mozperf.py", line 94, in cli
07:48:31     device_serial=options.device_serial)
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/b2gperf/b2gperf.py", line 70, in __init__
07:48:31     settings = gaiatest.GaiaData(self.marionette).all_settings
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/gaiatest/gaia_test.py", line 148, in __init__
07:48:31     self.apps = GaiaApps(marionette)
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/gaiatest/gaia_test.py", line 41, in __init__
07:48:31     self.marionette.import_script(js)
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/marionette/marionette.py", line 1406, in import_script
07:48:31     return self._send_message('importScript', 'ok', script=js)
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/marionette/decorators.py", line 36, in _
07:48:31     return func(*args, **kwargs)
07:48:31   File "/var/jenkins/1/workspace/flame-kk-319.mozilla-b2g34_v2_1.perf.gaia/.env/local/lib/python2.7/site-packages/marionette/marionette.py", line 596, in _send_message
07:48:31     raise errors.MarionetteException("Please start a session")
07:48:31 marionette.errors.MarionetteException: MarionetteException: Please start a session
07:48:31
07:48:31 Build step 'Execute managed script' marked build as failure
07:48:31 Archiving artifacts
07:48:31 Finished: FAILURE
===========

Dave, I'm not overly familiar with the gaiaperf script; would you happen to know what may have gone wrong here?
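For context, the failure pattern in the trace above can be sketched as follows. This is an illustrative mock, not the real marionette client: the class name MarionetteSketch and its internals are invented for demonstration. The point it shows is that every command (including import_script, the call gaiatest makes in its constructor) is guarded behind an active session, so any command issued before start_session() raises "Please start a session" immediately.

```python
# Illustrative sketch of why the traceback ends in "Please start a session".
# Names here (MarionetteSketch, the session dict) are hypothetical; the real
# client performs the equivalent guard inside _send_message().

class MarionetteException(Exception):
    pass


class MarionetteSketch(object):
    def __init__(self):
        self.session = None  # no session exists until start_session() succeeds

    def start_session(self):
        # The real client negotiates a session with the device here;
        # this sketch just records that one now exists.
        self.session = {"id": 1}

    def _send_message(self, name, key, **params):
        # Every command is rejected up front if no session is active.
        if self.session is None:
            raise MarionetteException("Please start a session")
        return {"ok": True}

    def import_script(self, js):
        return self._send_message("importScript", "ok", script=js)


client = MarionetteSketch()
try:
    client.import_script("gaia_apps.js")  # fails: no session has been started
except MarionetteException as e:
    print(e)  # prints: Please start a session

client.start_session()
print(client.import_script("gaia_apps.js"))  # succeeds once a session exists
```

If a gaiatest change caused the session to be dropped (or never established) before GaiaApps runs, every subsequent command would fail exactly like this.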
Assignee: nobody → eperelman
OS: Windows 7 → Gonk (Firefox OS)
Hardware: x86_64 → ARM
This coincides with https://github.com/mozilla-b2g/gaia/commit/dd1b3ae3dd10e5a897b77938fe6dde2e47190e84. I suspect that changes to gaiatest to continue supporting the master/mozilla-central builds have affected b2gperf running against older branches. Unfortunately, even though we no longer use b2gperf to gather performance data, it is still used by make test-perf to submit the data to Datazilla.

The likely solution is to release a gaiatest-v2.1 package from the v2.1 branch of gaia, and to create a v2.1 branch/package of b2gperf for use with those builds.
I've just noticed that the 2.0 job is also failing with a similar trace, and that job is not using b2gperf-v2.0, which it probably should be. I'll see if I can get these running again. I haven't been looking at the performance results for some months now, as I've been aware of some issues that Eli was working on (and of the new framework he's developing).
Cool! Thanks Dave and Eli! :D
2.1 is fixed now, but I notice we're not running tests against v2.2. Perhaps this isn't an issue though?
Status: ASSIGNED → RESOLVED
Last Resolved: 3 years ago
Resolution: --- → FIXED
Wow! I see the data. I will tell the stakeholders the good news. Thanks Dave and Eli! (+1 like)