Closed Bug 976360 Opened 10 years ago Closed 10 years ago

Write b2g-cert-test meta-harness

Categories

(Firefox OS Graveyard :: General, defect)

Type: defect
Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: jgriffin, Assigned: ted)

References

Details

The B2G cert suite will be a collection of separate tools, like web-platform-tests and the guided WebAPI tests.  We'll need a meta-harness which is capable of running all or any of these, collecting output, handling errors, and generating a report.
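As a rough sketch of what that could look like (suite names, commands, and file names here are hypothetical, not the actual harness), the core loop just runs each suite as a subprocess, captures its output, records any errors, and writes a combined report:

    # Minimal meta-harness loop sketch: run each suite as a subprocess,
    # capture output, keep going on errors, emit one JSON report at the end.
    import json
    import subprocess
    import traceback

    SUITES = {
        # suite name -> command line (illustrative only)
        "omni-analyzer": ["python", "omni_analyzer.py"],
        "webapi-verifier": ["python", "webapi_verifier.py"],
    }

    def run_all(suites):
        report = {}
        for name, cmd in suites.items():
            try:
                proc = subprocess.run(cmd, capture_output=True, text=True)
                report[name] = {
                    "returncode": proc.returncode,
                    "stdout": proc.stdout,
                    "stderr": proc.stderr,
                }
            except Exception:
                # Don't abort the whole run if one suite blows up; record the
                # traceback so partners' logs contain enough to debug remotely.
                report[name] = {"error": traceback.format_exc()}
        return report

    if __name__ == "__main__":
        with open("certsuite-results.json", "w") as f:
            json.dump(run_all(SUITES), f, indent=2)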
Assignee: nobody → ted
We should consider whether or not we want the meta-harness to call out to the processes in the fxos-certsuite and certtest repos.

If we do, we need to mandate that they report test results back in a certain format.  Alternatively, the former tests could be rewritten into actual test classes and test functions in one shared test harness.  

That approach would also solve the problem of running subsets of tests, since we wouldn't have to duplicate solutions for that.  I think it would be the better approach long term, but I'm unsure what the overhead is relative to our delivery date.
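For concreteness, here's a hypothetical shape for such a mandated result format and the trivial aggregation it would enable; the field names are illustrative, not anything fxos-certsuite or certtest currently emit:

    # Hypothetical per-suite result format: if each external process writes
    # JSON shaped like this, the meta-harness can aggregate results without
    # knowing anything suite-specific.
    example_result = {
        "suite": "webapi-verifier",
        "passed": 41,
        "failed": 2,
        "failures": [
            {"test": "test_navigator_battery", "message": "missing attribute 'level'"},
        ],
    }

    def aggregate(results):
        # The overall run fails if any suite reported a failure.
        return "PASS" if all(r["failed"] == 0 for r in results) else "FAIL"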
Our Sprint 1 (March 17th) deliverable here is:

* A meta-harness that can be used by someone to invoke test suites separately or sequentially.
** Should be able to aggregate data from separate test runs.
** Should attempt to install Marionette (bug 976354), and if that fails, skip the tests that require Marionette and run the remainder (see the sketch after this list).
** Should split the existing tests in https://github.com/mozilla-b2g/fxos-certsuite into 3 suites:  omni-analyzer, webapi-verifier, and informer (the parts which gather info via adb).
** Should have an entry point for launching guided WebAPI tests (bug 975107), or a subset thereof.
** Should handle errors well; it should log all pertinent error information (stack traces, error messages, etc.) - we should assume that we will need to debug errors reported by partners using only the information that appears in the log generated by this harness.
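A minimal sketch of the install-Marionette-or-skip behaviour, assuming the Marionette client is importable once installed. The package name and helper names are placeholders, and real installation per bug 976354 would attempt an install first rather than only an import check:

    import logging

    def marionette_available():
        # Only checks importability; the actual harness would first try to
        # install the client (bug 976354). The package name is a guess.
        try:
            import marionette_client  # noqa: F401
            return True
        except ImportError:
            logging.exception("Marionette not available; skipping dependent suites")
            return False

    def select_suites(all_suites):
        # all_suites: list of dicts like {"name": ..., "requires_marionette": bool}
        have_marionette = marionette_available()
        return [s for s in all_suites
                if have_marionette or not s.get("requires_marionette")]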
Our Sprint 2 goal (due Mar 27) is to enhance the meta-harness with:

* a packaging script to resolve dependencies on PyPI, etc.
* integration of Marionette installation
* test filtering (one possible shape is sketched below)
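For the test-filtering item, one possible shape is an argparse front end that lets the caller include or exclude suites by name; the flag names below are illustrative, not the actual harness interface:

    import argparse

    def parse_args(argv=None):
        parser = argparse.ArgumentParser(description="b2g-cert-test meta-harness")
        parser.add_argument("--include", action="append", default=[],
                            help="only run suites matching this name")
        parser.add_argument("--exclude", action="append", default=[],
                            help="skip suites matching this name")
        return parser.parse_args(argv)

    def filter_suites(names, args):
        # names: list of suite names known to the harness
        if args.include:
            names = [n for n in names if n in args.include]
        return [n for n in names if n not in args.exclude]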
I noticed the meta-harness spits out two files in the zip: cert_results.json and certsuite-results.json.  Will we be keeping both? What's the distinction?
One is a summary view, and one contains the details.  We should name them more distinctly, though.
I didn't put a lot of thought into naming there, suggestions welcome. The former is the results.json from cert.py, it just gets included wholesale. The latter is a summary view of all test suites run (it's only running the one suite now so it's not super exciting).
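To make the distinction concrete, here is a sketch of how the two files could end up in the zip: the detailed results file from cert.py is copied in as-is, and the harness writes its own summary alongside it. The zip name and function name are illustrative, not the actual code:

    import json
    import zipfile

    def write_results_zip(detail_path, summary, out_path="certsuite-report.zip"):
        with zipfile.ZipFile(out_path, "w") as zf:
            # Detailed per-test results from cert.py, included wholesale.
            zf.write(detail_path, "cert_results.json")
            # Harness-level summary across all suites that were run.
            zf.writestr("certsuite-results.json", json.dumps(summary, indent=2))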
(In reply to Ted Mielczarek [:ted.mielczarek] from comment #6)
> I didn't put a lot of thought into naming there, suggestions welcome. The
> former is the results.json from cert.py, it just gets included wholesale.
> The latter is a summary view of all test suites run (it's only running the
> one suite now so it's not super exciting).

Cool, it seems like this needs to be updated because of this change, which landed over the weekend: https://github.com/mozilla-b2g/fxos-certsuite/commit/7a4b64d267edf6b8988e853107cccfefc529bf6f.  The results are not picked up by certsuite-results.json; mine says PASS even though some of the webapi IDL tests failed.
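A minimal sketch of what picking those results up could look like: derive the summary status from the detailed results file rather than from the suite's exit status, so IDL test failures flip the run to FAIL. The key names here are guesses at the cert.py output, not the real schema:

    import json

    def summarize(detail_path="cert_results.json"):
        # Assumes the detail file maps test name -> {"status": "PASS"/"FAIL"/"SKIP", ...}.
        with open(detail_path) as f:
            details = json.load(f)
        failures = [name for name, result in details.items()
                    if result.get("status") not in ("PASS", "SKIP")]
        return {"status": "FAIL" if failures else "PASS", "failures": failures}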
The meta-harness has been written; we can file bugs/enhancements as follow-ups.
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED