Closed Bug 1350446 Opened 7 years ago Closed 7 years ago

Pull gcno/gcda from Taskcluster to produce human-readable coverage locally?

Categories

(Testing :: Code Coverage, enhancement)
Priority: Not set
Severity: normal

Tracking

(Not tracked)
Status: RESOLVED FIXED

People

(Reporter: ekyle, Unassigned)

Details

There are currently only two ways to generate code coverage

1. Build locally, run tests locally, and use lcov tools to generate a coverage report
2. Trigger a try build with coverage option, and wait for coverage to appear in the coverage tool [1]

:mjzffr is exploring how to use one-click loaners and Taskcluster artifacts to produce coverage locally. If this works out, we may want to package the process into a script.

[1] https://ericdesj.github.io/moz-coco-w17-preview/
(In reply to Kyle Lahnakoski [:ekyle] from comment #0)
> There are currently only two ways to generate code coverage
> 
> 1. Build locally, run tests locally, and use lcov tools to generate a
> coverage report
> 2. Trigger a try build with coverage option, and wait for coverage to appear
> in the coverage tool [1]
> 
> :mjzffr is exploring how to use one-click loaners, and taskcluster artifacts
> to produce coverage locally. If this works out, we may want to package the
> process into a script.
> 
> [1] https://ericdesj.github.io/moz-coco-w17-preview/

There's now also:

3. Push to try and run https://github.com/marco-c/firefox-code-coverage/blob/master/code-coverage.py locally to download the gcno/gcda artifacts and generate a report (see https://developer.mozilla.org/en-US/docs/Mozilla/Testing/Measuring_Code_Coverage_on_Firefox#Generate_Code_Coverage_report_from_a_try_build_(or_any_other_treeherder_build))
I investigated running lcov and grcov on a One-Click Loaner. Here are my findings and steps to reproduce.

## Prerequisites

* Push to try, e.g. try: -b o -p linux64-ccov -u mochitest -t none
* Go to the B job on Treeherder and get a one-click loaner. Once the build finishes on the loaner, you'll find target.code-coverage-gcno.zip in ~/artifacts

In the loaner's shell:
* Download the gcda artifact from the test job of interest (e.g. mochitest-1) on the same try push (see the link in its Job Details)
* Make the tools used by the build task available: export PATH=/home/worker/workspace/build/src/gcc/bin:$PATH (This directory contains gcc 4.9.4. Otherwise, the default gcc version is 4.4.7, which is incompatible with the generated artifacts as well as with grcov.) [1]
* Install lcov (yum install lcov) -- note that lcov 1.10 is the latest version you can install on a one-click loaner
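The PATH step above matters because gcov must match the compiler that produced the counters. A minimal sketch of that setup as a Python helper (hypothetical function name; the path is the loaner default mentioned above):

```python
import os

# Default location of the build's GCC on the one-click loaner (an
# assumption based on the layout described in this comment).
BUILD_GCC_BIN = "/home/worker/workspace/build/src/gcc/bin"

def loaner_env(base_env=None):
    """Return an environment dict with the build's GCC (4.9.4) first on
    PATH, so gcov matches the compiler that wrote the .gcno/.gcda files."""
    env = dict(base_env if base_env is not None else os.environ)
    env["PATH"] = BUILD_GCC_BIN + os.pathsep + env.get("PATH", "")
    return env

# On Linux this prepends the build GCC directory with a ":" separator.
print(loaner_env({"PATH": "/usr/bin"})["PATH"])
```

Such an environment dict can then be passed to subprocess calls that invoke lcov or gcov.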


## lcov
* Extract the gcno and gcda zip files into the src dir: ~/workspace/build/src
* lcov -d /home/worker/workspace/build/src -c --ignore-errors source,graph -o mochitest-e10s.info
* genhtml -o reports/coverage --show-details --highlight  --ignore-errors source --legend mochitest-e10s.info
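For scripting, the two commands above can be expressed as argv lists (a sketch; the paths and flags are copied from the steps above, the helper names are mine):

```python
# Source directory on the loaner, as used in the steps above.
SRC = "/home/worker/workspace/build/src"

def lcov_cmd(out_info):
    # Capture coverage counters from the extracted gcno/gcda tree.
    return ["lcov", "-d", SRC, "-c",
            "--ignore-errors", "source,graph", "-o", out_info]

def genhtml_cmd(info, out_dir="reports/coverage"):
    # Render the captured .info file as an HTML report.
    return ["genhtml", "-o", out_dir, "--show-details", "--highlight",
            "--ignore-errors", "source", "--legend", info]

print(" ".join(lcov_cmd("mochitest-e10s.info")))
```

Each list can be handed straight to subprocess.run without shell quoting concerns.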

When I run lcov, I get a lot of warnings like
> Processing config/external/icu/i18n/decContext.gcda
> /home/worker/workspace/build/src/config/external/icu/i18n/decContext.gcda:stamp mismatch with notes file
> geninfo: WARNING: gcov did not create any files for /home/worker/workspace/build/src/config/external/icu/i18n/decContext.gcda!

You can get the resulting report here: https://github.com/mjzffr/ccov-reports/tree/master/lcov-one-click-210427-f170ebfc2dfd/coverage

## grcov (doesn't work, see https://github.com/marco-c/grcov/issues/20)
* Download and extract a grcov release
* Get code-coverage.py: git clone https://github.com/marco-c/firefox-code-coverage.git
* Upgrade pip, install virtualenv, and run pip install -r requirements.txt in the above repo
* Make directory expected by code-coverage.py: ~/ccov-artifacts and copy `*gcno.zip` and `*gcda.zip` there
* From ~: python code-coverage.py workspace/build/src $MH_BRANCH $GECKO_HEAD_REV --grcov path/to/grcov --no-download
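The last step above can be sketched as a small Python wrapper (hypothetical helper name; MH_BRANCH and GECKO_HEAD_REV are environment variables already set in the loaner's shell):

```python
import os

def coverage_cmd(src_dir, branch, rev, grcov_path):
    # Build the code-coverage.py invocation from the step above;
    # --no-download skips fetching artifacts already in ~/ccov-artifacts.
    return ["python", "code-coverage.py", src_dir, branch, rev,
            "--grcov", grcov_path, "--no-download"]

cmd = coverage_cmd("workspace/build/src",
                   os.environ.get("MH_BRANCH", "try"),
                   os.environ.get("GECKO_HEAD_REV", "<rev>"),
                   "path/to/grcov")
print(" ".join(cmd))
```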

However, I get the following errors when running grcov:
> ./grcov: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./grcov)
> ./grcov: /lib64/libc.so.6: version `GLIBC_2.18' not found (required by ./grcov)

I tried exporting LD_LIBRARY_PATH to include ~/workspace/build/src/gcc/lib64, but
that doesn't provide the missing library.

## conclusions
* lcov "works" on a one-click loaner, but I can't judge the quality/accuracy of the report
* Right now, the best way to get a ccov report locally is #3 from Comment 1, but that's still resource-heavy, so it would be great to run code-coverage.py on a one-click loaner instead.

[1] If you don't add to the PATH, running lcov yields errors like:

> /home/worker/workspace/build/src/caps/Unified_cpp_caps0.gcno:version '409*', prefer '404R'
> geninfo: ERROR: GCOV failed for /home/worker/workspace/build/src/caps/Unified_cpp_caps0.gcda!
We managed to get this working (see https://github.com/marco-c/grcov/issues/20). There are a couple of things we can do to make the process easier: https://github.com/marco-c/firefox-code-coverage/issues/5 and https://github.com/marco-c/firefox-code-coverage/issues/6. With these two issues fixed, users only have to get a one-click loaner, add the GCC used during the build to the PATH (or we could even do it as part of the code-coverage.py script) and run the code-coverage.py script.
A note: the option with lcov is actually only parsing one artifact, as when you unzip the artifacts in the same directory the gcda files are overwritten (this is why the lcov report only shows 5% coverage).
(In reply to Marco Castelluccio [:marco] from comment #4)
> A note: the option with lcov is actually only parsing one artifact, as when
> you unzip the artifacts in the same directory the gcda files are overwritten
> (this is why the lcov report only shows 5% coverage).

Yep, good to point that out: I intentionally only use the artifact for mochitest-1 in the example, so low coverage in the report is expected.
I thought I would pitch in with some automation that I built eons ago: <https://github.com/jcranmer/m-c-tools-code-coverage/blob/master/collect-try-results.py>. The model back then was to attach a patch to mozilla-central, push to try with try: -b do -p linux32,linux64 -u all -t none, wait for the results to complete, and then run this script. Since lcov was painfully slow, I rewrote a lot of lcov in Python as <https://github.com/jcranmer/mozilla-coverage>, which also produces output that I personally find slightly nicer than the original lcov output (as well as including my treemap view of the code coverage).

It hasn't been updated for 2 years, so the code probably needs a fair amount of finagling to work correctly (in particular, the Treeherder API seems to change every time I look at it). Once you get past the mess of figuring out which .gcno/.gcda files are actually available, the code should more or less work (barring the issue that I hardcode gcov-4.8, because the builders were using gcc 4.8 at the time). Which is to say that CoverageCollecter::computeCoverage more or less represents how to process one .gcno/.gcda pair, and the tail of the function collect_all_coverage represents how to combine them.

The main steps boil down to:
for each platform:
  download gcno zip
  for each test:
    download gcda zip
    extract gcda zip to a temporary directory
    find the common root of the gcno/gcda zip, extract gcno zip at that location
    remove jchuff.gcda (that step is probably unnecessary now; the bug is fixed in the GCC version we're using, I think)
    run ccov/lcov on the extracted directory, save to $OUT/$TEST-pre.info with testname $TNAME
    extract only files matching /build/* from $OUT/$TEST-pre.info, save to $OUT/$TEST.info (this filters out stuff like /usr/include/c++/blah)
    rewrite the absolute paths to relative paths from mozilla-central (using sed)
    delete $OUT/$TEST-pre.info and the temporary directory
combine all the $OUT/$TEST.info files into an all.info
run whatever tools you want on all.info (e.g., the ccov make_ui.py script)
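The final "combine" step is essentially what a Python rewrite of lcov does: sum the per-line execution counts across the per-test .info files. A minimal, self-contained sketch of that merge, parsing only plain SF:/DA: records (function names are mine):

```python
from collections import defaultdict

def parse_info(text):
    """Parse DA records from lcov .info text: {source_file: {line: count}}."""
    counts = defaultdict(lambda: defaultdict(int))
    src = None
    for raw in text.splitlines():
        line = raw.strip()
        if line.startswith("SF:"):
            src = line[3:]                     # start of a source-file section
        elif line.startswith("DA:") and src is not None:
            lineno, count = line[3:].split(",")[:2]  # ignore optional checksum
            counts[src][int(lineno)] += int(count)
        elif line == "end_of_record":
            src = None
    return counts

def merge(infos):
    """Combine several parsed .info files by summing execution counts."""
    total = defaultdict(lambda: defaultdict(int))
    for counts in infos:
        for src, lines in counts.items():
            for lineno, count in lines.items():
                total[src][lineno] += count
    return total

a = parse_info("SF:/build/foo.cpp\nDA:1,2\nDA:3,0\nend_of_record\n")
b = parse_info("SF:/build/foo.cpp\nDA:1,1\nDA:3,4\nend_of_record\n")
print(dict(merge([a, b])["/build/foo.cpp"]))  # -> {1: 3, 3: 4}
```

The same result can be had with stock lcov via `lcov -a a.info -a b.info -o all.info`, but an in-process merge avoids re-reading the tracefiles for every pair.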

Note that I retained high-level test names (e.g., xpcshell, mochitest-browser-chrome) but combined the xpcshell-N suites into xpcshell, etc. Originally, I also wanted to break down by platform (linux32, linux64, opt, debug, etc.), but Python memory restrictions killed that project.
I think we can call this fixed.
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED