don't dump out 'pending metrics' on Raptor benchmark timeouts, since there are none

RESOLVED FIXED in Firefox 68

Status

Type: enhancement
Priority: P1
Severity: normal
Status: RESOLVED FIXED
Opened: 4 months ago
Closed: 4 months ago

People

(Reporter: Bebe, Assigned: alexandrui)

Tracking

Version: Version 3
Target Milestone: mozilla68
Points: ---

Firefox Tracking Flags

(firefox68 fixed)

Attachments

(1 attachment)

This is a continuation of Bug 1532671.
Add isBenchmarkPending to the logged measurement list of runner.js.

Assignee: nobody → alexandru.ionescu

Looking at this further, it really isn't necessary to have 'pending measures' dumped at all when a Raptor benchmark times out. The existing timeout messages already make it obvious that the test is a benchmark (and which one):

13:30:35 INFO - raptor-control-server received webext_raptor-page-timeout: [u'raptor-speedometer-firefox'...
...
13:30:52 INFO - raptor-main TEST-UNEXPECTED-FAIL: no raptor test results were found for raptor-speedometer-firefox

Instead, please change this patch so that the pending metrics string is only dumped out in the timeout message when running a page-load type of test. When running a benchmark type of test, don't add any 'pending metrics' at all. Thanks!
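
A minimal sketch of that behaviour, assuming Raptor's usual 'pageload' vs 'benchmark' test types and using the stdlib logging module in place of the harness logger; the function and variable names below are illustrative, not the actual raptor.py code:

    import logging

    LOG = logging.getLogger("raptor")

    def log_page_timeout(test_name, page_url, test_type, pending_metrics=None):
        # Base timeout message; this much is emitted for every test type.
        message = ("TEST-UNEXPECTED-FAIL: test '%s' timed out loading test page: %s"
                   % (test_name, page_url))
        # Only page-load tests have per-measurement pending metrics worth
        # reporting; for benchmarks there are none, so add nothing extra.
        if test_type == "pageload" and pending_metrics:
            message += "; pending metrics: %s" % pending_metrics
        LOG.critical(message)

    # Benchmark timeout: no pending-metrics noise in the output.
    log_page_timeout("raptor-speedometer-firefox", "<benchmark page url>", "benchmark")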

Flags: needinfo?(alexandru.ionescu)
Summary: add benchmark measurement to raptor error handeling → don't dump out 'pending metrics' on Raptor benchmark timeouts, since there are none

bebe?

Flags: needinfo?(alexandru.ionescu) → needinfo?(fstrugariu)

As discussed on IRC, I would suggest (a rough sketch of the results.py side follows this list):

  • right from the raptor webext runner.js, only include pendingMetrics in the control server page-timeout post if it is a page-load type test

  • adjust control_server.py's call to self.results_handler.add_page_timeout to only pass the 3rd data param if it exists / was received from the runner

  • adjust results.py def add_page_timeout(self, test_name, page_url, pending_metrics) to make pending_metrics=None (optional)

  • adjust results.py self.page_timeout_list.append to only add the pending metrics if they were sent / are not None

  • and in raptor.py (line 700), in LOG.critical("TEST-UNEXPECTED-FAIL: test '%s' timed out loading test page: %s "..., only add the pending metrics line and values to this log statement if the test is a page-load test
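
A rough sketch of what the results.py suggestions could look like; the ResultsHandler below is heavily simplified, and the dict keys and test names in the usage example are assumptions rather than the actual Raptor code:

    class ResultsHandler(object):
        """Simplified stand-in for Raptor's results handler (sketch only)."""

        def __init__(self):
            self.page_timeout_list = []

        def add_page_timeout(self, test_name, page_url, pending_metrics=None):
            # pending_metrics is now optional: the webext runner only posts it
            # for page-load tests, so benchmark timeouts arrive without it.
            timeout_details = {"test_name": test_name, "url": page_url}
            if pending_metrics is not None:
                # Only record pending metrics when they were actually sent.
                timeout_details["pending_metrics"] = pending_metrics
            self.page_timeout_list.append(timeout_details)

    handler = ResultsHandler()
    # Page-load timeout: pending metrics are included.
    handler.add_page_timeout("raptor-tp6-amazon-firefox", "https://www.amazon.com",
                             "fcp:pending, loadtime:pending")
    # Benchmark timeout: no third argument, so nothing extra is stored or logged.
    handler.add_page_timeout("raptor-speedometer-firefox", "<benchmark page url>")

The same None check carries over to control_server.py: it can simply omit the third argument whenever the runner's page-timeout post did not include pendingMetrics.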

Flags: needinfo?(fstrugariu)

:marauder, what is the status of this? Your patch in phab still needs a small update (and try pushes after that). Thanks!

Status: NEW → ASSIGNED
Flags: needinfo?(marian.raiciof)
Priority: -- → P1

I am working on this. I will make the changes today.

Flags: needinfo?(marian.raiciof)

Pushed by rwood@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/ccfd61724e92
add benchmark measurement to raptor error handeling r=rwood
Status: ASSIGNED → RESOLVED
Closed: 4 months ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla68