Performance runtests: add ability to display all metrics that will be used in a test run

Status: ASSIGNED
Product: Tamarin
Component: Tools
Reported: 8 years ago
Modified: 8 years ago

People

Reporter: Chris Peyer
Assignee: Unassigned

Tracking

Blocks: 1 bug
Version: unspecified
Target Milestone: Future
Bug Flags: flashplayer-qrb+, flashplayer-triage+

Description (Reporter, 8 years ago)
From Felix:

Can we print out something at the very start of the runtests.py run that says "available metrics: ..." and lists the full set of metrics we will encounter in the run? This is probably not trivial for now, I suppose, since the metric indicator is produced by dynamically running the benchmark itself, rather than being stored as separate metadata. And it's not so bad to make the user parse the generated output by eye to determine what the set of available metrics is... But still, if you think of something real quick, it would be nice to be alerted up front that some benchmarks are going to generate metric data that I did not think to include in the run.
Flags: flashplayer-triage+
Flags: flashplayer-qrb?

Updated (8 years ago)
Assignee: nobody → cpeyer
Status: NEW → ASSIGNED
Flags: flashplayer-qrb? → flashplayer-qrb+
Target Milestone: --- → flash10.1.x-Salt

Updated (8 years ago)
Target Milestone: flash10.1.x-Salt → flash10.2.x-Spicy

Updated (8 years ago)
Target Milestone: flash10.2.x-Spicy → flash10.x - Serrano
Updated (Reporter, 8 years ago)
Blocks: 607714
Comment 1 (Reporter, 8 years ago)
We would have to store the benchmarks associated with each test somewhere.  At the moment I don't see an easy fix for this, so I'm going to defer to future.
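The missing piece is exactly that storage: if the metric names each test reports were kept as static metadata rather than discovered by running the benchmark, the harness could print them up front. A minimal sketch of what that could look like, assuming a hypothetical static mapping; the names `BENCHMARK_METRICS` and `print_available_metrics`, and the example test paths, are invented for illustration and are not part of the actual runtests.py:

```python
# Hypothetical per-test metric metadata. In the real harness this set is
# only known after the benchmark runs; storing it statically is the fix
# Comment 1 says would be required.
BENCHMARK_METRICS = {
    "sunspider/crypto-aes.as": ["time", "memory"],
    "jsbench/typed/Moldyn.as": ["time", "iterations/second"],
}

def print_available_metrics(tests):
    """Print the union of metric names the selected tests will report."""
    metrics = sorted({m for t in tests for m in BENCHMARK_METRICS.get(t, [])})
    print("available metrics: %s" % ", ".join(metrics))
    return metrics

print_available_metrics(list(BENCHMARK_METRICS))
```

With metadata like this in place, the banner Felix asks for is a one-line call at the top of the run, before any benchmark executes.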
Assignee: cpeyer → nobody
Target Milestone: flash10.x - Serrano → Future