Closed Bug 592850 Opened 15 years ago Closed 7 years ago

Performance runtests: add ability to display all metrics that will be used in testrun

Categories: Tamarin Graveyard :: Tools, defect
Priority: Not set
Severity: normal
Tracking: (Not tracked)
Status: RESOLVED WONTFIX
Target Milestone: Future
People: Reporter: cpeyer; Assignee: Unassigned

From Felix: Can we print out something at the very start of the runtests.py run that says "available metrics: ..." and lists the full set of metrics that we will encounter in the run? This is probably not trivial for now, I suppose, since the metric indicator is produced by dynamically running the benchmark itself, rather than coming from some sort of separate metadata. And it's not so bad to make the user parse the generated output by eye to determine what the set of available metrics is... But still, if you think of something real quick, it'd be nice to be alerted up front that some benchmarks are going to generate metric data that I did not think to include in the run.
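If the metrics were ever recorded as separate metadata (as the comment above notes they currently are not), the announcement would be straightforward. A minimal sketch of that idea follows; all names here (BENCHMARK_METRICS, announce_metrics) are hypothetical and do not exist in runtests.py:

```python
# Hypothetical sketch: suppose each benchmark declared its metrics in a
# static metadata table instead of emitting them only while running.
# The harness could then print the full metric set before any test starts.
BENCHMARK_METRICS = {
    "sunspider/crypto-aes": ["time"],
    "v8/deltablue": ["time", "iterations/second"],
    "mmgc/gcbench": ["time", "memory"],
}

def announce_metrics(benchmarks):
    """Print and return the sorted union of metrics the run will produce."""
    metrics = sorted({m for name in benchmarks
                        for m in BENCHMARK_METRICS.get(name, [])})
    print("available metrics: %s" % ", ".join(metrics))
    return metrics
```

This would address the "alerted up front" request: the union is computed from metadata alone, so unexpected metrics show up before any benchmark executes.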
Flags: flashplayer-triage+
Flags: flashplayer-qrb?
Assignee: nobody → cpeyer
Status: NEW → ASSIGNED
Flags: flashplayer-qrb? → flashplayer-qrb+
Target Milestone: --- → flash10.1.x-Salt
Target Milestone: flash10.1.x-Salt → flash10.2.x-Spicy
Target Milestone: flash10.2.x-Spicy → flash10.x - Serrano
Blocks: 607714
We would have to store the metrics associated with each benchmark somewhere. At the moment I don't see an easy fix for this, so I'm going to defer it to Future.
Assignee: cpeyer → nobody
Target Milestone: flash10.x - Serrano → Future
No assignee, updating the status.
Status: ASSIGNED → NEW
Tamarin is a dead project now. Mass WONTFIX.
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → WONTFIX
Tamarin isn't maintained anymore. WONTFIX remaining bugs.