[meta] Review performance benchmarks in automation
Categories
(Testing :: Performance, task)
Tracking
(Not tracked)
People
(Reporter: Bebe, Unassigned)
References
(Blocks 1 open bug)
Details
(Keywords: meta)
Browsertime uses custom scripts (benchmarks) to measure various performance data points.
These scripts are years old and need to be updated.
Develop a process and update/review these benchmarks.
The first step should be to discuss with the performance team and draft a process for reviewing the benchmarks.
Things that come to mind:
Is this benchmark still relevant?
Which platforms are appropriate for the benchmark?
Which metrics/suites within the benchmark should we run?
Should we sheriff the results for the benchmark?
Which team is responsible for the benchmark?
This could be something we review annually.
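The review questions above could be captured as a simple per-benchmark checklist so that each annual pass produces a comparable record. A minimal sketch (hypothetical structure, not an existing tool; the field names are assumptions derived from the questions in this bug):

```python
from dataclasses import dataclass, field

@dataclass
class BenchmarkReview:
    """One annual review entry for a Browsertime benchmark script."""
    name: str
    still_relevant: bool                           # is this benchmark still relevant?
    platforms: list = field(default_factory=list)  # platforms appropriate for the benchmark
    metrics: list = field(default_factory=list)    # metrics/suites within the benchmark to run
    sheriffed: bool = False                        # should we sheriff the results?
    owning_team: str = ""                          # team responsible for the benchmark

# Example: recording a review decision for a hypothetical benchmark
review = BenchmarkReview(
    name="example-benchmark",
    still_relevant=True,
    platforms=["linux", "windows", "android"],
    metrics=["score"],
    sheriffed=True,
    owning_team="Performance",
)
print(review.name, review.sheriffed)
```

Keeping these entries in-tree alongside the scripts would make the "no activity for 12 months" situation described below easier to spot.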
Updated•4 years ago
Comment 1•4 years ago
We should also consider adding benchmarks as part of the review.
Comment 2•3 years ago
The meta keyword is there, the bug doesn't depend on other bugs, and there has been no activity for 12 months.
:davehunt, maybe it's time to close this bug?
Updated•3 years ago