Compare view should allow viewing subtests across all tests
Categories
(Tree Management :: Perfherder, enhancement, P3)
Tracking
(Not tracked)
People
(Reporter: ted, Unassigned)
References
(Blocks 2 open bugs)
Details
Comment 1•8 years ago
Possibly a simple checkbox to "show all subtests"?
Comment 2•6 years ago
:igoldan, it turns out this will be useful for the tp6 page load tests. Could you look into what it would take to support showing all subtests, or even specific subtests, across all tests? Basically, we want to see how the 'loadtime' subtest is impacted by a try run without needing to click into each test.
Comment 3•6 years ago
(In reply to Dave Hunt [:davehunt] [he/him] ⌚️UTC from comment #2)
> :igoldan, it turns out this will be useful for the tp6 page load tests. Could you look into what it would take to support showing all subtests, or even specific subtests, across all tests? Basically, we want to see how the 'loadtime' subtest is impacted by a try run without needing to click into each test.
We first need to land bug 1509216. Once that's done, I'll follow up with an estimate and start working on this feature.
Comment 4•6 years ago
FYI - the performance/summary endpoint is currently designed to query for either tests or subtests, not both (this is for performance reasons). If you want to see subtests for multiple tests, you'd need to modify the parent_signature param here to accept multiple parent signatures (assuming this is performant - otherwise it'll be n queries, one for each test's subtests), and then the query would need to occur after the initial queries on the compare view resolve, in order to get the parent signatures.
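To make the shape of that change concrete, here is a minimal TypeScript sketch of a multi-parent query, assuming the performance/summary endpoint were extended to accept a repeated parent_signature param; the field names and response shape below are assumptions for illustration, not the real API contract.

// Hypothetical sketch only: assumes performance/summary accepts a repeated
// parent_signature param. Field names and response shape are assumptions.
interface PerfSummaryRow {
  signature_id: number;
  name: string;
  parent_signature: number | null;
  values: number[];
}

async function fetchSubtests(
  repository: string,
  parentSignatures: number[],
): Promise<PerfSummaryRow[]> {
  const params = new URLSearchParams({ repository });
  // One request covering all parents, instead of n requests (one per test).
  for (const signature of parentSignatures) {
    params.append('parent_signature', String(signature));
  }
  const response = await fetch(`/api/performance/summary/?${params}`);
  if (!response.ok) {
    throw new Error(`performance/summary request failed: ${response.status}`);
  }
  return response.json();
}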
Comment 5•5 years ago
(In reply to Joel Maher ( :jmaher ) (UTC-4) from comment #1)
> Possibly a simple checkbox to "show all subtests"?
Yes, I like this idea.
> FYI - the performance/summary endpoint is currently designed to query for either tests or subtests, not both (this is for performance reasons). If you want to see subtests for multiple tests, you'd need to modify the parent_signature param here to accept multiple parent signatures (assuming this is performant - otherwise it'll be n queries, one for each test's subtests), and then the query would need to occur after the initial queries on the compare view resolve, in order to get the parent signatures.
Indeed, these are the main aspects to consider when implementing this. Basically, chain some promises that'll request from this API. The first promise will fetch the big tests. Then, if the "show all subtests" checkbox is checked, we'll request from the same API (with different params) in digestible batches until we've gone through all the big tests - maybe 2-3 big tests at a time.
In terms of UX, showing a spinner under each big test will let the user know the UI is still loading.
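A minimal sketch of that batching, reusing the hypothetical fetchSubtests helper from the sketch under comment 4 (the batch size and callback are illustrative):

// Fetch subtests for 2-3 big tests at a time, awaiting each batch before
// starting the next so the API isn't hit with everything at once.
async function loadAllSubtests(
  repository: string,
  parentSignatures: number[],
  onBatch: (rows: PerfSummaryRow[]) => void,
  batchSize = 3,
): Promise<void> {
  for (let i = 0; i < parentSignatures.length; i += batchSize) {
    const batch = parentSignatures.slice(i, i + batchSize);
    const rows = await fetchSubtests(repository, batch);
    // The caller can swap each big test's spinner for its subtest rows here.
    onBatch(rows);
  }
}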
Comment 7•5 years ago
> Indeed, these are the main aspects to consider when implementing this. Basically, chain some promises that'll request from this API. The first promise will fetch the big tests. Then, if the "show all subtests" checkbox is checked, we'll request from the same API (with different params) in digestible batches until we've gone through all the big tests - maybe 2-3 big tests at a time.
> In terms of UX, showing a spinner under each big test will let the user know the UI is still loading.
What would be preferable is for each test to have its own 'show subtests' button, rather than having one checkbox to select all tests. This would reduce the load on the database by only fetching what is explicitly needed.
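As a rough sketch of that approach (again reusing the hypothetical fetchSubtests helper), each button would trigger exactly one query, cached so repeated clicks don't refetch:

// Per-test approach: a test's subtests are fetched only when its own
// 'show subtests' button is clicked, and the promise is cached.
const subtestCache = new Map<number, Promise<PerfSummaryRow[]>>();

function onShowSubtests(
  repository: string,
  parentSignature: number,
): Promise<PerfSummaryRow[]> {
  let pending = subtestCache.get(parentSignature);
  if (!pending) {
    pending = fetchSubtests(repository, [parentSignature]);
    subtestCache.set(parentSignature, pending);
  }
  return pending;
}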
Comment 8•5 years ago
(In reply to Sarah Clements [:sclements] from comment #7)
> What would be preferable is for each test to have its own 'show subtests' button, rather than having one checkbox to select all tests. This would reduce the load on the database by only fetching what is explicitly needed.
One use case for this is to see how a change affects the page load metrics across multiple tests (sites). For example, using this compare view we might want to filter to 'tp6', hide incomparable results, and then show all (or maybe even specific) subtests across all tests.
A performance improvement to the loadtime subtest on amazon might cause a regression for reddit. Whilst the geomean can demonstrate this, our engineers are often focused on specific subtest metrics. By showing these subtests and promoting the compare view, we might reduce the chance of being surprised by regressions to our key metrics.
Comment 9•5 years ago
Would it make sense to implement it like the alerts view, where each test name/header has a 'show all subtests' checkbox and each platform of a test has its own checkbox? That way, subtests would be fetched for all platforms of a test or for selected platforms only.
And rather than trying to show subtests on the main compare view after the user selects 'show subtests' for specific tests, what about having a 'compare subtests' button that takes you to the compare subtests view? This might be cleaner than trying to show subtests on the main compare view page (if that was the original concept) and would eliminate the need to click into specific subtests for each platform/test. So this would essentially be a customizable subtests view.
Comment 10•5 years ago
(In reply to Sarah Clements [:sclements] from comment #9)
> Would it make sense to implement it like the alerts view, where each test name/header has a 'show all subtests' checkbox and each platform of a test has its own checkbox? That way, subtests would be fetched for all platforms of a test or for selected platforms only.
Are you referring to the Graphs view and its 'Include subtests' checkbox? The Alerts view doesn't have a 'show all subtests' feature or anything similar.
> And rather than trying to show subtests on the main compare view after the user selects 'show subtests' for specific tests, what about having a 'compare subtests' button that takes you to the compare subtests view? This might be cleaner than trying to show subtests on the main compare view page (if that was the original concept) and would eliminate the need to click into specific subtests for each platform/test. So this would essentially be a customizable subtests view.
I like the idea of a customizable subtests view. Basically, developers inspecting the Compare view results would need to open at most two views: one for all tests and another for all subtests.
Comment 11•5 years ago
I see that the Compare subtests view (e.g.) already has a 'Show all tests and platforms' hyperlink in the top-left.
It would be great if we added a similar hyperlink named 'Show all associated subtests and platforms' to the Compare tests view (e.g.)!
This will likely require some extra changes to the Compare subtests view, as it currently fetches the subtests for a single parent signature, which limits the results to a single platform (Windows, Linux, OSX or Android). There would also be UI/UX changes.
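To illustrate the shape of that change, such a link would need to carry one parent signature per platform rather than exactly one; the route and parameter names below are assumptions for illustration, not the current URL scheme.

// Illustrative only: route and parameter names are assumptions. The point is
// that the subtests compare view would accept several parent signatures
// (one per platform) instead of a single one.
function buildAllSubtestsLink(
  originalProject: string,
  newProject: string,
  parentSignaturesByPlatform: Map<string, number>,
): string {
  const params = new URLSearchParams({ originalProject, newProject });
  for (const signature of parentSignaturesByPlatform.values()) {
    params.append('parentSignature', String(signature));
  }
  return `#/comparesubtest?${params}`;
}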
Comment 12•5 years ago
(In reply to Ionuț Goldan [:igoldan], Performance Sheriff from comment #10)
> (In reply to Sarah Clements [:sclements] from comment #9)
> > Would it make sense to implement it like the alerts view, where each test name/header has a 'show all subtests' checkbox and each platform of a test has its own checkbox? That way, subtests would be fetched for all platforms of a test or for selected platforms only.
> Are you referring to the Graphs view and its 'Include subtests' checkbox? The Alerts view doesn't have a 'show all subtests' feature or anything similar.
I did mean the Alerts view, but it was poorly worded. :) For any alert summary, you can check that checkbox to select all alerts, or you can select only individual alerts. I think this would be an optimal way to select only the subtests that are needed (in the compare view) to show in a customized subtests view. So then there'd be a button/link named 'Show selected subtests' to take the user to that view.
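A tiny sketch of that selection model (all names illustrative): track the checked signatures in a set, and let the 'Show selected subtests' button hand the selection to the subtests view.

// Track which tests' checkboxes are currently checked.
const selectedSignatures = new Set<number>();

function toggleSelection(signature: number, checked: boolean): void {
  if (checked) selectedSignatures.add(signature);
  else selectedSignatures.delete(signature);
}

// The 'Show selected subtests' button would pass this list to the subtests
// view (e.g. via a link builder like the one sketched under comment 11).
function selectedParentSignatures(): number[] {
  return [...selectedSignatures];
}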
Comment 13•5 years ago
We weren't able to get to this in Q3, and have already planned Q4 work. Ionut, do you think this should take priority over any of the Q4 work we have planned, or should we consider this for 2020?
Comment 14•5 years ago
(In reply to Dave Hunt [:davehunt] [he/him] ⌚BST from comment #13)
> We weren't able to get to this in Q3, and have already planned Q4 work. Ionut, do you think this should take priority over any of the Q4 work we have planned, or should we consider this for 2020?
We should consider this for 2020, as it requires quite a bit of effort to implement.
Comment 15•5 years ago
This request is much older than 18 months, so I'm closing it as INCOMPLETE (but I've recorded it for our future roadmaps).
Comment 16•5 years ago
There has been a lot of interest in this, so I'd prefer to keep it open so we don't lose sight of it.
Comment 17•1 year ago
@beatrice: is this something we'll support in PerfCompare? If not, perhaps we should file a new ticket and close this with a link.
Comment 18•1 year ago
No, this was not previously discussed for PerfCompare. I've created a new ticket for it in Jira: https://mozilla-hub.atlassian.net/browse/PCF-279.
Comment 19•1 year ago
Closing as WONTFIX for Perfherder, but it's being tracked for PerfCompare.