Open Bug 467120 Opened 15 years ago Updated 7 years ago

custom 'avg' calculation based upon test type

Categories: Webtools Graveyard :: Graph Server, defect
Platform: x86 macOS
Priority: Not set
Severity: normal
Tracking: (Not tracked)
Status: REOPENED
People: (Reporter: jrmuizel, Unassigned)

Details

When getArrayStats() in framecycler.html is run on the 'avgs' and 'medians' arrays the maximum value is discarded. I can't see how this would be desired behaviour. This will allow the page that takes the longest to load to get much slower without any change to the reported values.
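A minimal sketch of the behavior being described (the function name and return shape here are hypothetical, not the actual framecycler.html code): if the single maximum is dropped from the aggregate array, the slowest entry can grow arbitrarily without moving the reported average.

```javascript
// Hypothetical illustration of dropping the max before averaging,
// as described above. Not the real getArrayStats() implementation.
function arrayStatsDroppingMax(values) {
  // Remove one occurrence of the maximum value before averaging.
  const max = Math.max(...values);
  const idx = values.indexOf(max);
  const kept = values.slice(0, idx).concat(values.slice(idx + 1));
  const avg = kept.reduce((a, b) => a + b, 0) / kept.length;
  return { max, avg, count: kept.length };
}

// The slowest entry (1000 ms) can double to 2000 ms with no change
// to the reported average, because it is discarded either way.
const before = arrayStatsDroppingMax([100, 120, 110, 1000]);
const after = arrayStatsDroppingMax([100, 120, 110, 2000]);
// before.avg and after.avg are both 110
```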
Moving to correct talos category.
Component: Talos → Release Engineering: Talos
Product: Testing → mozilla.org
Version: unspecified → other
It doesn't look like this will get worked on soon - moving to Future.
Component: Release Engineering: Talos → Release Engineering: Future
QA Contact: talos → release
I believe that removing the max value is the desired behavior: it ignores outliers and gives a smoother, more reliable result.

This has always been the metric that we've used and it's proven to reliably find performance regressions.
Status: NEW → RESOLVED
Closed: 15 years ago
Resolution: --- → WONTFIX
I think Jeff's point is that it's not removing the outliers we think it is.  Whether or not removing outliers is a good statistical practice, I think we'd agree that what we'd like the code to be calculating is:

Page A - remove highest data point, the remaining N-1 feed into the test score
Page B - remove highest data point, the remaining N-1 feed into the test score
...

Whereas I think what he's saying is that we currently call getArrayStats() on the aggregate "avgs" and "medians" arrays, not on the data for individual pages. Done that way, instead of discarding aberrant data points from every page, we discard the slowest page from the test entirely.

That is not desirable since it means we have no ability to detect regressions in that slowest page (and presumably we want to do that, otherwise it shouldn't be in the page set).  Moreover, discarding outliers on a per-page basis is likely to smooth out our data much more, since it means discarding ~400 outliers (one per page) instead of 10 (all runs for the highest page), while at the same time keeping more fidelity to the data set, since every page will be represented in the number, instead of one page being dropped.
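The two aggregation strategies above can be sketched as follows. This is an illustrative model, not the graph server's actual code; `pages` (page name mapped to an array of run times), `perPageScore`, and `aggregateScore` are hypothetical names.

```javascript
// Drop the single highest value from a list of runs.
function dropMax(runs) {
  const sorted = [...runs].sort((a, b) => a - b);
  return sorted.slice(0, sorted.length - 1);
}

const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

// Desired: drop the highest run *per page*, then average the
// per-page means, so every page contributes to the score.
function perPageScore(pages) {
  const perPage = Object.values(pages).map((runs) => mean(dropMax(runs)));
  return mean(perPage);
}

// Current (as described): pool the per-page averages, then drop the
// highest entry, which removes the slowest *page*, not an outlier *run*.
function aggregateScore(pages) {
  const avgs = Object.values(pages).map(mean);
  return mean(dropMax(avgs));
}
```

With three pages where page c is the slowest, `aggregateScore` is unchanged even if c's times double, while `perPageScore` moves, which is the regression-detection difference described above.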

[There is a separate question about whether we want to change the design to not throw away outliers at all, but that probably belongs in a newsgroup rather than this bug, since it's a design change rather than a bug. Either way, I agree with Jeff that the discarding we're doing is not the discarding we should be doing.]
Status: RESOLVED → REOPENED
Resolution: WONTFIX → ---
I think that I get your point here.  The code that discards the outlier value is in the graph server, so I send it 20 individual Ts numbers and it throws away the outlier and then calculates the Ts number for that run.  It does the same with Tp results, where I send it all the medians for all the pages loaded, it throws away the highest and then calculates the Tp number for that run.

So this comes down to having a custom calculation per test type in the graph server: it makes sense to throw away a high individual Ts value, but not a high individual Tp pageload time.
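A per-test-type calculation like the one suggested here could be dispatched on the test name. This is a hedged sketch under that assumption; `testScore` and its signature are hypothetical, not the graph server's actual API.

```javascript
// Hypothetical per-test-type scoring, as the comment above suggests.
function testScore(testType, values) {
  const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;
  switch (testType) {
    case "Ts": {
      // Startup times: drop the single highest run as an outlier.
      const kept = [...values].sort((a, b) => a - b).slice(0, -1);
      return mean(kept);
    }
    case "Tp":
      // Pageload medians: keep every page so a regression in the
      // slowest page stays visible in the reported number.
      return mean(values);
    default:
      throw new Error("unknown test type: " + testType);
  }
}
```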
Component: Release Engineering: Future → Graph Server
Product: mozilla.org → Webtools
QA Contact: release → graph.server
Summary: Talos discards the results from the page load that took the longest → custom 'avg' calculation based upon test type
Product: Webtools → Webtools Graveyard