Closed Bug 419105 Opened 17 years ago Closed 16 years ago

Create mini talos run

Categories

(Release Engineering :: General, defect, P3)

x86
macOS
defect

Tracking

(Not tracked)

RESOLVED WONTFIX

People

(Reporter: mikeal, Unassigned)

References

Details

Task to create a shorter talos run that can still find the maximum number of regressions in the shortest total run time. This task requires:
- Find the "most interesting" pages from the existing talos runs
- Determine the optimum number of pages to satisfy usefulness and speed requirements
- Run the tests in isolation for a time to determine whether they are in fact effective
Priority: -- → P3
This looks like the best 10 to start with: www.washingtonpost.com www.pconline.com.cn www.mobile.de www.wretch.cc cn.msn.com www.wangyou.com www.wikipedia.org www.invisionfree.com www.tianya.cn www.qihoo.com
(In reply to comment #1)
> This looks like the best 10 to start with:
Looks like a nice list: multiple languages, multiple layout styles. But I wonder if you could give any insight into what made these the 10 best? Was it purely statistical, or were there explicit criteria?
These were based on the largest variances across all platforms in the data from graphs.mozilla.org.
To be more specific, I pulled info out of the graph server into a local db schema that was significantly richer. First I took just the mini hardware data and did the following:
1. Calculated the average time for all runs per page.
2. Took the highest 100 and lowest 100 results for each page, averaged them, and stored the differential between that number and the overall average.
3. Looked through the biggest differentials on the high and low end, and took 7 of the 10 sites from that list.
Then I took the rest of the tp_loadtime data out of the graph server, not just the mini hardware, and computed the same as before, but per hardware type. I sorted through all that data and picked out the 3 sites that were showing up the most but weren't showing up in the mini-only data. We may increase the number of sites in this list; there is plenty of data if we ever want to pull more. I just want to see how long these 10 take to run and see what kind of stuff they're catching first.
typo: step 1 should read "Calculated the average time for all runs per page."
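A rough sketch of the selection arithmetic described above, for anyone coming back to this later. The graph-server schema and page names here are hypothetical; only the arithmetic follows the comment: per-page average, highest/lowest-100 averages, and the differential between those extremes and the overall mean.

```python
# Hypothetical sketch of the ranking in comment 3; data shapes are assumed,
# not the actual graph-server schema.
from statistics import mean

def differentials(times, k=100):
    """Return (high-end, low-end) differentials for one page's load times:
    how far the mean of the k slowest / k fastest runs sits from the
    overall per-page average."""
    s = sorted(times)
    overall = mean(s)
    return mean(s[-k:]) - overall, overall - mean(s[:k])

def rank_pages(results, k=100, top_n=10):
    """results maps page -> list of tp_loadtime samples across runs.
    Pages with the biggest differential on either end rank first,
    i.e. the pages most sensitive to (or noisiest under) change."""
    scored = {page: max(differentials(t, min(k, len(t))))
              for page, t in results.items() if t}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]
```

A page with a tight, stable distribution scores near zero on both ends, while a page whose extremes sit far from its mean floats to the top of the ranking.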
Priority: P3 → P2
Depends on: 421385
Component: Testing → Release Engineering
Product: Core → mozilla.org
Version: unspecified → other
QA Contact: testing → release
Priority: P2 → P3
For reference, the _next_ 20 "most interesting" sites would be: www.asahi.com www.nikkansports.com www.verycd.com www19.dantri.com.vn www.nikkei.co.jp www.pcpop.com www.jrj.com.cn www.terra.com.br www.expedia.com www.yandex.ru www.typepad.com www.webs-tv.net www.cj.com www.kooora.com www.weather.com tag.seesaa.jp www.it168.com www.265.com www.clarin.com www.qihoo.com
I've uploaded a tarball of the mess of code I wrote to get these sites, in case we ever need to come back to it. It's in my user directory on the office fileserver.
Assignee: mrogers → nobody
Component: Release Engineering: Talos → Release Engineering: Future
Instead of working on mini talos, I've constructed a new 100-page set (bug 473821) that will give us good perf data and run in far less time than the 400-page set currently in use.
Status: NEW → RESOLVED
Closed: 16 years ago
Resolution: --- → WONTFIX
Moving closed Future bugs into Release Engineering in preparation for removing the Future component.
Component: Release Engineering: Future → Release Engineering
Product: mozilla.org → Release Engineering