Closed Bug 1503990 Opened 6 years ago Closed 6 years ago

add additional pages to tp6

Categories: Testing :: Raptor, enhancement
Version: 3
Status: RESOLVED FIXED
Tracking: firefox65 --- fixed
Target Milestone: mozilla65
People: (Reporter: jmaher, Assigned: Bebe)
Attachments: (1 file)

In working towards the top 25 sites we want to track with tp6 (potentially more than 25 in the future), there are some sites worth measuring that do not require login and that we are not already measuring:

https://en.wikipedia.org/wiki/Barack_Obama
https://ca.news.yahoo.com/adopted-great-dane-teaches-puppy-000002643.html
https://www.reddit.com/r/technology/comments/9sqwyh/we_posed_as_100_senators_to_run_ads_on_facebook/
https://www.twitch.tv/videos/326804629
https://yandex.ru/search/?text=barack%20obama&lr=10115
www.bing.com/search?q=barack+obama
https://www.microsoft.com/en-us/windows/get-windows-10
http://fandom.wikia.com/articles/fallout-76-will-live-and-die-on-the-creativity-of-its-playerbase
https://www.vice.com/en_us/article/j53a8d/four-college-freshmen-photograph-their-first-semester-v25n3
https://www.imdb.com/title/tt0084967/?ref_=nv_sr_2
https://imgur.com/gallery/m5tYJL6
https://www.apple.com/macbook-pro/

for these pages we want to measure:
measure = fnbpaint, dcf, ttfi
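For context, each tp6 page is defined as an entry in a Raptor test manifest. A hedged sketch of what an entry for one of the new pages might look like (the section name and exact keys are assumptions based on the measurements listed above, not copied from the real raptor.ini):

```ini
; Hypothetical raptor.ini entry for one of the new pages; key names
; are assumed, only the measure list comes from this bug.
[raptor-tp6-wikipedia-firefox]
apps = firefox
test_url = https://en.wikipedia.org/wiki/Barack_Obama
playback_recordings = wikipedia.mp
measure = fnbpaint, dcf, ttfi
```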

Once we get this up, we will either do additional pages that we will have login information for, or add hero elements.  As it stands, hero element is still useful as we can define a point we want to measure until.  For the initial work, we do not have hero element defined and we can revisit these sites once defined or start proposing locations for the hero element.
:bebe, if you can pick this up next week that would be excellent
Flags: needinfo?(bebe)
Blocks: 1473078
I would suggest splitting these new pages up into at least two new raptor tests, 'tp6-2' and 'tp6-3'. This way we keep the overall job duration reasonable. The current 'tp6' job ranges from approx 20-24 min (longest on Win) and that's with the 4 existing pages.
maybe we can do:
tp-search (google, yandex, bing)
tp-news (yahoo, vice, wikipedia)
tp-docs (gdocs, gsheets, gslides)
tp-social (facebook, reddit, twitch)
tp-shop (amazon, apple, microsoft)
tp-media (imgur, youtube, imdb, wikia)


that would incorporate our existing runs; in fact these could be in a tp group TP-e10s(search, news, docs, social, shop, media) on treeherder :)
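The proposed bucketing could be written down as a simple mapping (illustrative only; this is not an actual Raptor data structure, just the grouping from the list above):

```python
# Proposed tp6 buckets from this bug, expressed as a plain dict.
TP6_BUCKETS = {
    "tp-search": ["google", "yandex", "bing"],
    "tp-news":   ["yahoo", "vice", "wikipedia"],
    "tp-docs":   ["gdocs", "gsheets", "gslides"],
    "tp-social": ["facebook", "reddit", "twitch"],
    "tp-shop":   ["amazon", "apple", "microsoft"],
    "tp-media":  ["imgur", "youtube", "imdb", "wikia"],
}
```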

:sphilp, any thoughts on how to bucket these into logical smaller runtime chunks?
Flags: needinfo?(sphilp)
:rwood
I created this script [1] to generate the .mp file and save the website. The script has an automated side and a manual side.
The steps are:
  1. Setup the proxy for recording             (Automated)
  2. Open the website                          (Automated)
  3. Save the .mp file                         (Automated)
  4. Prompt to disable internet connection     (Manual)
  5. Setup the proxy for replay                (Automated)
  6. Open website                              (Automated)
  7. Prompt to save the website                (Manual)
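The automated steps above boil down to two mitmdump invocations. A minimal sketch (the helper names are hypothetical; only the mitmdump flags themselves — `-w` to write a recording, `-s` to run a playback script, passed as a single quoted value as in the playback command quoted later in this bug — are real):

```python
# Hypothetical helpers that build the mitmdump command lines used by a
# record/replay script like the one referenced in [1].

def mitmdump_record_cmd(mp_file):
    # Step 1: record proxied traffic into a .mp file.
    return ["mitmdump", "-w", mp_file]

def mitmdump_replay_cmd(replay_script, mp_file):
    # Step 5: replay a recording; mitmdump takes the script and its
    # argument as a single quoted -s value.
    return ["mitmdump", "-s", "{} {}".format(replay_script, mp_file)]
```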

We can update the script to do auto login and other features that are requested.
Also, I think we can update the script to add #hero elements in the recorded .mp files.

Playback with network connection works as expected.

My issue with the script is that I get an error when replaying the .mp files with no network connection.
We see an error [2] with files recorded with my script and with already-recorded .mp files from the repo. I tried both scripted and manual playback setup; in both cases I used the suggested playback script. I also see this error when running the Raptor tests with no network connection.

Can you suggest any further steps to continue?


[1] https://github.com/bebef1987/dump_selenium_proxy
[2] https://www.screencast.com/t/G5X77fqONh
Flags: needinfo?(bebe) → needinfo?(rwood)
FYI, I see the same error (502 Bad Gateway) when trying this manually outside of Raptor (start mitmdump and playback a tp6 recording .mp; start Firefox nightly with the mitmproxy CA cert installed; browse to the recording URL). I have done this in the past and it worked (with the same version of mitmdump).
Flags: needinfo?(rwood)
This implies a Firefox issue? Should we check whether it works for us in Google Chrome, then bisect/debug Firefox?
Yep, just tried it in Google Chrome: started mitmdump playing back the amazon tp6 recording .mp; started up Chrome; turned on proxy settings in Chrome and applied; browsed to 'mitm.it' and installed the mitmproxy CA cert (following the instructions); turned off my local WiFi connection; browsed to the tp6 amazon recording URL; and the page loaded fine in Chrome.
(In reply to Robert Wood [:rwood] from comment #7)
> Yep just tried it in Google Chrome - start mitmdump playing back the amazon
> tp6 recording .mp; startup up Chrome; turned on proxy settings in Chrome and
> applied; browsed to 'mitm.it' and installed the mitmproxy CA cert (follow
> instructions); turned off my local WiFi connection; browsed to the tp6
> amazon recording URL; and the page loaded fine in chrome.

UGH wait, sorry, I had an ethernet cable connected to my macbook and didn't realize, and I DO see the same Bad Gateway error in Chrome when running Raptor (turning off WiFi at the 30-second post-browser-startup pause). Maybe when I tried it before with Firefox I also had an ethernet cable connected by mistake.
I'm not sure why mitmproxy requires an internet connection to play back recordings, but I don't think testing playback without an internet connection is required. With WiFi enabled you can still test recordings.

If you turn on the proxy in Firefox (and have the mitmproxy CA cert installed already) but don't start playback, and attempt to browse to the recording URL i.e. for amazon [1], you'll get the 'proxy server is refusing connections' error as expected.

If you then start playback, i.e. [2], and refresh the page / browse to the same recorded URL, you'll see the page load. In the terminal you'll see a bunch of "found exact replay match" messages for the various page elements.

If you play back our existing recordings this way, and compare with the same URL but without the proxy and mitmdump playback, you'll see the recorded page content vs. the current live page.
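During a healthy replay session, mitmdump prints a "found exact replay match" line per served page element. A small helper like this (assumed, not part of Raptor or mitmproxy) could sanity-check a captured mitmdump log:

```python
def count_replay_matches(log_text):
    # Count how many requests were served from the recording, based on
    # mitmdump's "found exact replay match" log lines.
    return sum(1 for line in log_text.splitlines()
               if "found exact replay match" in line)
```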

[1] https://www.amazon.com/s/url=search-alias%3Daps&field-keywords=laptop

[2] ./mitmdump -s " /Users/rwood/mozilla-unified/testing/raptor/raptor/playback/alternate-server-replay.py /Users/rwood/mozilla-unified/obj-x86_64-apple-darwin17.7.0/testing/raptor/amazon.mp"
(In reply to Joel Maher ( :jmaher ) (UTC-4) from comment #3)
> maybe we can do:
> tp-search (google, yandex, bing)
> tp-news (yahoo, vice, wikipedia)
> tp-docs (gdocs, gsheets, gslides)
> tp-social (facebook, reddit, twitch)
> tp-shop (amazon, apple, microsoft)
> tp-media (imgur, youtube, imdb, wikia)
> 
> 
> that would incorporate our existing runs, in fact these could be a in tp
> group TP-e10s(search, news, docs, social, shop, media) on treeherder :)
> 

Personally I like the idea of just having numbers i.e. tp6-1, tp6-2, etc. as that leaves it more open when adding new pages, or moving pages from one test job to another, swapping a page out for a different one, etc.

However if everyone else prefers categories no worries, but I would suggest they be numbered like tp6-search-1, etc. because at some point we will reach the job time limit and need a second test job for the same category I would expect. :) I do see the category advantage as a bit easier to find which page is in which job, although we could also just be sure to update the raptor wiki test list and list each page in each test job too.
That is a valid point. The numbers make sense; I find they can be confusing, but my suggestion isn't scalable.
These are just job names? I'm fine with tp6-1, tp6-2 etc. They are not going to move around like chunks tho I take it? As in, it will be consistent sites per group? As long as we have that documented I think it's fine
Flags: needinfo?(sphilp)
(In reply to Stuart Philp :sphilp from comment #13)
> These are just job names? I'm fine with tp6-1, tp6-2 etc. They are not going
> to move around like chunks tho I take it? As in, it will be consistent sites
> per group? As long as we have that documented I think it's fine

Ok, thanks Stuart. Yes that's correct - the sites will be consistent in each group.
Assignee: nobody → bebe
Blocks: 1505521
See Also: → 1505526
I have filed Bug 1505526 to rename the existing raptor tp6 and gdocs tests to follow the new naming convention (tp6-X). It will be easier if this work is done in a separate bug, then :bebe can focus on adding the new pagesets/jobs.

FYI :bebe, as noted in the phab review comments, please start your new tp6-* suites at tp6-3. Thanks! :)
Flags: needinfo?(bebe)
Blocks: 1505788
For an example of adding taskcluster configs for Raptor, see Bug 1505526.
(In reply to Robert Wood [:rwood] from comment #16)
> For an example of adding taskcluster configs for Raptor, see Bug 1505526.

Ignore that comment wrong bug (meant that for Bug 1505521)
Depends on: 1507436
Just fyi, filed github issue for the addition of these new test suites to the perf dashboard:

https://github.com/mozilla-frontend-infra/firefox-health-dashboard/issues/204
Pushed by rwood@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/36aafe507187
add additional pages to tp6 r=rwood
https://hg.mozilla.org/mozilla-central/rev/36aafe507187
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla65
Remove :needinfo
Flags: needinfo?(fstrugariu)
