Bug 1178283 - add a new updated page for testing mobile performance
Opened 9 years ago • Closed 9 years ago
Component: Testing Graveyard :: Autophone (defect)
Tracking: Not tracked
Status: RESOLVED FIXED
Reporter: jmaher • Assignee: jmaher
Attachments: 1 file
Currently we have cnn.com from 2012 to test robocop tcheckerboard2. In moving to autophone we need to update a few things, and we might as well update our test page. The Alexa top-sites list has yahoo.com at #5 and amazon.com at #6. Maybe we should take a snapshot there?
Assignee
Comment 1•9 years ago
I understand eideticker uses the same cnn.com, so we should think about both tests. snorp, blassey, any thoughts?
Flags: needinfo?(snorp)
Flags: needinfo?(blassey.bugs)
Comment 2•9 years ago
Isn't there a list of sites that the '60 fps' initiative is using? Where is that list? We should probably use facebook or twitter.
Flags: needinfo?(snorp)
Comment 3•9 years ago
As I recall, tcheckerboard2 does a lot of panning. We should make sure that the new content is sufficiently "tall" to exercise that.
Assignee
Comment 4•9 years ago
My only hiccup with facebook/twitter is that they would need an account, vs. a live static page. Open to ideas.
Comment 5•9 years ago
Naveed, do you know what page(s) we're targeting for 60fps?
Flags: needinfo?(blassey.bugs) → needinfo?(nihsanullah)
Assignee
Comment 6•9 years ago
tcheck2 is mostly panning and zooming; I assume that will serve a dual purpose with 60fps. When I see 60fps, it implies some kind of moving graphics - this test is currently a static page.
Assignee
Updated•9 years ago
Assignee: nobody → jmaher
Comment 7•9 years ago
https://wiki.mozilla.org/Platform/60fps#Sites_we_care_about

The first three are the same as the Firefox Content Performance Program.

Facebook uses President Obama's public profile for their own internal perf testing, so it makes sense for us to start there.
Flags: needinfo?(nihsanullah)
Assignee
Comment 8•9 years ago
Pulling this page down locally yields a single file. I do:

wget --user-agent=Firefox -p -k -e robots=off http://www.facebook.com/barackobama

and get:

ls -la www.facebook.com/
total 640
drwxrwxr-x 2 jmaher jmaher   4096 Jul  6 19:04 .
drwxrwxr-x 3 jmaher jmaher   4096 Jul  6 19:04 ..
-rw-rw-r-- 1 jmaher jmaher 643878 Jul  6 19:04 barackobama

This might be fine, but I think the use case of many graphics, files, etc. would be more useful. Loading it locally yields a handful of requests to facebook; maybe we could pull those images and css files down locally and fix up the file. Does this sound right?
Flags: needinfo?(wlachance)
Comment 9•9 years ago
(In reply to Joel Maher (:jmaher) from comment #8)
> pulling this page down locally yields a single file. I do:
> wget --user-agent=Firefox -p -k -e robots=off
> http://www.facebook.com/barackobama
>
> ...
> this might be fine, but I think the use case of many graphics, files, etc.
> would be more useful. Loading it locally yields a handful of requests to
> facebook, maybe we could pull those images and css files down locally and
> fix up the file.
>
> Does this sound right?

That's what I would do, though I wonder if better approaches are possible. Also, I don't want to duplicate work that others are doing elsewhere. Avi, are you pulling down a static copy of the site for the content performance program?
Flags: needinfo?(wlachance) → needinfo?(avihpit)
Comment 10•9 years ago
(In reply to Joel Maher (:jmaher) from comment #6)
> tcheck2 is mostly panning and zooming, I assume that will serve a dual
> purpose with 60fps. When I see 60fps, it implies some kind of moving
> graphics- this test is currently a static page.

It doesn't necessarily mean that stuff on the page is animating - we're _mostly_ interested in scrolling/panning performance, so we only have to make sure the page is long enough* for our scroll test.

* enough - depending on how much the tcheckerboard2 test scrolls/pans, and possibly other tests where we want to use these pages.

(In reply to Naveed Ihsanullah [:naveed] from comment #7)
> https://wiki.mozilla.org/Platform/60fps#Sites_we_care_about
>
> The first three are the same as the Firefox Content Performance Program.
>
> Facebook uses President Obama's public profile for their own internal perf
> testing so it makes sense for us to start there.

Sounds good to me.

(In reply to William Lachance (:wlach) from comment #9)
> (In reply to Joel Maher (:jmaher) from comment #8)
> > pulling this page down locally yields a single file. I do:
> > wget --user-agent=Firefox -p -k -e robots=off
> > http://www.facebook.com/barackobama
> >
> > ...
> > this might be fine, but I think the use case of many graphics, files, etc.
> > would be more useful. Loading it locally yields a handful of requests to
> > facebook, maybe we could pull those images and css files down locally and
> > fix up the file.
> >
> > Does this sound right?
>
> That's what I would do, though I wonder if better approaches are possible.
> Also, I don't want to duplicate work that others are doing elsewhere. Avi,
> are you pulling down a static copy of the site for the content performance
> program?

We still don't have a reference set of pages. We've been experimenting with some online facebook and twitter pages.

As for how to grab the page, I'd try the following:
- Browse the page using Firefox.
- Scroll down a bit.
- "Save As" -> "Complete page".

And then make sure that the captured content is enough for the tests we want it for - while network access is blocked. Maybe then modify the captured content somehow such that it never tries to contact any servers - such that we can also run the test on systems where we do have network access - but still get the same results as in environments where there's no network (e.g. talos). If that doesn't work, we'd need to rethink it.

Overall though, the capture method doesn't matter much as long as we end up with reasonably good content, and we only need to do the capturing once.
Flags: needinfo?(avihpit)
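One rough way to check the "never tries to contact any servers" property described above is to list every absolute URL still referenced in the capture. A sketch (the directory name is illustrative, not from the bug):

```shell
# Offline-safety audit for a captured page: print every absolute
# http(s) URL still referenced anywhere in the captured files.
# An offline-safe snapshot should print nothing.
# "captured-page/" is a placeholder for wherever the capture lives.
grep -rEoh 'https?://[^"<> )]+' captured-page/ | sort -u
```

Anything this prints is a resource the page could still try to fetch at run time (modulo URLs that only appear in visible text, which need a manual look).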
Assignee
Comment 11•9 years ago
File -> Save As doesn't really do a good job; I will use wget and modify as needed to avoid network access.
On a side note, I wish we had a better way to capture (and replay) pages. Putting all resources under one host and rewriting all the html can really change the performance characteristics.
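A minimal sketch of the kind of "modify as needed" rewrite meant here; the host patterns and file name are illustrative assumptions, not the actual patch that landed:

```shell
# Hypothetical post-wget fix-up: rewrite absolute facebook/CDN URLs to
# local relative paths so the captured page never hits the network.
# Host patterns and the file name are assumptions for illustration.
sed -i.orig -E \
    -e 's|https?://www\.facebook\.com/|./|g' \
    -e 's|https?://[^/"]+\.fbcdn\.net/|./|g' \
    www.facebook.com/barackobama
```

Note that -i.orig leaves a *.orig backup next to each edited file, which is one way stray files like the barackobama.orig mentioned in comment 14 can appear; such backups shouldn't be committed.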
Assignee
Comment 13•9 years ago
We don't have a lot of concrete direction here, but I have a deliverable to get tcheck2 onto autophone, and this is the first prerequisite in a long chain. It seems that this page (facebook.com/barackobama) is useful. In the future, if we want to create a server for async transactions or other network shaping, that could be done by interested parties and we could look into incorporating it into our workflow. For tier 1 jobs that has a high bar, as we have no external network access.
Attachment #8631658 - Flags: review?(wlachance)
Comment 14•9 years ago
Comment on attachment 8631658 [details]
ls -laR ep1/facebook.com
What's with the barackobama.orig files? I assume you don't want to add that.
Other than that, I'm fine with landing this.
Attachment #8631658 - Flags: review?(wlachance) → review+
Assignee
Comment 15•9 years ago
added to git: https://git.mozilla.org/?p=automation/ep1.git;a=commit;h=ac46fa13b2428978af1ac886242c3703fddfd0e1
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Updated•2 years ago
Product: Testing → Testing Graveyard