[meta] Add DAMP coverage for pages with many frames
Categories
(DevTools :: General, enhancement)
Tracking
(Fission Milestone:Future)
People
(Reporter: jryans, Unassigned)
References
(Depends on 2 open bugs, Blocks 2 open bugs)
Details
(Keywords: meta, Whiteboard: fission-tests dt-fission-future)
Attachments
(5 obsolete files)
Comment 1•6 years ago
Should this test be distinct so we can track if our performance with fission frames regresses?
Comment 2•6 years ago
That's a good discussion to have.
If we want to keep DAMP runtime short, we could adapt existing tests to cover testing against an iframe: logging to the console from an iframe, setting a breakpoint within an iframe, etc.
At the same time, maybe we want to track both codepaths?
Could we possibly regress only the non-iframe codepath and miss it if we test all features over iframes?
Could we, for example, miss an optimization made on the non-iframe codepath if the iframe overhead is significant, more significant than the non-iframe baseline?
Otherwise, the issue with having distinct tests is that we may want to test all the features we currently test, but against iframes.
Finally, the worst would be to have no coverage at all, so we may want to be opportunistic about what we do here.
Comment 3•6 years ago
I think it makes sense to introduce iframes to the existing tests. It will be the reality going forward. I also think we might want an iframe stress test that is independent of the other tests, so if we regress heavily on something we can identify it.
For now I will do the first thing, which is introducing iframes and testing things within the iframes; then we can go from there.
Comment 4•6 years ago
As discussed on Slack with :yulia, I will take this one.
Comment 5•6 years ago
Ideally we would use iframes pointing to different ports/domains in our tests, but the talos server does not seem as powerful as the one we have for mochitests. For now the only URL able to serve talos test pages is http://127.0.0.1:port (port is dynamically picked depending on availability). I can spawn a second server, but the port being "random" means the test pages would also have to be generated dynamically to use the right port.
Nika, do you have any suggestions on how to test remote iframes without using different domains?
One option was to use data-url iframes. We know that right now they are not running in different processes. But if they do in the future (i.e., before Fission ships), then it might be good enough for us.
:yulia also mentioned an API to make iframes "remote", even if they are from the same domain?
Do you think one of those solutions is reliable enough for performance tests? Any other option to suggest? Or should we just find a way to have different domains? Thanks :)
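To make the data-URL option concrete, here is a minimal sketch of how a test page could embed such an iframe without needing a second server (the markup and inserted content are made up for illustration, not an existing DAMP page):

```js
// Hypothetical DAMP test page snippet: embed child content through a data: URI
// so that no second server/port is required. Note that data: URI iframes
// currently stay in the same process (see the discussion below).
const frame = document.createElement("iframe");
frame.src =
  "data:text/html;charset=utf-8," +
  encodeURIComponent("<h1>Child document</h1><p>Content rendered by the iframe</p>");
document.body.appendChild(frame);
```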
Comment 6•6 years ago
(In reply to Julian Descottes [:jdescottes] from comment #5)
> Ideally we would use iframes pointing to different ports/domains in our tests, but the talos server does not seem as powerful as the one we have for mochitests. For now the only URL able to serve talos test pages is http://127.0.0.1:port (port is dynamically picked depending on availability). I can spawn a second server, but the port being "random" means the test pages would also have to be generated dynamically to use the right port.
I believe that with mochitests we also only use one server, but we use some host configuration and other work to make that one server appear as multiple different servers. It would be good to add that capability into talos tests.
FYI, using different ports here would not be sufficient, as the port is unfortunately ignored when using document.domain. For this reason, multiple pages from different ports are considered "same-site" (though not "same-origin"), and have to be loaded in the same process.
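To illustrate the same-site vs. same-origin distinction with hypothetical ports:

```js
// Illustration only (ports are hypothetical): a different port means a
// different origin, but the port is ignored when comparing sites, so both
// URLs belong to the same site and must be loaded in the same process.
console.log(new URL("http://127.0.0.1:4000/page.html").origin); // "http://127.0.0.1:4000"
console.log(new URL("http://127.0.0.1:5000/page.html").origin); // "http://127.0.0.1:5000"
// Site comparison (scheme + host, no port): identical for both URLs.
```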
> Nika, do you have any suggestions on how to test remote iframes without using different domains?
> One option was to use data-url iframes. We know that right now they are not running in different processes. But if they do in the future (i.e., before Fission ships), then it might be good enough for us.
As you mention, we don't currently try to run data URIs in a different process. The change to start doing work like that would likely be part of bug https://bugzilla.mozilla.org/show_bug.cgi?id=1575120, among others. In the future, it is possible that we won't always run data-url iframes in a different process, but doing so is likely.
> :yulia also mentioned an API to make iframes "remote", even if they are from the same domain?
That API was super buggy and, unfortunately, used a completely separate set of codepaths. It ended up not being a good solution for testing Fission, and was removed in bug 1562029. I don't have any plans to add a new API like this: I'm not sure it could be implemented consistently and correctly without adding a lot of complexity to the process selection logic, and it still wouldn't test the actual codepaths we use in production.
> Do you think one of those solutions is reliable enough for performance tests? Any other option to suggest? Or should we just find a way to have different domains? Thanks :)
I think that the most reliable solution for performance tests would be to introduce a mechanism for different domains. Even without Fission, testing with a single domain may produce inaccurate results for metrics like page load of real webpages: real webpages can contain multiple cross-origin iframes (such as adverts), and interactions with those have different performance and behaviour characteristics. For example, a cross-origin iframe will create a different JS compartment, and accesses across that boundary are often slower.
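As a rough illustration of that last point (the frame IDs here are a hypothetical page setup, not part of any DAMP test):

```js
// Same-origin vs. cross-origin iframes as seen from the embedding page.
const sameOriginWin = document.getElementById("same-origin-frame").contentWindow;
const crossOriginWin = document.getElementById("cross-origin-frame").contentWindow;

// Same-origin: the embedder can reach into the child document directly.
console.log(sameOriginWin.document.title);

// Cross-origin: only the restricted cross-origin window API is reachable, and
// with Fission the frame may live in a different process entirely.
crossOriginWin.postMessage("ping", "*");
// crossOriginWin.document would throw a SecurityError here.
```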
Comment 7•6 years ago
Thanks a lot for the detailed answer!
We filed Bug 1583914 to see if we can get a similar setup in talos tests as the one we have for mochitests.
In the meantime, I will start updating our tests to use iframes from the same domain.
It will already require a bit of work, as our tests make several assumptions that don't hold anymore as soon as you introduce iframes.
We are also using the message manager + loadFrameScript APIs to run test code in content. I will add some abstraction to avoid using those APIs directly from tests; we can then replace the implementation with something else (either custom JSWindowActors or the SpecialPowers API, if possible).
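A minimal sketch of what that abstraction could look like, assuming the current message-manager backend (the helper and message names are made up; the real DAMP helper may end up different):

```js
// Hypothetical helper: run a script string in the content process of a tab and
// resolve with whatever the script reports back. The injected frame script is
// expected to call sendAsyncMessage("damp-test:result", value) when done.
function evalInContent(tab, scriptSource) {
  return new Promise(resolve => {
    const mm = tab.linkedBrowser.messageManager;
    mm.addMessageListener("damp-test:result", function onResult(msg) {
      mm.removeMessageListener("damp-test:result", onResult);
      resolve(msg.data);
    });
    mm.loadFrameScript(
      "data:application/javascript," + encodeURIComponent(scriptSource),
      false
    );
  });
}
```

Tests would only ever call evalInContent, so swapping the body for a JSWindowActor- or SpecialPowers-based implementation later would not require touching individual tests.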
> FYI, using different ports here would not be sufficient, as the port is unfortunately ignored when using document.domain.
Thanks for the info! It totally makes sense; I didn't realize that same-origin vs. same-site was an important distinction to keep in mind for Fission.
Comment 8•6 years ago
Here is what I propose to implement in the scope of this bug:
- simple.* tests are not changing, to keep a consistent baseline
- all custom.* tests should use an intermediary iframe (for now same domain)
- all complicated.* tests should use an intermediary iframe (same domain)
- all other tests that rely on mm/loadFrameScript to load test code should instead use an API to run code against a given frame/browsingContext
We will need to make sure that our reloadPage helper is still accurately measuring reload time even with the iframe setup.
Most tests seem ok:
- debugger waits for expected sources, should still be accurate
- inspector waits for "new-root" + "inspector-updated", should still be accurate
- webconsole waits for N messages to be displayed, should still be accurate
- netmonitor is not waiting for anything in the reload test, but waits for N requests to finish in parallel
But some might need additional work:
- styleeditor is not waiting for anything
- the panels-in-background test is not waiting for anything (it waits for requests before resuming, but this is not tracked)
Edit: After running a few DAMP comparisons with work-in-progress patches, it might be complicated to keep the inspector custom & complicated tests relevant if we move all the content to iframes. A big part of the test here is "how much time does it take to open/reload the inspector if the markup view has a lot of content to render?". As soon as you introduce an iframe wrapper, the default markup view becomes much simpler, so we might have to create a dedicated test exercising iframes just for the inspector. See early results at https://treeherder.mozilla.org/perf.html#/comparesubtest?originalProject=try&newProject=try&newRevision=cb40b541374bfc2adade842ee64fbabf88b3abfa&originalSignature=1759151&newSignature=1759151&framework=12&originalRevision=a80db7cf21c735363636b11e30f4a6f09009994b
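For reference, a hedged sketch of what keeping the reload measurement accurate could look like for the inspector case (the helper and event names follow the description above; the exact DAMP helpers may differ):

```js
// Hypothetical reload helper: consider the reload "done" once the markup view
// has rebuilt its root and the inspector reports that pending updates settled.
async function reloadAndWaitForInspector(toolbox, reloadPage) {
  const inspector = toolbox.getPanel("inspector");
  const onNewRoot = inspector.once("new-root");
  const onUpdated = inspector.once("inspector-updated");
  await reloadPage();
  await onNewRoot;
  await onUpdated;
}
```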
Comment 9•6 years ago
Comment 10•6 years ago
Depends on D47948
Comment 11•6 years ago
Depends on D47950
Comment 12•6 years ago
Depends on D47951
Comment 13•6 years ago
The inspector custom test is duplicated to have one version with iframes and one version without iframes.
Comment 14•6 years ago
Removing myself from this bug.
It doesn't seem like this can move forward until we have both a server (Bug 1583914) and ContentTask-like APIs in DAMP to communicate with oop iframes.
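For context, the shape of such a ContentTask-like API would presumably resemble SpecialPowers.spawn, which accepts a BrowsingContext and therefore also targets out-of-process iframes (the frame selection and return value below are made up; whether DAMP can reuse SpecialPowers directly is exactly the open question):

```js
// Hypothetical usage: target the first child frame of the current tab and run
// a task in whatever process that frame lives in.
const frameBC = gBrowser.selectedBrowser.browsingContext.children[0];
const paragraphCount = await SpecialPowers.spawn(frameBC, [], () => {
  // "content" refers to the frame's own window inside the spawned task.
  return content.document.querySelectorAll("p").length;
});
```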
Comment 15•6 years ago
Turning this bug into a meta bug so that individual tools can file specific bugs to add specific test scenarios.
Comment 16•6 years ago
Comment on attachment 9099487 [details]
Bug 1589011 - Use iframes in DAMP custom.* tests
Revision D48500 was moved to bug 1589011. Setting attachment 9099487 [details] to obsolete.
Comment 17•6 years ago
Enabling Talos is a prerequisite for enabling Fission in Nightly (M6).
Comment 18•5 years ago
Tracking for Fission M7 Beta. Adding new DevTools perf tests doesn't need to block Fission Nightly.
Comment 19•5 years ago
Adding the dt-fission whiteboard tag to DevTools bugs that mention Fission or block Fission meta bugs but don't already have a dt-fission whiteboard tag.
Comment 20•5 years ago
Julian, are you actively working on this? This is blocking running Talos with Fission-enabled (see https://bugzilla.mozilla.org/show_bug.cgi?id=1585344#c6).
Comment 21•5 years ago
I am not assigned to this and not actively working on it. I don't think this meta should block Bug 1585344.
This bug is about adding more coverage for fission scenarios in our DAMP tests.
I don't know if DAMP is broken when Fission is enabled, but if it is, then we should file a bug to fix that, block Bug 1585344 on it (and track the bug in DevTools' Fission project etc...)
I'll file a placeholder and update the dependencies.
Comment 22•5 years ago
Also note that in general we plan to work on DevTools performance issues with Fission for devtools-fission milestone 3.
Comment 23•5 years ago
DevTools meta bugs don't need to block Fission MVP. However, we will continue to monitor these meta bugs for any new bugs blocking them as potential blockers for Fission MVP.
Comment 24•4 years ago
dt-fission-future bugs don't need to block Fission MVP, so I'm moving them from Fission Milestone MVP to Future.