Open
Bug 1495752
Opened 6 years ago
Updated 2 years ago
chromeperf page janks hard on loading (inserting 2k+ elements with the same ID on the document)
Categories
(Core :: DOM: Core & HTML, defect, P2)
Tracking
Status: NEW
People
(Reporter: mayankleoboy1, Unassigned)
Details
User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0
Build ID: 20181002100236
Steps to reproduce:
Go to https://chromeperf.appspot.com/group_report?rev=489898
Actual results:
Massive jank after the load finishes. Memory also climbs to 800 MB.
https://perfht.ml/2DQReS4
Expected results:
No such jank or memory growth.
Comment 1•6 years ago
CCing Olli since I see nsINode::ComputeIndexOf in the profile.
Comment 2•6 years ago
The call stack is clearly something in the DOM...
It looks like there are lots of elements with the same ID, and thus we need to compare their tree positions in order to insert each one at the right place in the list, which can be very slow.
I guess we can probably try to do this lazily instead: when an ID query hits the map, we scan the tree and cache the element. When an element with an ID in the table is bound, or an element in the table is unbound, we invalidate that item.
That would probably be more performant than trying to keep the table up-to-date whenever we bind / unbind elements.
Component: Layout → DOM: Core & HTML
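The scheme comment 2 proposes could look roughly like the sketch below. This is purely illustrative: Node, ScanSubtree, IdCache, Lookup and Invalidate are invented names, not the actual Gecko DOM types, and a real implementation would also have to handle shadow trees, document order, and so on.

#include <string>
#include <unordered_map>
#include <vector>

struct Node {
  std::string id;
  Node* parent = nullptr;
  std::vector<Node*> children;
};

// Pre-order walk for the first element carrying the requested id.
static Node* ScanSubtree(Node* root, const std::string& id) {
  if (!root) return nullptr;
  if (root->id == id) return root;
  for (Node* child : root->children) {
    if (Node* found = ScanSubtree(child, id)) return found;
  }
  return nullptr;
}

class IdCache {
 public:
  explicit IdCache(Node* root) : mRoot(root) {}

  // getElementById-style lookup: scan lazily on the first query, then serve
  // from the cache until an element with this id is bound or unbound.
  Node* Lookup(const std::string& id) {
    auto it = mCache.find(id);
    if (it != mCache.end()) return it->second;
    Node* found = ScanSubtree(mRoot, id);
    mCache[id] = found;  // cache misses too, so repeated lookups stay cheap
    return found;
  }

  // Called when an element with this id enters or leaves the tree: just drop
  // the cached entry instead of maintaining a tree-ordered list of duplicates.
  void Invalidate(const std::string& id) { mCache.erase(id); }

 private:
  Node* mRoot;
  std::unordered_map<std::string, Node*> mCache;
};

With this shape, inserting 2342 elements sharing one id costs 2342 cheap cache erasures rather than 2342 tree-position comparisons.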
Comment 3•6 years ago
(In reply to Xidorn Quan [:xidorn] UTC+10 from comment #2)
> That would probably be more performant than trying to keep the table
> up-to-date whenever we bind / unbind elements.
That just doesn't work if we want querySelector / getElementById / etc. to stay fast. If you can't guarantee that the table is up-to-date, you need to scan the whole tree all the time.
Comment 4•6 years ago
That's where the invalidation comes in. We don't need to rescan the tree unless the cached item has been invalidated, and we only need to invalidate it in very few cases. In those cases we already try to insert the element into the table anyway, so the invalidation itself wouldn't make anything slower.
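To make that concrete, here is a hypothetical continuation of the sketch after comment 2 (BindElement / UnbindElement are invented helpers, not real Gecko entry points): the only places that need to touch the cache are the bind/unbind paths, which already consult the id table today, so no extra tree walks are introduced there.

#include <algorithm>  // std::remove

void BindElement(IdCache& cache, Node* parent, Node* child) {
  parent->children.push_back(child);
  child->parent = parent;
  if (!child->id.empty()) {
    // The cached element for this id (if any) may no longer be the first one
    // in tree order, so forget it; the next Lookup() rescans.
    // A real implementation would also invalidate ids found anywhere in the
    // bound subtree, not just on its root.
    cache.Invalidate(child->id);
  }
}

void UnbindElement(IdCache& cache, Node* parent, Node* child) {
  auto& kids = parent->children;
  kids.erase(std::remove(kids.begin(), kids.end(), child), kids.end());
  child->parent = nullptr;
  if (!child->id.empty()) {
    // Same story on removal: invalidate rather than rescan eagerly.
    cache.Invalidate(child->id);
  }
}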
Comment 5•6 years ago
There are three duplicated IDs:
container => 2 times
spinnerContainer => 2342 times
toasts => 2 times
Comment 6•6 years ago
Oh, maybe there is some misunderstanding. By the last paragraph of comment 2, I meant that it should be more performant than trying to keep the table comprehensive. But we do need to keep it up-to-date for existing items.
Comment 7•6 years ago
Ah, I see what you mean. I just took a look: what Blink does for this is order the elements lazily, which is somewhat similar to what you're proposing:
https://cs.chromium.org/chromium/src/third_party/blink/renderer/core/dom/tree_ordered_map.h
It doesn't make me happy that all our querySelector and such methods can mutate that under the hood, but I guess it can be fine...
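As a rough illustration of the Blink approach in the linked tree_ordered_map.h (again with invented names, and reusing the Node and ScanSubtree helpers from the sketch after comment 2): each id maps to a count plus a lazily recomputed "first element in tree order", so adding 2k+ duplicates never compares tree positions eagerly.

#include <string>
#include <unordered_map>

struct Entry {
  Node* first = nullptr;  // first matching element in tree order; null = stale
  int count = 0;          // how many elements currently carry this id
};

class TreeOrderedIdMap {
 public:
  explicit TreeOrderedIdMap(Node* root) : mRoot(root) {}

  void Add(const std::string& id, Node* element) {
    Entry& e = mEntries[id];
    ++e.count;
    // With duplicates, keeping "first" accurate would require a tree-order
    // comparison, so mark it stale instead and recompute on the next Get().
    e.first = (e.count == 1) ? element : nullptr;
  }

  void Remove(const std::string& id, Node* element) {
    auto it = mEntries.find(id);
    if (it == mEntries.end()) return;
    Entry& e = it->second;
    if (--e.count == 0) {
      mEntries.erase(it);
    } else if (e.first == element) {
      e.first = nullptr;  // stale; the next Get() rescans
    }
  }

  // getElementById: the tree walk only happens here, and only when the
  // cached element went stale.
  Node* Get(const std::string& id) {
    auto it = mEntries.find(id);
    if (it == mEntries.end()) return nullptr;
    Entry& e = it->second;
    if (!e.first) e.first = ScanSubtree(mRoot, id);
    return e.first;
  }

 private:
  Node* mRoot;
  std::unordered_map<std::string, Entry> mEntries;
};

This is what comment 7 means by querySelector and friends mutating the map under the hood: Get() is logically a read-only query, but it repopulates the stale cache entry as a side effect.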
Updated•6 years ago
Summary: chromeperf page janks hard on loading → chromeperf page janks hard on loading (inserting 2k+ elements with the same ID on the document)
Updated•6 years ago
Priority: -- → P2
Comment 8•6 years ago
Aha, for the AddIdElement case we can probably reuse the existing local-cache setup.
Comment 9•6 years ago
Er, hmm, no, that doesn't really work here.
Reporter
Comment 10•6 years ago
See also bug 1464632, which appears to have a similar-ish profile.
See Also: → 1464632
Updated•2 years ago
Severity: normal → S3