Testcase: creating Map keys is 4x slower than Chrome, and iterating them is 2x slower
Categories
(Core :: JavaScript Engine, task, P3)
People
(Reporter: mayankleoboy1, Unassigned)
References
(Blocks 2 open bugs)
Attachments
(1 file: 900 bytes, text/html)
Description
Default testcase: do both operations.
- Firefox: https://share.firefox.dev/4r7n8go (50s)
- Chrome: https://share.firefox.dev/4kqN2ta (25s)
5,000,000 keys only. Don't do iterations.
- Firefox: https://share.firefox.dev/4kqOE6a (2.5s)
- Chrome: https://share.firefox.dev/4afB8NJ (600ms)
Artificial testcase.
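The attachment itself is not quoted here, so the following is only a sketch of the kind of microbenchmark described; N matches the numbers above, but the key shape and loop bodies are guesses:

// Hypothetical reconstruction of the attached testcase; the real 900-byte
// attachment is not quoted, so key shape and loop bodies are assumptions.
const N = 5_000_000;
const map = new Map();

// "keys only" variant: just create the entries.
for (let i = 0; i < N; i++) {
  map.set("key" + i, i); // string keys are an assumption
}

// "do both operations" variant: also iterate the map afterwards.
let sum = 0;
for (const [key, value] of map) {
  sum += value; // consume entries so the loop isn't dead code
}
console.log(map.size, sum);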
Comment 1•2 days ago
Looks like both versions are dominated by MapObjectSet. In particular, we spend 60% of our time in OrderedHashTableImpl::lookup, of which 2/3 (40% of the total) is just spent comparing values for equality. Hashing is relatively cheap (1% of total time). Rehashing when full is 11% of total time.
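To make that breakdown concrete, here is a schematic sketch of a chained hash-table lookup (my illustration in JS; SpiderMonkey's OrderedHashTableImpl is C++ and differs in detail, and hash/equals here are stand-ins for the engine's key operations):

// Schematic chained lookup: hashing runs once per lookup, while the
// equality check runs once per entry walked in the chain.
const hash = (key) => {
  let h = 0;
  for (const c of String(key)) h = (Math.imul(h, 31) + c.charCodeAt(0)) | 0;
  return h >>> 0; // hashing: once per lookup (~1% of total time above)
};
const equals = (a, b) => a === b; // stand-in for the engine's key equality
function lookup(table, key) {
  const b = hash(key) & (table.buckets.length - 1); // power-of-two bucket count
  for (let e = table.buckets[b]; e !== null; e = e.chain) {
    if (equals(e.key, key)) return e; // once per chain entry (~40% of total time above)
  }
  return null;
}
// Minimal usage:
const table = { buckets: new Array(8).fill(null) };
const insert = (t, key, value) => {
  const b = hash(key) & (t.buckets.length - 1);
  t.buckets[b] = { key, value, chain: t.buckets[b] };
};
insert(table, "a", 1);
console.log(lookup(table, "a").value); // 1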
In V8, the default profile shows that they also spend all their time in Map.set, but they do it all in jitcode. In the larger testcase, they do have to call into C++ to reallocate the map; notably, that reallocation is a larger fraction of their overall time than it is of ours. I think this implies that their Map.set is otherwise more efficient.
I did a quick experiment to see how well our hash function is distributing things. Looking at the distribution of chain lengths on the last rehash, I see:
chain length  count
0             78390
1             196779
2             254839
3             224171
4             150664
5             82527
6             37824
7             15462
8             5487
9             1723
10            515
11            147
12            36
13            9
14            1
15            2
So the most common chain length is 2, and long chains are rare. We're not hitting a pathological case here.
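One way to quantify "not pathological" (my addition, not part of the original comment): with a uniform hash, chain lengths should roughly follow a Poisson distribution with λ = entries / buckets. The counts above sum to 1,048,576 buckets (2^20) holding about 2.8M entries (λ ≈ 2.67), and the Poisson expectation tracks the measured distribution closely:

// Sanity check: compare the observed chain-length counts to the Poisson
// distribution expected from a perfectly uniform hash.
const observed = [78390, 196779, 254839, 224171, 150664, 82527, 37824,
                  15462, 5487, 1723, 515, 147, 36, 9, 1, 2];
const buckets = observed.reduce((a, b) => a + b, 0);        // 1,048,576 = 2^20
const entries = observed.reduce((a, c, k) => a + c * k, 0); // ~2.8M live entries
const lambda = entries / buckets;                           // load factor ≈ 2.67
let p = Math.exp(-lambda);                                  // P(chain length = 0)
for (let k = 0; k < observed.length; k++) {
  console.log(k, observed[k], Math.round(p * buckets));     // length, observed, expected
  p *= lambda / (k + 1);                                    // Poisson recurrence
}

Under that model the mode is indeed at length 2 (P(2) > P(1) and P(2) > P(3) for λ ≈ 2.67), matching the observation above.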
Speeding this up would presumably involve either redesigning our data structures or implementing Map.set in jitcode. Those are both big enough projects that I don't think we should prioritize them unless we see Map.set as a bottleneck in real code.