TypeInference: reading holes is slower with type inference

Status

RESOLVED FIXED
Product: Core
Component: JavaScript Engine
Reported: 7 years ago
Last modified: 7 years ago

People

(Reporter: jandem, Assigned: bhackett)

Tracking

(Blocks: 1 bug)

Firefox Tracking Flags

(Not tracked)

Details

(Whiteboard: fixed-in-jaegermonkey)

Attachments

(1 attachment: Micro-benchmark, 210 bytes, application/x-javascript)
Description (Reporter, 7 years ago)
I noticed this when profiling Kraken's ai-astar.

For the attached test case:
- JM: 87 ms
- JM (inference): 107 ms

When I change "x = 0" to "x = 1" (i.e. reading an int instead of a hole):
- JM: 17 ms
- JM (inference): 13 ms

<bhackett> two ways to fix
<bhackett> 1. do what we do for 3d-raytrace, if testing a property like if (x[i]) assume the result could be undefined, and handle that in the compiler
<bhackett> 2. only add undefined to reads of arrays where undefined has actually been observed
...
<bhackett> we should do both 1. and 2. above
<bhackett> the cost of recompiling more often is less than the cost of less precision
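
The attached micro-benchmark itself is only 210 bytes and is not inlined in this report. As a rough sketch, a loop of the following shape matches the pattern described above; the array size, iteration counts, and variable names are assumptions, only the "x = 0 reads holes, x = 1 reads ints" switch is taken from the description:

// Hypothetical reconstruction of attachment 497781 (Micro-benchmark);
// sizes, counts, and names are assumed, not copied from the attachment.
var a = [];
for (var i = 1; i < 100; i += 2)
  a[i] = i;                       // odd indices hold ints, even indices stay holes

var x = 0;                        // x = 0: the reads below hit holes; x = 1: they hit ints

var start = Date.now();
var sum = 0;
for (var j = 0; j < 100000; j++) {
  for (var i = x; i < 100; i += 2) {
    if (a[i])                     // a hole read pushes undefined here
      sum += a[i];
  }
}
print((Date.now() - start) + " ms");  // print is the SpiderMonkey shell's output function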
Comment 1 (Reporter, 7 years ago)

Created attachment 497781: Micro-benchmark
Blocks: 619425
No longer blocks: 608741
Comment 2 (Assignee, 7 years ago)
I was wrong on this: we actually were marking undefined as a possible result of 'if (a[x])', so we did not recompile here. The problem is that we take a slow path on every hole read out of the array, and that slow path now has to check that the inference knows about the undefined value every time it pushes undefined. This patch fixes that by fast-pathing hole reads out of an array when (a) the pushed value is known to contain undefined, and (b) Array.prototype and Object.prototype do not have indexed properties.

http://hg.mozilla.org/projects/jaegermonkey/rev/6e0795e82953

This doesn't make much difference on ai-astar.  The problem there is that having more type information ends up generating much worse code on object equality; I filed bug 619592 for that issue.
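
For context on condition (b): a hole read is not just a load of undefined, it falls through to the prototype chain, so blindly pushing undefined is only safe while Array.prototype and Object.prototype are known to have no indexed properties. A minimal illustration of the underlying JS semantics (not code from the patch):

var a = [, 1];                    // a[0] is a hole, a[1] is an int
print(a[0]);                      // undefined: nothing on the object or its prototypes

Array.prototype[0] = "from the prototype";
print(a[0]);                      // "from the prototype": the hole read must now
                                  // consult the prototype chain, so a fast path that
                                  // always pushed undefined would be incorrect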
Updated (Assignee, 7 years ago)
Status: ASSIGNED → RESOLVED
Last Resolved: 7 years ago
Resolution: --- → FIXED
Whiteboard: fixed-in-jaegermonkey
Comment 3 (Reporter, 7 years ago)
FWIW, for the attached microbenchmark:

- JM: 87 ms
- JM (inference): 14 ms (was 107 ms)

Nice!