Closed Bug 570385
Opened 15 years ago • Closed 15 years ago
iteration doesn't detect all deleted properties
Categories
(Core :: JavaScript Engine, defect)
Tracking
RESOLVED FIXED
Tracking | Status
---|---
blocking2.0 | final+
People
(Reporter: igor, Assigned: dmandelin)
References
Details
(Whiteboard: fixed-in-tracemonkey)
Attachments
(1 file)
865 bytes, patch | igor: review+
From bug 569735 comment 27:
The patch does not check for enumerators in DeleteArrayElement for dense arrays. So the output of the following script:
var a = [0, 1];
for (var i in a) {
print(i);
a.pop();
}
is 0, 1, while for
var a = [0, 1];
for (var i in a) {
print(i);
delete a[1];
}
it is just 0. The output is also just 0 if the array in the first example is a slow one, as in:
var a = [0, 1];
a.x = 10;
delete a.x;
for (var i in a) {
print(i);
a.pop();
}
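For comparison, ES5's for-in semantics say that a property deleted before it has been visited is not visited, so under a uniform reading all three loops above would print only 0. A compact way to observe the first divergence in a shell (hypothetical helper, not part of the original report):
// Hypothetical probe: collects the keys actually visited when a[1] is
// removed, via the given mutator, during the first iteration.
function visitedKeys(mutate) {
  var a = [0, 1];
  var seen = [];
  for (var i in a) {
    seen.push(i);
    if (seen.length === 1)
      mutate(a);
  }
  return seen.join(", ");
}
print(visitedKeys(function (a) { a.pop(); }));      // dense array + pop
print(visitedKeys(function (a) { delete a[1]; }));  // dense array + delete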
Comment 1•15 years ago
Dense arrays are special when it comes to iteration. We already broke property enumeration order for them (so does everyone else). I want to make a specialized iterator for them that uses a counter instead of a list of ids, so I really don't want to support deleted-property suppression for dense arrays if the web doesn't need it, and so far there is no indication that it does. IMO, WONTFIX.
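For illustration only, a rough JavaScript sketch (not the actual SpiderMonkey code) of the counter-style iterator being proposed; because it captures only a begin/cursor/end triple and never a snapshot of ids, it has no record of elements deleted after iteration starts:
// Hypothetical model of a counter-based dense-array iterator.
function makeDenseArrayIterator(a) {
  var cursor = 0;
  var end = a.length;  // captured once at the start of iteration
  return {
    next: function () {
      if (cursor >= end)
        return { done: true };
      // A later `delete a[k]` with k >= cursor is simply not noticed here.
      return { done: false, value: String(cursor++) };
    }
  };
}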
Comment 2•15 years ago
How hard would it be, with a specialized iterator, to suppress deletes (whether by pop or by delete; that's just a bug in the current code)? The jsxml.cpp enumeration code uses a counter instead of a list of ids, keeps track of live "cursors" for a given XML array, and seems to demonstrate that no list of ids is needed to cope with delete coherence for a dense array-like.
/be
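A loose JavaScript paraphrase of the cursor bookkeeping described above (the real jsxml.cpp code is C++ and differs in detail): the container keeps a list of live cursors and adjusts them whenever an element is removed, which works because removals compress the storage rather than leaving holes:
// Hypothetical array-like that notifies live cursors on removal.
function CursoredList(items) {
  this.items = items.slice();
  this.cursors = [];  // positions of in-progress iterations
}
CursoredList.prototype.removeAt = function (index) {
  this.items.splice(index, 1);  // compress: no hole left behind
  for (var i = 0; i < this.cursors.length; i++) {
    if (this.cursors[i].pos > index)
      this.cursors[i].pos--;  // keep each cursor on the same next element
  }
};
CursoredList.prototype.iterate = function () {
  var cursor = { pos: 0 };
  this.cursors.push(cursor);  // a real implementation would also unregister it
  var self = this;
  return {
    next: function () {
      if (cursor.pos >= self.items.length)
        return { done: true };
      return { done: false, value: self.items[cursor.pos++] };
    }
  };
};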
Comment 3•15 years ago
If it's a delete "in the future" (beyond the cursor, not under the cursor), you can't express deletion suppression with a simple begin,cursor,end pattern. We could de-specialize to a regular iterator in this case, but again, iteration over numeric properties is a sufficiently different animal. This isn't generic deletion suppression. I really would like to try my luck at getting away with as little insanity here as necessary.
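A concrete (hypothetical) instance of the "delete in the future" case: the element removed below is beyond the cursor, so a bare begin/cursor/end iterator has nothing to consult when deciding whether index 3 should still be reported.
var a = [10, 11, 12, 13];
for (var i in a) {
  print(i);
  if (i == "1")
    delete a[3];  // removes an index the loop has not reached yet
}
// With delete suppression this should print 0, 1, 2;
// a plain counter with no hole check would also report 3.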
Reporter
Comment 4•15 years ago
(In reply to comment #1)
> I want to make a
> specialized iterator for them that uses a counter instead of a list of ids, so
> I really don't want to support deleted property suppression for dense arrays if
> the web doesn't need it and so far there is no indication it does.
Yet the counter would need to check for holes in the dense array, which are precisely the deleted properties. So the counter approach would deviate from Ecma only regarding newly added properties, as it cannot know whether an element was a hole before the iteration started.
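A sketch of the variant described here (again illustrative JavaScript, not engine code): the counter still runs to the original length but skips holes, and since deleting a dense element leaves a hole, deletions are suppressed as a side effect; only properties added during iteration can behave differently from a pure snapshot.
// Hypothetical counter-based iterator with a hole check.
function makeHoleSkippingIterator(a) {
  var cursor = 0;
  var end = a.length;
  return {
    next: function () {
      // A deleted element is a hole, so it gets suppressed for free.
      while (cursor < end && !a.hasOwnProperty(cursor))
        cursor++;
      if (cursor >= end)
        return { done: true };
      return { done: false, value: String(cursor++) };
    }
  };
}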
Comment 5•15 years ago
Deletion from arrays is exceedingly rare, if I recall my shavarray instrumentation correctly. Deleting from an array under iteration would seem even more so; IMO we should close this WONTFIX and see if it ever affects the web. We can reopen if it does.
Comment 6•15 years ago
There are two things to avoid losing while we chop up cases in the bugzilla mix-master:
1. Sane semantics we can propose to Ecma TC39.
2. Backward compatibility.
For (1), at the last TC39 meeting we proposed the Snapshot model, _sans_ delete suppression, and said we would try it out in Firefox nightlies. We've now backed off on eliminating delete suppression, but only for the direct object, and here not for dense arrays.
While delete suppression may be required for direct properties of some user-defined objects on the Web, this latest exception for dense arrays does not look like a good proposal for the standard.
Leaving things underspecified selectively is possible but messy if we don't do it thoughtfully. For example, if we dealt with all arraylikes, or even all indexed properties, uniformly, that would be better. An implementation detail like when a dense array becomes slow is a poor determinant of either over- or selective under-specification.
For (2) we could roll the dice. It might be we seem to get away with it, and only find out much later what broke. Or it might be that nothing breaks.
Let's talk about (1) a bit before WONTFIXing.
/be
Comment 7•15 years ago
(In reply to comment #3)
> If its a delete "in the future" (beyond cursor, not under the cursor), you
> can't express deletion suppression with a simple begin,cursor,end pattern.
True -- E4X child removal does compress, in contrast. Holey arrays suck! OTOH, why can't we just slowify on delete? Do benchmarks or important perf-sensitive programs that use dense arrays ever delete from them?
/be
Comment 8•15 years ago
Always excluding indexed properties sounds pretty good to me.
Comment 9•15 years ago
(In reply to comment #8)
> Always excluding indexed properties sounds pretty good to me.
From delete suppression, you must mean...
/be
Reporter
Comment 10•15 years ago
(In reply to comment #5)
> Deletion from arrays is exceedingly rare, if I recall my shavarray
> instrumentation correctly. Deleting from an array under iteration would seem
> even more so; IMO we should close this WONTFIX and see if it ever affects the
> web. We can reopen if it does.
Currently on the TM tip, fast arrays and slow arrays behave differently in this regard. That is, a fast array does not suppress deleted properties while a slow one does. At the least, we should make the array behavior uniform: suppression or not.
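The asymmetry is observable from script alone; a minimal probe (hypothetical helper name), following the third example in the description to make the array slow:
function keysVisitedDuringPop(slowify) {
  var a = [0, 1];
  if (slowify) {
    a.x = 10;     // de-densify the array, as in the description above
    delete a.x;
  }
  var seen = [];
  for (var i in a) {
    seen.push(i);
    a.pop();
  }
  return seen.join(", ");
}
print(keysVisitedDuringPop(false));  // fast array on the TM tip: "0, 1"
print(keysVisitedDuringPop(true));   // slow array: "0"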
Updated•15 years ago
blocking2.0: ? → final+
Assignee
Comment 11•15 years ago
A related bug bit us compatibility-wise in bug 595963, so it seems worth fixing. And I think that bug makes it pretty easy.
Reporter
Updated•15 years ago
Attachment #492004 - Flags: review?(igor) → review+
Assignee
Comment 12•15 years ago
Whiteboard: fixed-in-tracemonkey
Comment 13•15 years ago
Status: ASSIGNED → RESOLVED
Closed: 15 years ago
Resolution: --- → FIXED