Bug 688852 (Closed): opened 13 years ago, closed 8 years ago

Investigate integer overflow in array_concat

Categories: Core :: JavaScript Engine (defect)
Version: 6 Branch
Platform: x86 / All
Priority: Not set
Severity: normal
Status: RESOLVED WORKSFORME
People: Reporter: bsterne; Assigned: terrence
References: Blocks 1 open bug
Attachments: 4 files, 7 obsolete files

Fabian Yamaguchi of Recurity Labs on behalf of BSI (German Federal Office for Information Security) reported the following to security@mozilla.org:

Function: array_concat

Component:
The bug is contained in the JavaScript array support and in particular in function array_concat.

Summary:
The length field of a maximum-sized JavaScript array overflows when joining it with another array using the concat method.

Overview:
JavaScript arrays have a maximum size of 4294967295, and accordingly an unsigned 32-bit integer is used to store the array length field. When using the concat method to join an array of maximum size with another array, the length field overflows, yielding a much smaller array. While this does not automatically lead to a vulnerability, it is an unexpected result and should thus be fixed. Note that creating an array of this size does not automatically lead to the allocation of a consecutive area of memory of the corresponding size, and therefore the bug can be triggered on 32-bit as well as 64-bit platforms.
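The wraparound described above is ordinary uint32 arithmetic and can be sketched in plain JavaScript (a hand-rolled illustration, not engine code: `>>> 0` applies the same ToUint32 truncation that a uint32 length field would):

```javascript
// Sketch of the reported wraparound, not SpiderMonkey internals:
// adding past 2**32 - 1 and truncating to uint32 wraps to a tiny value.
var MAX_LEN = 4294967295;      // 2**32 - 1, the maximum array length
var exact = MAX_LEN + 2;       // exact double arithmetic: 4294967297
var wrapped = exact >>> 0;     // ToUint32 truncation: 4294967297 mod 2**32
console.log(wrapped);          // 1 -- the "much smaller array" length
```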

Steps to Reproduce:
Load the following HTML file in a browser:
  <script>
  N = 4294967295;
  a = new Array(N);
  b = new Array(2);
  c = a.concat(b);
  alert(c.length);
  </script>

This will output a length of 2 for the array c.
This looks like a reasonable way to get familiar with JS idioms for working with values, objects, etc.  It seems like a nice thing for Terrence to start with, then -- also a chance to gain familiarity with the ECMAScript spec steps, perhaps.

Ordinarily for these sorts of things I'm a fan of reordering to indicate exact steps followed, but since the nature of this (integer overflow, even if not a security vulnerability) is a bit dodgier than most of these, we probably should fix the integer overflow here, then rewrite in stepwise order (which looks pretty straightforward at a glance) in a new bug.
Attached patch WIP: revamp array_concat (obsolete) — Splinter Review
Assignee: general → tcole
Status: NEW → ASSIGNED
Attachment #562909 - Flags: feedback?(jwalden+bmo)
Results so far:
Opera: overflows (same behaviour as current firefox)
Chrome: clips to 2**32-1, does not throw
More Results:
IE: clips to 2**32-1, does not throw (same as Chrome)
This test also scans what is present in the array after the last concat.

The behaviour I expect for strict conformance to the spec is:  
 - The variable "n" in the spec should not overflow at uint32 bounds.
 - Once it goes over uint32 size, it is no longer an "element" key, it is a property.
 - Assignment to the "length" property "saturates" at the maximum length value, UINT32_MAX; e.g. A.setOwnProperty('length', MIN(n, UINT32_MAX))
 - Assignment to "length" of a value smaller than the last element index will remove all *elements* in the array at positions greater than or equal to the new length.  Since values in the array at positions greater than or equal to UINT32_MAX are not elements, but are properties, they are not affected.  Thus, a concat that overflows should leave the length at the max length, but all of the concat'd elements/properties should still be in the array, even though some are >= the length that is set on the array.
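The last point is observable today without any concat at all (a minimal sketch; indexes chosen to straddle the array-index limit): shrinking the length removes elements, but a write at 2**32 - 1 is a named property and survives.

```javascript
// Elements vs. properties under a length write: index 0 is an element;
// 4294967295 (2**32 - 1) is too large to be an array index, so it is a
// plain named property that a shrinking length assignment ignores.
var a = [];
a[0] = 'x';                  // element
a[4294967295] = 'y';         // named property, not an element
a.length = 0;                // deletes elements >= the new length
console.log(a[0]);           // undefined -- element removed
console.log(a[4294967295]);  // 'y' -- property untouched
```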

All of the major browsers currently do something unique and interesting:

Opera 11.51 overflows n to a small value, does not remove elements at >= length, and does not keep (or presumably ever set) properties >= UINT32_MAX:
C.length: 1
C[len-2]: a
C[len-1]: b
C[len+0]: undefined
C[len+1]: undefined
C[len+2]: undefined

Firefox nightly overflows and truncates elements, but leaves the properties alone as expected:
C.length: 1
C[len-2]: undefined
C[len-1]: undefined
C[len+0]: c
C[len+1]: undefined
C[len+2]: undefined

Chromium 13.0.782.215 does not overflow n, but does not continue assigning properties either:
C.length: 4294967295
C[len-2]: a
C[len-1]: b
C[len+0]: undefined
C[len+1]: undefined
C[len+2]: undefined

IE 8 throws and does not perform the concat.

IE 9 has the "correct" behaviour as I have interpreted it above.
C.length: 4294967295
C[len-2]: a
C[len-1]: b
C[len+0]: c
C[len+1]: d
C[len+2]: undefined
Attachment #562919 - Attachment is obsolete: true
I also have a different patch converting array_concat to CallArgs as we discussed, but I will save that until after we get this fixed.
Attachment #562909 - Attachment is obsolete: true
Attachment #562909 - Flags: feedback?(jwalden+bmo)
Attachment #563173 - Flags: feedback?(jwalden+bmo)
Does not appear to be a security bug, unhiding.
Group: core-security
Same semantics as minimal version, but cleans up the rest of array_concat at the same time.
Attachment #563442 - Flags: feedback?(jwalden+bmo)
(In reply to Terrence Cole [:terrence] from comment #5)
> The behaviour I expect for strict conformance to the spec is:  
>  - The variable "n" in the spec should not overflow at uint32 bounds.

Per 5.2 Algorithm Conventions, all math in the spec text "should always be understood as computing exact mathematical results on mathematical real numbers".  So when the spec algorithm looks like it increments n by 1, it always increments n by 1, even past 2**32 - 1.
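That point is easy to see in script, since a JS double tracks a mathematical integer exactly well past 2**32 (up to 2**53) — a quick sketch:

```javascript
// The spec's n is a mathematical integer; double arithmetic represents
// it exactly past 2**32, so no wraparound occurs in spec terms.
var n = 4294967295;   // 2**32 - 1
n += 2;
console.log(n);       // 4294967297 -- exact, no truncation
```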

>  - Once it goes over uint32 size, it is no longer an "element" key, it is a
> property.

ECMAScript only has one kind of structural property.  Some property names are array indexes, true -- those are all integers in the range [0, 2**32 - 2].  (Why -2?  Because array lengths fit in a uint32_t, so for the max 2**32 - 1 length, the maximum index can be 2**32 - 2.)  But there's no difference among properties, except when some algorithm or another chooses to impose a distinction.
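The index range is directly observable (a small sketch): writing the largest valid array index grows the length, while one past it is an ordinary property and leaves the length alone.

```javascript
// 2**32 - 2 is the largest valid array index; writing it grows length
// to 2**32 - 1. One past that is a plain property: length is unmoved.
var a = [];
a[4294967294] = 'last element';    // index 2**32 - 2
console.log(a.length);             // 4294967295
a[4294967295] = 'just a property'; // 2**32 - 1: not an array index
console.log(a.length);             // still 4294967295
```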

>  - Assignment to the "length" property "saturates" at the maximum length
> value, UINT32_MAX; e.g. A.setOwnProperty('length', MIN(n, UINT32_MAX))

Perhaps bizarrely, because all math acts on exact mathematical results, this isn't what happens: rather, the assignment really should...well, wait!  The spec algorithm doesn't even assign to the length property.  (Although defining the elements might, if |this| was an Array or a Proxy with a defineProperty hook that did so.)  So this definitely isn't the case.

>  - Assignment to "length" of a value smaller than the last element will
> remove all *elements* in the array larger than the length.  Since values in
> the array at position greater than and equal to UINT32_MAX are not elements,
> but are properties, they are not affected.  Thus, concat that overflows
> should leave the length at the max length, but all of the concat'd
> elements/properties should still be in the array, even though some are >=
> the length that is set on the array.

This is indeed the case.  (Crazy, I know.)
Comment on attachment 563173 [details] [diff] [review]
Minimal patch updating SM to the expected behaviour.

I think just changing the |length| type to jsdouble might do the trick (without the length-setting change).  That said, I'm not sure what semantics we want for a spot-fix here.  Since I guess we're not caring about this as a security matter, how about we just do a full stepwise fix instead, and forget doing a spot-fix?
Attachment #563173 - Flags: feedback?(jwalden+bmo)
Comment on attachment 563442 [details] [diff] [review]
Less minimal -- includes cleanups to array_concat.

Aside from the semantic differences from the spec algorithm, when we've been updating our implementations lately we've been following the format of /* Step 1. */ steps; /* Step 2. */ steps; etc.  (If we do steps out of order as an optimization, then the progression indicated by comments may not be exactly sequential.)  See js_fun_apply in jsfun.cpp for an example.  That's what you should be doing here when you update things.  It makes it much easier to see that the spec step and our implementation of it are isomorphic.
Attachment #563442 - Flags: feedback?(jwalden+bmo)
If we're rewriting, we should probably try to sweep up bug 609896 at the same time.  I'd say, since you're still learning here, you should do the spec steps to start in a first patch, then in a patch atop that implement dense-array optimizations.
Blocks: 609896
We missed one other edge case:  since the algorithm never explicitly writes to length and does not write back holes, concatting an array like "new Array(50)" will not update the original array length.
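The edge case shows up with a small sparse argument too (a runnable sketch): concat copies no holes, so the result's length is only right because the algorithm writes it explicitly at the end.

```javascript
// concat never writes the holes of Array(50) into the result, so the
// final length must come from an explicit length write, not from the
// highest element actually stored.
var c = [1].concat(new Array(50));
console.log(c.length);   // 51
console.log(1 in c);     // false -- position 1 is still a hole
```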

The new attachment is a first pass WIP, without dense array optimizations, except for the first element, which already existed and I didn't want to tear out.
Attachment #563442 - Attachment is obsolete: true
Attached patch v3 (obsolete) — Splinter Review
Applied style nits from other array reviews, rebased to current tree, and fixed the iteration constraints.  This should be getting very close.
Attachment #565294 - Attachment is obsolete: true
Attachment #570404 - Flags: review?(jwalden+bmo)
Comment on attachment 570404 [details] [diff] [review]
v3

Review of attachment 570404 [details] [diff] [review]:
-----------------------------------------------------------------

I got a little into this with comments, then I hit the main trick, and I had to stop for a second.

The code to special-case a dense-array |this| is fairly confusing here.  And it imposes some overhead on every iteration of the concat loop.  I think we could structure this code more simply using a do-while loop, and looping over this plus arguments in the CallArgs themselves.  CallArgs or CallReceiver would need to grow a |Value* thisAndArgs() const| method.  Prior to the start of the do-while you'd need to decide whether to do the special |this| handling, bumping the index into this-and-args if so.  And you'd have to loop to |i <= args.length()|.  But I think it would be clearer, and it'd keep the |this| handling out of the main body of the algorithm.

If you're not quite sure what I'm requesting here, please poke me and I can sketch it out in person.

::: js/src/jsarray.cpp
@@ +3032,5 @@
>      result->setDenseArrayLength(len);
>  }
>  #endif /* JS_METHODJIT */
>  
> +/* ES5 15.4.4.4 */

/* ES5 15.4.4.4. */ (note the period at the end)

@@ +3045,3 @@
>          return false;
>  
> +    /* Step 2 & 3. */

/* Steps 2-3. */

@@ +3066,5 @@
> +        TryReuseArrayType(obj, arr);
> +        /*
> +         * Note: thisLen is conceptually 'n' here, but would result in an
> +         * undefined cast from double to uint32 if we used it directly.
> +         */

I don't understand this comment.
Attachment #570404 - Flags: review?(jwalden+bmo) → review-
I've added dense array optimizations to this to generalize the existing ones.  V8 shows no change, but SS and Kraken both have some mild speedups.

SunSpider: 1.032x as fast    1250.7ms +/- 0.1%   1211.6ms +/- 0.0%     significant
Kraken:    1.039x as fast    63572.0ms +/- 0.0%   61179.2ms +/- 0.0%     significant

I've marked this as WIP, since Kraken has a few largish slowdowns with this change that I'd like to investigate before signing off on it.
Attachment #570404 - Attachment is obsolete: true
Please disregard those perf numbers.  I failed to set the --args properly when running these tests.
Attached patch v5: rewritten (obsolete) — Splinter Review
I totally forgot about this.

The new attachment is a simple rewrite of Array.prototype.concat into the latest spec steps. It also fixes the overflow along the way. Given that I wasn't able to show a performance improvement with the previous patch, I think we should be fine with the existing, limited optimization, which makes this pretty simple.
Attachment #697738 - Flags: review?(jwalden+bmo)
Comment on attachment 697738 [details] [diff] [review]
v5: rewritten

Review of attachment 697738 [details] [diff] [review]:
-----------------------------------------------------------------

This needs a test for the first issue, definitely.  The second one, too -- although I bet existing tests catch it, that's just too obvious to not be tested.  :-)  I don't think you can write one for the third issue, at least not without waiting O(2**32) time the way we're implementing this.  (Maybe once we have separate sparse element support we can make this doable, if sparse elements are represented in sorted order.)

::: js/src/jsarray.cpp
@@ +2800,5 @@
>          return false;
>  
> +    /* Step 5. */
> +    size_t nitems = args.length() + 1;
> +    const Value *items = args.thisv().address();

This is making some assumptions about how args/this are laid out -- please add a thisAndArgs() to CallArgs as I mentioned in an earlier comment that can move these assumptions to where they belong.

And, hmm.  Here's a fun testcase that reusing the CallArgs-wrapped addresses causes (because it's using the raw |this|, not the potentially-a-boxed-primitive produced by step 1):

  typeof (Array.prototype.concat).call("foo")[0]

That should be "object", but with this patch it's "string".  Somewhat interestingly IE and Opera (and us, right now, and with this patch) have this bug, while Nitro/V8 do not.

@@ +2852,5 @@
> +                JSBool exists;
> +                if (!GetElement(cx, elemObj, k, &exists, &subElement))
> +                    return false;
> +
> +                if (!exists && !SetArrayElement(cx, arr, n + k, subElement))

Erm, shouldn't that be |exists|?

@@ +2868,5 @@
> +    /* Step 9. */
> +    args.rval().setObject(*arr);
> +
> +    /* Step 7-8. */
> +    return SetLengthProperty(cx, arr, Min(n, (uint64_t)UINT32_MAX));

This isn't actually correct -- my reading of ArraySetLength in es6 is that it should throw if ToUint32(n) != ToNumber(n), which would be the case if n > UINT32_MAX.  I think you want to pass |n| directly here (or, rather, double(n) to avoid a possible-truncation warning, I think).
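The ArraySetLength behavior referenced here is observable from script (a small sketch): assigning a length for which ToUint32 and ToNumber disagree throws a RangeError rather than silently truncating.

```javascript
// Per ArraySetLength, a length where ToUint32(v) !== ToNumber(v) --
// e.g. 2**32 -- must throw a RangeError instead of truncating.
var a = [];
var threw = false;
try {
    a.length = 4294967296;   // 2**32
} catch (e) {
    threw = e instanceof RangeError;
}
console.log(threw);   // true
```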
Attachment #697738 - Flags: review?(jwalden+bmo) → review-
Terrence, I just ran into this bug again, in the course of unrelated fixing that exposed this wrongness.  Any chance you could push this through the rest of the way?
Once more unto the breach.

I think this addresses everything from the reviews with the exception that I switched the while loop to a for loop instead of a do-while loop.

We cannot quite follow spec steps in the latest draft since we haven't yet fleshed out the concept of exotic array objects and spreadability. This does as well as we can without extensive further work.
Attachment #697738 - Attachment is obsolete: true
Attachment #809319 - Flags: review?(jwalden+bmo)
Comment on attachment 809319 [details] [diff] [review]
spec_steps_Array_prototype_concat-v6.diff

Review of attachment 809319 [details] [diff] [review]:
-----------------------------------------------------------------

concat-overflow.js looks like it's testing a bunch of ES6-ish semantics.  But as I think we should just do ES5 for now, until this bit of ES6 stabilizes, it looks like a bit of it should go away, or so.  I didn't look too closely once I saw that, so consider concat-overflow.js unreviewed.

::: js/public/CallArgs.h
@@ +338,5 @@
>      }
>  
> +    /*
> +     * Returns thisv and args in a single array, as required by
> +     * Array.prototype.concat.

I'd get rid of "as required..." and have "(that is, a pointer to |this| followed by any provided arguments)", to be slightly clearer.  It's kind of a layering violation to mention concat here, so let's not do that.

@@ +340,5 @@
> +    /*
> +     * Returns thisv and args in a single array, as required by
> +     * Array.prototype.concat.
> +     */
> +    const Value *thisAndArgs() const {

Let's return a RangedPtr<const Value> instead that includes argc_ + 1 as a bound in it.

::: js/src/jit-test/tests/arrays/concat-overflow.js
@@ +1,4 @@
> +/* Overflow on first appended item. */
> +threw = false;
> +a = new Array(4294967295);
> +b = new Array(1);

Declare all these at top, with a single |var threw, a, b;| if you don't want to hurt reading-similarity.

::: js/src/jit-test/tests/arrays/concat-use-correct-this.js
@@ +1,1 @@
> +print(typeof (Array.prototype.concat).call("foo")[0])

assertEq(Array.prototype.concat.call("foo")[0] instanceof String, true);

::: js/src/jsarray.cpp
@@ +2477,3 @@
>  /*
>   * Python-esque sequence operations.
>   */

Get rid of this comment.  It was only sort of relevant before these algorithms were added to the spec, but, well, that was last millennium at this point.  What matters is the spec location of this thing, now:

/* ES5 15.4.4.4. */

@@ +2488,3 @@
>          return false;
>  
> +    /* Step 7-8. */

I think it confuses more than it helps for this to use ES6 step-numbering, but not actually implement the ES6 algorithm.  Just use the ES5 numbering and algorithm; whenever ES6 is finalized we can redo this in that formulation.

@@ +2489,5 @@
>  
> +    /* Step 7-8. */
> +    double n = 0;
> +    size_t nitems = args.length() + 1;
> +    const Value *items = args.thisAndArgs();

RangedPtr<const Value> items

@@ +2495,5 @@
> +    /* Iterate the modified |this| and not the original. */
> +    args.setThis(ObjectValue(*obj));
> +
> +    /*
> +     * Steps 5. This may alse inline the first iteration of Step 6 if it is

step 2

@@ +2501,5 @@
> +     */
> +    Rooted<ArrayObject*> arr(cx);
> +    if (obj->is<ArrayObject>() && !obj->isIndexed()) {
> +        uint32_t initlen = obj->getDenseInitializedLength();
> +        arr = NewDenseCopiedArray(cx, initlen, obj, 0);

Hmm, this looks like a pre-existing bug: NewDenseCopiedArray does a literal copy from dense elements, but the spec algorithm says to call [[HasProperty]], then [[Get]].  A testcase like this should show the difference:

  Object.prototype[0] = "foo";
  assertEq([/* hole */, 5].concat().hasOwnProperty("0"), true);

...and indeed does.

This looks easily fixable: just make the condition |obj->is<ArrayObject>() && ObjectMayHaveExtraIndexedProperties(obj)|.  Add a test, of course.

@@ +2528,4 @@
>          if (!JS_CHECK_OPERATION_LIMIT(cx))
>              return false;
> +
> +        if (IsObjectWithClass(elem, ESClass_Array, cx)) {

/* Step 5b. */

@@ +2530,5 @@
> +
> +        if (IsObjectWithClass(elem, ESClass_Array, cx)) {
> +            elemObj = &elem.toObject();
> +
> +            uint32_t len;

/* Step 5b(ii). */

@@ +2534,5 @@
> +            uint32_t len;
> +            if (!GetLengthProperty(cx, elemObj, &len))
> +                return false;
> +
> +            for (uint32_t k = 0; k < len; ++k) {

/* Steps 5b(i), 5b(iii). */

@@ +2543,5 @@
> +                if (!JSObject::getElementIfPresent(cx, elemObj, elemObj, k, &subElement, &exists))
> +                    return false;
> +
> +                if (exists && !SetArrayElement(cx, arr, n + k, subElement))
> +                    return false;

I'm not quite convinced SetArrayElement is the right thing here.  Looking at it, it's not entirely clear *what* its semantics exactly are.  The slow-path is equivalent to [[Put]]; the fast-path is equivalent to [[DefineOwnProperty]].  And it looks like we have callers that expect each of those different behaviors, so there's not really a fix for it.

So how about this: copy SetArrayElement to a new method, DefineArrayElement, and fix up the trailing setGeneric into a defineGeneric.  Put a comment along these lines next to it, so it's exactly clear what it does:

/**
 * Calls [[DefineOwnProperty]] on the given object to create a writable,
 * enumerable, configurable data property with the given value.  Throw a
 * TypeError if the definition would create a conflict.
 */

Then use DefineArrayElement here, and file a followup to fix all the callers of SetArrayElement to something proper, and to fix SetArrayElement to whatever it should be.

Oh, and a testcase that we fail because SetArrayElement isn't the right thing here (or isn't right, one or the other, probably both):

  var bigarr = [];
  bigarr[5000] = 17;
  Object.defineProperty(Object.prototype, 5000, { set: function() { throw 42; } });
  assertEq([].concat(bigarr).length, 5001);

@@ +2548,5 @@
>              }
> +            n += len;
> +        } else {
> +            if (!SetArrayElement(cx, arr, n, elem))
> +                return false;

DefineArrayElement here, too.

@@ +2557,5 @@
> +    /* Step 12. */
> +    args.rval().setObject(*arr);
> +
> +    /* Step 10-11. */
> +    return SetLengthProperty(cx, arr, n);

This isn't in the ES5 algorithm; let's wait for ES6 to pick it up.
Attachment #809319 - Flags: review?(jwalden+bmo) → review-
(In reply to Jeff Walden [:Waldo] (remove +bmo to email) from comment #22)
> Comment on attachment 809319 [details] [diff] [review]
> spec_steps_Array_prototype_concat-v6.diff
> 
> Review of attachment 809319 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> concat-overflow.js looks like it's testing a bunch of ES6-ish semantics. 
> But as I think we should just do ES5 for now, until this bit of ES6
> stabilizes, it looks like a bit of it should go away, or so.  I didn't look
> too closely once I saw that, so consider concat-overflow.js unreviewed.

Fixing. I think I'm going to keep the getters/setters tests so that when we implement ES6 semantics some of these tests will still be useful.

> ::: js/public/CallArgs.h
> @@ +340,5 @@
> > +    /*
> > +     * Returns thisv and args in a single array, as required by
> > +     * Array.prototype.concat.
> > +     */
> > +    const Value *thisAndArgs() const {
> 
> Let's return a RangedPtr<const Value> instead that includes argc_ + 1 as a
> bound in it.

Excellent idea!
 
> ::: js/src/jit-test/tests/arrays/concat-overflow.js
> @@ +1,4 @@
> > +/* Overflow on first appended item. */
> > +threw = false;
> > +a = new Array(4294967295);
> > +b = new Array(1);
> 
> Declare all these at top, with a single |var threw, a, b;| if you don't want
> to hurt reading-similarity.

I've split these all up into their own self-contained functions, like I should have originally.
 
> ::: js/src/jit-test/tests/arrays/concat-use-correct-this.js
> @@ +1,1 @@
> > +print(typeof (Array.prototype.concat).call("foo")[0])
> 
> assertEq(Array.prototype.concat.call("foo")[0] instanceof String, true);

D'oh! Looks like I forgot to qref that change.

> ::: js/src/jsarray.cpp
> @@ +2501,5 @@
> > +     */
> > +    Rooted<ArrayObject*> arr(cx);
> > +    if (obj->is<ArrayObject>() && !obj->isIndexed()) {
> > +        uint32_t initlen = obj->getDenseInitializedLength();
> > +        arr = NewDenseCopiedArray(cx, initlen, obj, 0);
> 
> Hmm, this looks like a pre-existing bug: NewDenseCopiedArray does a literal
> copy from dense elements, but the spec algorithm says to call
> [[HasProperty]], then [[Get]].  A testcase like this should show the
> difference:
> 
>   Object.prototype[0] = "foo";
>   assertEq([/* hole */, 5].concat().hasOwnProperty("0"), true);
> 
> ...and indeed does.
> 
> This looks easily fixable: just make the condition |obj->is<ArrayObject>()
> && ObjectMayHaveExtraIndexedProperties(obj)|.  Add a test, of course.

Good catch, although I think you want a ! in there. Test added and now passing.

> @@ +2543,5 @@
> > +                if (!JSObject::getElementIfPresent(cx, elemObj, elemObj, k, &subElement, &exists))
> > +                    return false;
> > +
> > +                if (exists && !SetArrayElement(cx, arr, n + k, subElement))
> > +                    return false;

<snip>

> Then use DefineArrayElement here, and file a followup to fix all the callers
> of SetArrayElement to something proper, and to fix SetArrayElement to
> whatever it should be.

Bug 922301.
 
> Oh, and a testcase that we fail because SetArrayElement isn't the right
> thing here (or isn't right, one or the other, probably both):
> 
>   var bigarr = [];
>   bigarr[5000] = 17;
>   Object.defineProperty(Object.prototype, 5000, { set: function() { throw
> 42; } });
>   assertEq([].concat(bigarr).length, 5001);

Good catch! Added test, now passing.
Attachment #809319 - Attachment is obsolete: true
Attachment #812240 - Flags: review?(jwalden+bmo)
Comment on attachment 812240 [details] [diff] [review]
spec_steps_Array_prototype_concat-v7.diff

Review of attachment 812240 [details] [diff] [review]:
-----------------------------------------------------------------

::: js/public/CallArgs.h
@@ +342,5 @@
> +     * Returns thisv and args in a single array. (That is, a pointer to |this|
> +     * followed by any provided arguments.)
> +     */
> +    mozilla::RangedPtr<const Value> thisAndArgs() const {
> +        return mozilla::RangedPtr<const Value>(this->thisv().address(), argc_ + 1);

Use |this->argv_ - 1|, please.

::: js/src/jit-test/tests/arrays/concat-overflow.js
@@ +9,5 @@
> +    assertEq(threw, false);
> +    assertEq(c[4294967295], 'a');
> +    assertEq(c.length, 0);
> +}
> +testOverflowOneSplat();

*splat*

@@ +63,5 @@
> + */
> +var count;
> +function MakeFunky() {
> +    var o = new Array();
> +    o.__defineGetter__("0", function() { count++; return 'a'; });

Object.defineProperty(o, "0", { get: ..., enumerable: true, configurable: true });

@@ +81,5 @@
> +    assertEq(threw, false);
> +    assertEq(c[4294967294], 'a');
> +    assertEq(c[4294967295], 'a');
> +    assertEq(c[4294967296], 'a');
> +    assertEq(count, 3);

assertEq(c.length, 4294967295);

@@ +95,5 @@
> +    } catch(e) {
> +      threw = true;
> +    }
> +    assertEq(threw, false);
> +    assertEq(c[0], 'a');

assertEq(c.length, 1);

::: js/src/jsarray.cpp
@@ +307,5 @@
>  
> +/**
> + * Calls [[DefineOwnProperty]] on the given object to create a writable,
> + * enumerable, configurable data property with the given value. Throw a
> + * TypeError if the definition would create a conflict.

Hmm.  I just realized the "throw if..." language, while true of this implementation, isn't helpful.  The method in [].concat *doesn't* "throw if...", in a certain sense.  It's specified to do nothing if there's a conflict, but the nature of the algorithm means there'll never be a conflict.

Maybe replace the last sentence with "The property must not already exist on the object." and change the name to DefineElementNoConflict (ES6 has a DefinePropertyNoConflict method, so this would be a nice parallel), to solve this?

@@ +2605,3 @@
>              return false;
> +        TryReuseArrayType(obj, arr);
> +        uint32_t len;

obj->as<ArrayObject>().length(), I think, maybe?

@@ +2616,2 @@
>              return false;
> +

Stray blank line.

@@ +2620,5 @@
> +    /* Step 5. */
> +    RootedObject elemObj(cx);
> +    RootedValue subElement(cx);
> +    for (; nitems > 0; --nitems, ++items) {
> +        HandleValue elem = HandleValue::fromMarkedLocation(&*items);

items.get()?
Attachment #812240 - Flags: review?(jwalden+bmo) → review+
(In reply to Jeff Walden [:Waldo] (remove +bmo to email) from comment #24)
> ::: js/src/jit-test/tests/arrays/concat-overflow.js
> @@ +9,5 @@
> > +    assertEq(threw, false);
> > +    assertEq(c[4294967295], 'a');
> > +    assertEq(c.length, 0);
> > +}
> > +testOverflowOneSplat();
> 
> *splat*

A pun. Internally concat is doing approximately |concat(...['a', 'b', 'c']);|.
 
> @@ +63,5 @@
> > + */
> > +var count;
> > +function MakeFunky() {
> > +    var o = new Array();
> > +    o.__defineGetter__("0", function() { count++; return 'a'; });
> 
> Object.defineProperty(o, "0", { get: ..., enumerable: true, configurable:
> true, writable: true });

Thanks! I wasn't sure which was more modern.
 
> ::: js/src/jsarray.cpp
> @@ +307,5 @@
> >  
> > +/**
> > + * Calls [[DefineOwnProperty]] on the given object to create a writable,
> > + * enumerable, configurable data property with the given value. Throw a
> > + * TypeError if the definition would create a conflict.
> 
> Hmm.  I just realized the "throw if..." language, while true of this
> implementation, isn't helpful.  The method in [].concat *doesn't* "throw
> if...", in a certain sense.  It's specified to do nothing if there's a
> conflict, but the nature of the algorithm means there'll never be a conflict.
> 
> Maybe replace the last sentence with "The property must not already exist on
> the object." and change the name to DefineElementNoConflict (ES6 has a
> DefinePropertyNoConflict method, so this would be a nice parallel), to solve
> this?

Makes sense.
 
> @@ +2605,3 @@
> >              return false;
> > +        TryReuseArrayType(obj, arr);
> > +        uint32_t len;
> 
> obj->as<ArrayObject>.length(), I think, maybe?

Oh, very nice.
 
> @@ +2620,5 @@
> > +    /* Step 5. */
> > +    RootedObject elemObj(cx);
> > +    RootedValue subElement(cx);
> > +    for (; nitems > 0; --nitems, ++items) {
> > +        HandleValue elem = HandleValue::fromMarkedLocation(&*items);
> 
> items.get()?

Returns a copy, sadly.

https://hg.mozilla.org/integration/mozilla-inbound/rev/e807ddf2965f
Comment on attachment 812240 [details] [diff] [review]
spec_steps_Array_prototype_concat-v7.diff

Review of attachment 812240 [details] [diff] [review]:
-----------------------------------------------------------------

::: js/src/jsarray.cpp
@@ +2593,5 @@
> +    /* Iterate the modified |this| and not the original. */
> +    args.setThis(ObjectValue(*obj));
> +
> +    /*
> +     * Step 2. This may alse inline the first iteration of Step 5 if it is

Nit: "alse"
Bug 299644 commentary suggests to me that the lack of length-setting in ES5 is a change from ES3 -- and probably not an intentional one, as ES6 I think adds it back.  Hmm.  We may need to do some work to figure out what the ES5 algorithm "should" be, with that seeming mistake fixed.
I concur. I should have re-run the reftests after making that change, not just the jit-tests. I'm more than a little disturbed that mochitest-oth depends on this behavior too.
Can we get this landed? Sorry for nagging but the array tests are really important for our Peacekeeper score...
Blocks: test262
See Also: → 1233642
The spec has changed from using ToUint32 to ToLength. The up-side is that index overflows now turn into properties in a sane fashion, which pretty much solves the issue that this bug was opened for. The downside is that converting an indexed array to properties for an array large enough to have an overflow is now incredibly slow, by spec. So it's hard to test that the behavior is actually "correct" now. In any case, I'm pretty sure the underlying problem is fixed, or at the very least, no longer of any practical relevance.
Status: ASSIGNED → RESOLVED
Closed: 8 years ago
Resolution: --- → WORKSFORME