Closed Bug 621068 Opened 11 years ago Closed 11 years ago
Assertion failure: *(uint64*)&tm->storage->global()[globalSlots] == 0xdeadbeefdeadbeefLL
930 bytes, application/x-compressed-tar
42.85 KB, text/plain
2.85 KB, patch
7.67 KB, text/plain
1.66 KB, patch
I have a set of files that produces the assertion "Assertion failure: *(uint64*)&tm->storage->global()[globalSlots] == 0xdeadbeefdeadbeefLL, at jstracer.cpp:6524" in a shell debug build of mozilla-central trunk (parent: 59552:49a1b2aa43c5 tip). Unfortunately, I haven't been able to combine the test into a single file; it is highly fragile and consists of 7 files in total (it took me over a day to get it into its current state). The files use the load() shell function as well as gczeal(), so they might produce a problem that is invisible in the browser; I cannot tell. To reproduce, unpack the attached archive, change into the directory, and run "js -f min.js", which should trigger the assertion immediately. I've locked this for now as security-related, because Andreas Gal suggested that this *might* be an exploitable bug. Unlock if it is confirmed to be harmless.
Unpack this archive, change into the directory, and run "js -f min.js" to reproduce the bug.
config.status file taken from js/src/ directory
I added my config.status file, as this bug seems to behave differently depending on build options (my fuzzer has, for example, reproduced it both with and without "--enable-valgrind", but the tests do not seem to be interchangeable). I will try to reproduce this tomorrow on a 32-bit build (the repo is still being fetched).
Minusing pending security analysis: please renominate if this becomes sg:crit.
blocking2.0: ? → -
I tried to reproduce this on 32-bit and it failed so far. Either the bug is too fragile and the test would need modification on 32-bit, or it's some sort of 64-bit bug. If you need any additional information (core dump or other debugging information), let me know.
I'm loath to mark a bug with this pattern non-critical. It seems too hard to prove that it isn't a security problem, although it may have been transient and no longer reproducible.
blocking2.0: .x → ?
If it's helpful, I can still reproduce this assertion on rev 59552 of mc trunk. My fuzzer hits it a few times per day. I'll probably switch to tm tip today, if it appears again then I'll comment here. I guess the problem is that the testcases are highly fragile across architecture and compiler flags.
sg:critical blocks, although it pains me to gate release on something (apparently) so hard to reproduce.
blocking2.0: ? → betaN+
Whiteboard: [sg:critical] → [sg:critical][hardblocker]
I just started my fuzzer on TM trunk and hit the same assertion again. At least for the specified older mc revision I can reliably reproduce this on my test server and I am willing to provide access there or provide other information/traces if it helps :)
(In reply to comment #9)
> At least for the specified older mc revision I can reliably reproduce this on my test server

I correct myself: the test doesn't only work on the older revision but also on most current TM trunk (the test attached here as a tgz). It might require 64-bit, though.
Attachment #499421 - Attachment mime type: application/octet-stream → text/plain
I took a quick look at this. It reproduced easily on 64-bit and it's very debuggable. I'm still not sure exactly what's going on but this patch shows the proximate cause. See next comment.
This is the interesting part of the debugging patch output, annotated.

// Calling a trace. We created the deadbeef barrier at the end of the global slot array, according to the current number of slots in the global obj.
create deadbeef barrier at global slot 234 at min.js:47
// Here we visit some slots. If these are out of range, that's bad. If one is equal to the deadbeef barrier position, we'll assert on the check. In this case, everything is fine.
visitGlobalSlot 186
visitGlobalSlot 188
visitGlobalSlot 187
visitGlobalSlot 189
// Done running trace. Check the barrier. In this case, we pass.
checking deadbeef barrier at global slot 234
// Run another trace. This time we have 221 global slots.
create deadbeef barrier at global slot 221 at min.js:47
checking deadbeef barrier at global slot 221
// Same as last time.
create deadbeef barrier at global slot 221 at min.js:44
checking deadbeef barrier at global slot 221
// Now run a trace, but this time we'll do an inner tree as well.
create deadbeef barrier at global slot 221 at min.js:47
visitGlobalSlot 186
visitGlobalSlot 188
visitGlobalSlot 187
visitGlobalSlot 189
// The inner tree imports here. Now the global obj has 442 slots, and we import slot 221. This causes us to overwrite the deadbeef fence above, and we'll assert when we come out of these trees.
import global slot 221 (globalObj=0x101803048 globalObj->numSlots()=442
visitGlobalSlot 221
import global slot 189 (globalObj=0x101803048 globalObj->numSlots()=442
// Run the inner tree and crash.
create deadbeef barrier at global slot 442 at e.js:1
checking deadbeef barrier at global slot 442
visitGlobalSlot 221
visitGlobalSlot 189
checking deadbeef barrier at global slot 221
Assertion failure: *(uint64*)&tm->storage->global()[globalSlots] == 0xdeadbeefdeadbeefLL, at ../jstracer.cpp:6582
So, in running the tree at min.js:47, before we get to the tree in e.js, we increase the size of the global object, and then import one of those. It looks like by design we're not supposed to be able to import a slot in an inner tree that has a higher number than what the outer tree saw. I added more instrumentation and found that the global object's shape (note it is the same identity in all cases, as the output above shows) is 16 when we create the fence at slot 221, but the shape is 758 when we have 442 slots. Are we just failing to detect a global shape change (which throws us off trace, right)?
Thanks for looking into this, dmandelin! I tried to reproduce it yesterday on my Linux box and was unable to. Sounds like you've got further than I would have been able to anyway.
Yeah, it repros easily on my Mac for whatever reason. Anyway, dvander says it's just a partial assert botch, so stealing. Patch soon.
Assignee: nnethercote → dmandelin
Attachment #503379 - Flags: review?(dvander)
Attachment #503379 - Flags: review?(dvander) → review+
Brief note on the bug: the assertion should not run when there is a deep bail. So it was a bogus assert, and not a security flaw. I had to debug it to find that out, so might as well fix it.
Status: NEW → ASSIGNED
blocking2.0: betaN+ → .x
Status: ASSIGNED → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
Filter on qa-project-auto-change: Bug in removed tracer code, setting in-testsuite- flag.