Closed Bug 551771 Opened 13 years ago Closed 13 years ago
GLboolean args to WebGL calls always seen as true when called in a loop
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.3a3pre) Gecko/20100310 Minefield/3.7a3pre (.NET CLR 3.5.30729)
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.3a3pre) Gecko/20100310 Minefield/3.7a3pre (.NET CLR 3.5.30729)

When called in a loop, vertexAttribPointer always normalizes the data and colorMask enables all channels. When called outside a loop, both work as expected.

Reproducible: Always

Steps to Reproduce:
1. View the test case.
2. Toggle the checkboxes.

Actual Results:
With "loop" not checked: a red-black gradient is drawn when "normalize" is checked, purple-red when "normalize" is unchecked.
With "loop" checked: a yellow-green gradient, whether "normalize" is checked or unchecked.

Expected Results:
The red-black gradient should always be shown with "normalize" checked, and purple-red with it unchecked. The "loop" checkbox should have no visible effect.
with loop checked, correct results are shown for short periods of time during various operations: loading pages in other tabs, opening menus/dialogs (particularly for the first time), etc.
This sounds like a failure in the quickstubs for these methods; will look into it, thanks!
Assignee: nobody → vladimir
Status: UNCONFIRMED → NEW
Ever confirmed: true
See, the only special thing about GLboolean parameters is that they get cast from uint32 to uint8. No idea how this can give a bug that only appears with TraceMonkey!
When called from trace, the function that's called is nsICanvasRenderingContextWebGL_VertexAttribPointer_tn, not the one in comment 4. And if you look at that code, it has |jsval _arg3| in the function decl but:

  PRUint8 arg3 = (PRUint8) _arg3;

arg3 is the normalize arg, declared as GLboolean in the idl. Looks like we have |typedef octet GLboolean| in that same idl.

That cast to PRUint8 comes from the 'octet' case in traceableArgumentConversionTemplates in qsgen.py. That was added in bug 538258. The declaration of the argument as jsval comes from the fact that neither traceTypeMap nor traceReturnTypeMap has an 'octet' case, as far as I can tell.

So what happens is that you tell TraceMonkey you want a jsval (because the traceable native typemap that qsgen.py generates here uses 'jsval' for 'octet'), and it hands you one. |false| in JS becomes JSVAL_FALSE, which is a nonzero number. This is then just straight-cast to an integer, so you get a nonzero integer. Then you get normalization.

Now the good news is that once jsval becomes a struct in the near future, I would think that cast would fail to compile...

In any case, qsgen.py needs to be fixed to either generate some other typemap for 'octet' or to fix the conversion behavior.
Thank you for exploring a lot more than I could have! Fun that you found a different function; I got mine from GDB by setting a breakpoint in the VertexAttribPointer function in WebGLContextGL.cpp:2454. Anyway, we both found something very similar: a uint32 to uint8 conversion.
> I got mine from GDB by setting a breakpoint in the VertexAttribPointer function
> in WebGLContextGL.cpp:2454

On which iteration of the loop? The fastnative will be called on the iterations that aren't on trace yet; the traceable native will be called on trace.

> Anyway we find something very similar, a uint32 to uint8 conversion.

Not sure what you mean.
> On which iteration of the loop?

Oh OK, I hit my breakpoint upon the first loading of the test page, so not from the loop. So I guess it is irrelevant...

> Not sure what you mean.

The fact that you're asking means that I was talking nonsense about stuff that I don't know :-) Sorry!
So the thing to do would be to keep iterating through in gdb until the normalize parameter becomes incorrect (becomes false when it should be true, in this case) and then look at the stack trace and walk back up to see what went wrong.
Actually, looking at the tn:

  nsICanvasRenderingContextWebGL_VertexAttribPointer_tn(JSContext *cx, JSObject *obj,
      uint32 _arg0, int32 _arg1, uint32 _arg2, jsval _arg3, uint32 _arg4, uint32 _arg5)
  ...
  PRUint8 arg3 = (PRUint8) _arg3;

Note that _arg3 is declared as a *jsval* in the prototype. I bet it's being passed in as a jsval, too. Though then I'd be surprised that normalize could ever be 0, because JSVAL_FALSE is (0<<3)|6 = 6 and JSVAL_TRUE is (1<<3)|6 = 13, and it looks like the WebGL code passes that normalize flag straight into GL... so it's getting values "6" and "13", and who knows what's happening there.

(That latter bit is a bug; we should ensure that boolean values are really coerced to GL_TRUE or GL_FALSE.)
vlad, you did read comment 6, right? ;)
No, I didn't; managed to scroll right by it. But now I did via my email! OK, so maybe benoit should punt on this and I can talk to jorendorff to figure out who should own it (might be me).
The fix for this is pretty straightforward, though there are actually two fixes here. Either one would be sufficient to fix this bug, but they're both correct and will prevent any problems with octet in future quickstubs.
Attachment #447937 - Flags: review?
Attachment #447937 - Flags: review? → review?(jorendorff)
Attachment #447937 - Flags: review?(jorendorff) → review+
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
Well, the v0 patch would have worked fine, except for a GLboolean conflict on OSX. Using the GL types is silly anyway, so this is the same as the previous patch, except it also renames all of the GL type names to WebGL* in the idl and the implementation to avoid conflicts with OSX.
Attachment #448093 - Flags: review?(joe)
Attachment #448093 - Flags: review?(joe) → review+
v1 checked in.