Closed Bug 986873 Opened 11 years ago Closed 6 years ago

WebGL GL_STENCIL_BITS = 0 on Firefox/Chrome

Categories

(Core :: Graphics: CanvasWebGL, defect)

Version: 27 Branch
Hardware: x86 macOS
Type: defect
Priority: Not set
Severity: normal

Tracking

Status: RESOLVED INVALID

People

(Reporter: alec, Unassigned)

Details

User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.146 Safari/537.36

Steps to reproduce:

With a WebGL context bound, querying gl.STENCIL_BITS always returns 0. This makes it difficult to distinguish Firefox/Chrome from IE11, which has no stencil support as of WebGL 0.93. This happens whether the browser has depth-texture extension support enabled or not.

    var stencilBits = gl.getParameter(gl.STENCIL_BITS);

Actual results:

Returns 0.

Expected results:

Returns 8.
I found this thread after posting the bug, and thought my interpretation of STENCIL_BITS might be incorrect: https://bugzilla.mozilla.org/show_bug.cgi?id=648883

On further exploration, this still looks like a bug. At startup, even without allocating a default depth target, gl.DEPTH_BITS returns 24 but gl.STENCIL_BITS returns 0. That means gl.DEPTH_BITS isn't tied to the current FBO, and neither should gl.STENCIL_BITS be. There is already gl.RENDERBUFFER_STENCIL_SIZE (and a depth counterpart) for per-renderbuffer queries, and it feels like gl.STENCIL_BITS has been mistaken for the former on Chrome/Firefox.
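To illustrate the distinction I mean, here's a rough sketch (the canvas id, dimensions, and variable names are just for illustration):

    var gl = document.getElementById("c").getContext("webgl");

    // STENCIL_BITS describes the currently bound drawing surface.
    console.log("STENCIL_BITS:", gl.getParameter(gl.STENCIL_BITS));

    // RENDERBUFFER_STENCIL_SIZE describes one specific renderbuffer.
    var rb = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, rb);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.STENCIL_INDEX8, 256, 256);
    console.log("RENDERBUFFER_STENCIL_SIZE:",
                gl.getRenderbufferParameter(gl.RENDERBUFFER, gl.RENDERBUFFER_STENCIL_SIZE));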
Here's what a native OpenGL app reports for the very same quantities (with no FBO depth or stencil allocated), using those same constants:

    texDepthBits = 32
    texStencilBits = 8
It looks like gl.DEPTH_BITS is also incorrect. It returns 24 when it should return 32. WebGL supports D32, so 32 is the correct "max bits" for depth, not 24. If only D24S8 were supported, then 24 would be the correct value, but on my system both are supported.
From the spec, it looks like gl.DEPTH_BITS and gl.STENCIL_BITS are tied to the currently bound FBO and reflect the bits used by each of its channels. This applies to both textures and renderbuffers bound to the FBO, whereas gl.RENDERBUFFER_DEPTH_SIZE and gl.RENDERBUFFER_STENCIL_SIZE only apply to renderbuffers.

The difference in behavior I'm seeing is when no FBO is bound: on Native OpenGL my device reports 32/8, but on WebGL I see 24/0. RED/GREEN/BLUE/ALPHA would return 32 (for 32f). It feels like returning the max bits would be more helpful to callers, especially since WebGL supports D32. IE11 could then return gl.STENCIL_BITS of 0 to indicate that it still does not have stencil support as of WebGL 0.93.

This would also help with web stats tracking of device max capabilities. Apps could pull stats from a freshly initialized context (see the sketch below), without having to resort to allocating various FBOs and textures.
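As a rough sketch of the kind of capability probe I have in mind (a throwaway canvas; only the standard context-creation attributes are assumed):

    // Ask for depth and stencil explicitly, then read back what we got.
    var canvas = document.createElement("canvas");
    var gl = canvas.getContext("webgl", { depth: true, stencil: true });
    if (gl) {
        var depthBits   = gl.getParameter(gl.DEPTH_BITS);
        var stencilBits = gl.getParameter(gl.STENCIL_BITS);
        // stencilBits === 0 here would indicate no stencil support at all
        // (e.g. IE11 as of WebGL 0.93), since stencil:true was requested.
        console.log("depth:", depthBits, "stencil:", stencilBits);
    }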
Component: Untriaged → Canvas: WebGL
Product: Firefox → Core

This all sounds correct.
A default WebGL context is created with depth:true, stencil:false. This means that by default DEPTH_BITS will return either 24 or 32, and STENCIL_BITS must return 0 (unless the context is created with stencil:true).
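A minimal sketch of what that looks like from script (canvas1/canvas2 are assumed canvas elements):

    // Default attributes: depth:true, stencil:false -> STENCIL_BITS is 0 by spec.
    var gl1 = canvas1.getContext("webgl");
    gl1.getParameter(gl1.STENCIL_BITS); // 0

    // Request a stencil buffer and the query reflects it.
    var gl2 = canvas2.getContext("webgl", { stencil: true });
    gl2.getParameter(gl2.STENCIL_BITS); // typically 8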
I believe we have good conformance tests for this, as well.

Thanks for the report though!

Status: UNCONFIRMED → RESOLVED
Closed: 6 years ago
Resolution: --- → INVALID