Closed Bug 1202387 Opened 9 years ago Closed 5 years ago

WebGL depth buffer ends up being 16-bit instead of 24-bit when passing { antialias: false }

Categories: Core :: Graphics: CanvasWebGL, defect, P3
Version: 40 Branch
Status: RESOLVED WORKSFORME
Reporter: eliseegw (Unassigned)
Keywords: regression, testcase; Whiteboard: [gfx-noted]
Attachments: 1 file

User Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.85 Safari/537.36

Steps to reproduce:

Initialize a WebGL rendering context with `canvas.getContext("webgl", { antialias: false });`.

Look at depth buffer size with `gl.getParameter(gl.DEPTH_BITS);` right away and then later.

See attached test case. The { antialias: false } bit is important to trigger the bug.
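The steps above can be sketched as follows (a minimal illustrative sketch, not the attached test case itself; `depthBitsStable` is a hypothetical helper, and the 1000 ms delay is the threshold the report says was always enough to observe the change):

```javascript
// Pure helper: the bug is that two readings of DEPTH_BITS disagree
// (24 right after context creation, 16 a moment later).
function depthBitsStable(immediateBits, laterBits) {
  return immediateBits === laterBits;
}

// Browser-only part is guarded so the helper can run anywhere.
if (typeof document !== "undefined") {
  var canvas = document.createElement("canvas");
  var gl = canvas.getContext("webgl", { antialias: false });
  if (gl) {
    var first = gl.getParameter(gl.DEPTH_BITS);
    setTimeout(function () {
      var later = gl.getParameter(gl.DEPTH_BITS);
      console.log("stable:", depthBitsStable(first, later), first, later);
    }, 1000);
  }
}
```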


Actual results:

Looking at the depth buffer size right away with `gl.getParameter(gl.DEPTH_BITS)`, notice on desktop platforms it's (almost always) 24-bit.

After a very short time (100ms isn't always enough but 1000ms is always enough on my machine), calling `gl.getParameter(gl.DEPTH_BITS)` again returns 16.

The underlying depth buffer looks like it is indeed 16-bit considering I'm getting terrible z-fighting issues in Firefox that I don't get in Chrome (which returns 24 always).
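The z-fighting is consistent with the precision loss: a 16-bit depth buffer has only 2^16 = 65,536 distinct depth values, versus about 16.7 million for 24-bit. A quick back-of-the-envelope check:

```javascript
// Number of distinct depth values for a given depth buffer bit width.
function depthSteps(bits) {
  return Math.pow(2, bits);
}

console.log(depthSteps(16)); // 65536
console.log(depthSteps(24)); // 16777216
// 16-bit has 256x fewer steps, so nearby surfaces are far more likely
// to quantize to the same depth value and z-fight.
```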


Expected results:

`gl.getParameter(gl.DEPTH_BITS)` should not change its return value. Older versions of Firefox used to return 24 all the time and actually provided a 24-bit depth buffer with { antialias: false }.

I'm not sure when the regression started but it might be Firefox 40. Happens in latest Nightly too - 43.0a1 (2015-09-06).

Are there portable ZIPs of older releases around so I can test with Fx 39 or 38?

Regression range:
https://hg.mozilla.org/integration/mozilla-inbound/pushloghtml?fromchange=292fbdb78dde&tochange=79637e9bcdaa

I can only reproduce the issue with FF42+.

What's your GPU, eliseegw?
ANGLE with D3D11 is probably enabled on your machine.
Blocks: 1183341
Component: Untriaged → Canvas: WebGL
Flags: needinfo?(eliseegw)
Keywords: regression, testcase
Product: Firefox → Core
Flags: needinfo?(jmuizelaar)
I've got an ATI Radeon HD 4800 but I know the issue also happens on a colleague's laptop with an NVIDIA GeForce 9600M GS.

I asked around on Twitter and got 6 people with the same issue out of 10 replies: https://twitter.com/elisee/status/640849626938122240
Flags: needinfo?(eliseegw)
Bug 1191042 may help here.
Flags: needinfo?(jmuizelaar)
(In reply to Jeff Muizelaar [:jrmuizel] from comment #3)
> Bug 1191042 may help here.

It does not seem to. It looks like ANGLE doesn't support OES_depth24 and so we only ask it for 16 bit depth formats. It does support DEPTH24_STENCIL8 though, so we end up getting 24 bit depth some of the time.
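The fallback described in that comment can be sketched like this (illustrative only; the real selection happens in Gecko's C++ code, and only the two GLES extension names mentioned here are taken from the comment):

```javascript
// Sketch of the depth-format choice described above. On GLES2,
// DEPTH_COMPONENT24 requires OES_depth24, and the packed 24/8 format
// (DEPTH24_STENCIL8) requires OES_packed_depth_stencil. If ANGLE lacks
// OES_depth24, a no-stencil context falls back to 16-bit depth.
function chooseDepthFormat(extensions, wantStencil) {
  if (wantStencil && extensions.indexOf("OES_packed_depth_stencil") !== -1) {
    return "DEPTH24_STENCIL8"; // 24-bit depth via the packed format
  }
  if (extensions.indexOf("OES_depth24") !== -1) {
    return "DEPTH_COMPONENT24";
  }
  return "DEPTH_COMPONENT16"; // the only guaranteed GLES2 depth format
}
```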
> It looks like ANGLE doesn't support OES_depth24 and so we only ask it for 16 bit depth formats.

I don't know how similar the ANGLE versions shipped with Firefox and Chrome are but for what it's worth, I do get a 24-bit depth buffer on Chrome on the same computer.
> It does support DEPTH24_STENCIL8 though, so we end up getting 24 bit depth some of the time.

Can confirm that passing `{ antialias: false, stencil: true }` always returns a 24-bit depth buffer on my machine.
Whiteboard: [gfx-noted]
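For reference, the workaround confirmed in the comment above looks like this (a minimal sketch; the browser-only part is guarded):

```javascript
// Workaround from the comment above: also requesting a stencil buffer
// steers Firefox to the packed DEPTH24_STENCIL8 format, which yields
// 24 depth bits even with antialias: false.
var attributes = { antialias: false, stencil: true };

if (typeof document !== "undefined") {
  var canvas = document.createElement("canvas");
  var gl = canvas.getContext("webgl", attributes);
  if (gl) {
    console.log("depth bits:", gl.getParameter(gl.DEPTH_BITS));
  }
}
```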
Reproducible (webgl-depth-16-test-case.html)
Version 	47.0.1
Build ID 	20160623154057
User Agent 	Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0
Firefox 47.0.1 displays:
Those two numbers should be the same: 16 and (wait for it) 16
while Chrome displays:
Those two numbers should be the same: 24 and (wait for it) 24
Status: UNCONFIRMED → NEW
Ever confirmed: true
I found this bug after already creating my own test case. On Windows I always get a 16-bit depth buffer (from the start, never 24-bit), regardless of antialias: false. This happens with Firefox 56.0.1 on Windows 10 with a GeForce GTX 550 Ti. It does _not_ happen with Firefox on Linux on the same computer, where I get a 24-bit depth buffer. I also get 24 bits with other browsers (Chrome and Edge on Windows, Chromium on Linux).

Sorry, I can't figure out how to attach files on comments here, so I'm posting my whole test-case:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
<html>
<head> 
	<title>Testing WebGL1</title> 
	<meta http-equiv="content-type" content="text/html; charset=UTF-8">
	<meta http-equiv="cache-control" content="max-age=0" />	
	<meta http-equiv="cache-control" content="no-cache">
	<meta http-equiv="expires" content="0">
	<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
	<meta http-equiv="pragma" content="no-cache">
</head>
<body>
	<canvas id = "gl" width = "640" height = "480"></canvas>
		
<script type="text/javascript">
	window.onload = InitializeWebGL;

	// Get WebGL context; if the standard name is not available, fall back on alternatives
	function GetWebGLContext( canvas )	{
		// Experimenting with different attributes, but found none that 
		// let Firefox on Windows use a larger depth buffer.
		var attributes = {  };
		//var attributes = { alpha: false, antialias: false, stencil: false, depth: true, premultipliedAlpha: false };
		return canvas.getContext("webgl", attributes) ||               
			canvas.getContext("experimental-webgl", attributes) ||   
			canvas.getContext("moz-webgl", attributes) ||            
			canvas.getContext("webkit-3d", attributes);              
	}

	function InitializeWebGL() {       
	    var canvas = document.getElementById("gl"); 
          
	    if (window.WebGLRenderingContext) { 
			var gl = GetWebGLContext( canvas );
			if ( gl ) {
				console.log("WebGL is initialized.");

				// Ensure OpenGL viewport is resized to match canvas dimensions
				gl.viewportWidth = canvas.width;
				gl.viewportHeight = canvas.height;

				console.log( gl );
				console.dir(gl.getContextAttributes()); 
				console.info('Depth buffer bits: %s', gl.getParameter(gl.DEPTH_BITS)); 
				console.info('Stencil bits: %s', gl.getParameter(gl.STENCIL_BITS)); 

				gl.clearColor(0.5, 0.5, 0.5, 1.0);
				gl.clear(gl.COLOR_BUFFER_BIT);
				gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
			}
			else
				console.log("WebGL is supported, but context creation failed (it may be disabled).");
		}
		else
			console.log("Your browser doesn't support WebGL.");	
	}
            
</script>					
</body>
</html>

I believe this was fixed long ago, but please let me know if it's still broken for you!

Status: NEW → RESOLVED
Closed: 5 years ago
Resolution: --- → FIXED
Resolution: FIXED → WORKSFORME

I've tested it with the same system as back then and it now works correctly (Firefox 67.0.1). So yes, it looks like it got fixed at some point, thanks!

Thank you for checking!
