Can't get 16-bit data using WebGL2 readPixels
Categories: Core :: Graphics: CanvasWebGL (defect)
Tracking: firefox83 fixed
People: ashley_c_mort (Reporter), jgilbert (Assignee)
Attachments: 2 files
User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101 Firefox/78.0
Steps to reproduce:
We have a 16-bit-per-channel image editor that runs on WebGL2. We create a 16-bit texture with texImage2D, then read the data back out as 16-bit. This works in Chrome but gives an error in Firefox.
WebGL warning: readPixels: Incompatible format or type.
Here is an example; check the browser console logs when it runs. Chrome will return an array of 16-bit values, which is what we'd expect, and Firefox will return all 0.
This is the smallest example I could make. In our app we are actually creating the texImage2D as:
R16UI, RED_INTEGER, UNSIGNED_SHORT
or
RGB16UI, RGB_INTEGER, UNSIGNED_SHORT
and reading back with readPixels as:
RGBA, UNSIGNED_SHORT
https://jsfiddle.net/mortac8/Lbgam4hv/22/
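A minimal sketch of the intended round trip, assuming a WebGL2 context `gl`; the helper names `readbackLength` and `readBack16` are illustrative, not from the fiddle. For an unsigned-integer color buffer the always-valid readback pair per the spec is RGBA_INTEGER/UNSIGNED_INT; RGBA_INTEGER/UNSIGNED_SHORT is only valid when IMPLEMENTATION_COLOR_READ_FORMAT/TYPE reports it:

```javascript
// Elements needed for an RGBA readback (channels per pixel * pixels).
function readbackLength(w, h, channels) {
  return w * h * channels;
}

// Upload a 16-bit unsigned-integer texture and read it back.
// `srcData` is a Uint16Array of w*h R16UI texels.
function readBack16(gl, w, h, srcData) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16UI, w, h, 0,
                gl.RED_INTEGER, gl.UNSIGNED_SHORT, srcData);

  // Integer textures are not filterable; NEAREST keeps the texture complete.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);

  // Attach to a framebuffer so the texture can be read back.
  const fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);

  // Guaranteed path: 32-bit unsigned readback, then narrow to 16-bit
  // (values fit, since the source texels are 16-bit).
  const wide = new Uint32Array(readbackLength(w, h, 4));
  gl.readPixels(0, 0, w, h, gl.RGBA_INTEGER, gl.UNSIGNED_INT, wide);
  return Uint16Array.from(wide);
}
```

Reading into a Uint16Array directly, as the fiddle does, depends on the implementation advertising UNSIGNED_SHORT as its color-read type.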
Actual results:
Firefox returned an array of all 0.
Expected results:
Actual 16-bit data should have been returned.
Reporter
Comment 1•4 years ago
Here is a better example of what I want to do, in a jsfiddle:
https://jsfiddle.net/cpm5fgzr/
let pix = new Uint16Array(size);
gl.readPixels(0, 0, w, h, gl.RGBA_INTEGER, gl.UNSIGNED_SHORT, pix);
WebGL warning: readPixels: Incompatible format or type.
Assignee
Comment 2•4 years ago
Assignee
Comment 3•4 years ago
I'm really surprised, but Chrome is allowing RGBA/UNSIGNED_SHORT reads from the RGBA8 backbuffer.
I tested it: Chrome reports IMPLEMENTATION_COLOR_READ_FORMAT/TYPE of RGBA/UNSIGNED_BYTE, yet accepts RGBA/UNSIGNED_SHORT anyway, which is out of spec.
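A portable page can query the implementation-chosen readback pair before calling readPixels instead of assuming one. A minimal sketch, assuming a WebGL2 context `gl` with the source framebuffer bound; the helper name `queryReadFormat` is illustrative:

```javascript
// Every framebuffer supports one implementation-chosen format/type pair
// for readPixels, in addition to the spec-guaranteed combination
// (RGBA/UNSIGNED_BYTE for normalized buffers, RGBA_INTEGER/UNSIGNED_INT
// for unsigned-integer buffers). Query it rather than assuming.
function queryReadFormat(gl) {
  return {
    format: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT),
    type: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE),
  };
}
```

If the reported pair is not RGBA/UNSIGNED_SHORT, a conformant implementation such as Firefox is entitled to reject that combination, which matches the behavior reported in this bug.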
Assignee
Comment 4•4 years ago
I am proposing a test for this in the WebGL conformance suite: https://github.com/KhronosGroup/WebGL/pull/3158
Comment 6•4 years ago
bugherder