Closed Bug 1336562 Opened 9 years ago Closed 6 years ago

readPixels fails with RGB buffer after webgl2 refactor

Categories

(Core :: Graphics: CanvasWebGL, defect, P3)

51 Branch
Unspecified
Linux
defect

Tracking


RESOLVED INVALID
Tracking Status
firefox51 --- affected
firefox54 --- affected

People

(Reporter: mhirsch, Unassigned)

Details

(Whiteboard: [gfx-noted])

Attachments

(1 file)

Attached file rgb_readPixels.html
User Agent: Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:51.0) Gecko/20100101 Firefox/51.0
Build ID: 20170126153126

Steps to reproduce:

In Firefox 50, calling gl.readPixels with the format argument set to gl.RGB, using a framebuffer object with a gl.RGB internal-format texture attached to one of the color attachments, would correctly fill the ArrayBuffer supplied to readPixels with RGB data. I have attached an example WebGL script that loads a texture, renders it to the screen, then renders it to an FBO, then calls readPixels on the FBO and prints the result to the console.

Actual results:

In Firefox 51, I get the error message "Error: WebGL: readPixels: Incompatible format or type." and the ArrayBuffer is filled with zeros. Note that in both Firefox 50 and 51, the attached code works as expected when the internal format of the texture backing the FBO is set to gl.RGBA and the readPixels call asks for a gl.RGBA buffer.

Expected results:

In Firefox 50 I would get this message: "Error: WebGL: texImage2D: Chosen format/type incurred an expensive reformat: 0x1907/0x1401"; however, the ArrayBuffer would contain the correct RGB-formatted data. I'm not sure what behavior to expect. I looked at the documentation: https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/readPixels

Though it doesn't explicitly say this case should work, it doesn't say it *shouldn't* work, and it seems a very reasonable thing to want to do. No matter how "expensive" the buffer conversion is for the GPU (or even if it's done in C++), won't it be more expensive to do that conversion in JavaScript on the client side in cases where I need RGB data? I assume this outcome is related to the refactor for webgl2: https://hg.mozilla.org/releases/mozilla-aurora/rev/2d2153ad43c9
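The reproduction described above can be sketched roughly as follows (the function name readBackRGB is mine, not from the attached file; this assumes a WebGL1 context and omits the actual draw of the textured quad):

```javascript
// Sketch of the failing path: render into an FBO backed by a gl.RGB texture,
// then read it back with gl.readPixels(..., gl.RGB, gl.UNSIGNED_BYTE, ...).
function readBackRGB(gl, width, height) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, width, height, 0,
                gl.RGB, gl.UNSIGNED_BYTE, null);

  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);

  // ... draw the textured quad into the FBO here ...

  const pixels = new Uint8Array(width * height * 3); // 3 bytes per RGB pixel
  gl.readPixels(0, 0, width, height, gl.RGB, gl.UNSIGNED_BYTE, pixels);
  return pixels; // stays all zeros in Firefox 51 when the combo is rejected
}
```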
OS: Unspecified → Linux
Tested on Ubuntu 16.04 and I can reproduce the error message "Error: WebGL: readPixels: Incompatible format or type."; the ArrayBuffer is filled with zeros. I tested with FF 51 release and FF Nightly 54.0a1 (2017-02-07).
Status: UNCONFIRMED → NEW
Component: Untriaged → Canvas: WebGL
Ever confirmed: true
Product: Firefox → Core
Priority: -- → P3
Whiteboard: [gfx-noted]
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED
Resolution: FIXED → WORKSFORME

What are you testing on? On Firefox 67 I still get the "Error: WebGL warning: readPixels: Incompatible format or type" message running the attached example.

Status: RESOLVED → REOPENED
Resolution: WORKSFORME → ---

Unfortunately, RGB/UNSIGNED_BYTE is not guaranteed to be an acceptable format and type combo.
For non-float textures the only acceptable formats are:

  • RGBA/UNSIGNED_BYTE
  • getParameter(IMPLEMENTATION_COLOR_READ_FORMAT)/getParameter(IMPLEMENTATION_COLOR_READ_TYPE)
    • (IMPLEMENTATION_COLOR_READ_FORMAT/TYPE may give you RGBA/UNSIGNED_BYTE anyway)

It looks like this testcase relies on RGB/UNSIGNED_BYTE being valid, but that's only true for some drivers, and may change with updates to either Firefox or graphics drivers.
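A portable app can check this before calling readPixels. Here is a minimal sketch (the helper name canReadPixels is mine; per the WebGL spec, RGBA/UNSIGNED_BYTE is always accepted for normalized fixed-point color buffers, and anything else must match the implementation-defined pair queried below for the currently bound read framebuffer):

```javascript
// Returns true if (format, type) is a guaranteed-readable combination for the
// currently bound read framebuffer: either the always-valid RGBA/UNSIGNED_BYTE
// pair, or the implementation-defined pair reported by the driver.
function canReadPixels(gl, format, type) {
  if (format === gl.RGBA && type === gl.UNSIGNED_BYTE) return true;
  const implFormat = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
  const implType = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
  return format === implFormat && type === implType;
}
```

If this returns false for RGB/UNSIGNED_BYTE, the portable fallback is to read RGBA/UNSIGNED_BYTE and drop every fourth byte on the JavaScript side.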

Status: REOPENED → RESOLVED
Closed: 6 years ago
Resolution: --- → INVALID

We see a similar issue using WebGL2.

We have a 16-bit per channel image editing app that runs on Chrome and we would like to add support for Firefox. However, in Firefox we can't get 16-bit R, RGB, RGBA (R16UI, RGB16UI, RGBA16UI) values from our context.

gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_SHORT, arrayBufView);
WebGL warning: readPixels: Incompatible format or type.

gl.readPixels(0, 0, width, height, gl.RGB, gl.UNSIGNED_SHORT, arrayBufView);
WebGL warning: readPixels: Incompatible format or type.

We would like to run our app in Firefox but Chrome seems to have much more flexibility in 'readPixels'.

What do these show on this machine for this FBO?

getParameter(IMPLEMENTATION_COLOR_READ_FORMAT)
getParameter(IMPLEMENTATION_COLOR_READ_TYPE)
Flags: needinfo?(ashley_c_mort)

gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT)
6408

gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE)
5121
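For reference, those two numeric answers decode to the standard GLenum names as follows (a small lookup, not from the thread):

```javascript
// Standard WebGL enum values for the two getParameter() results above.
const GL_ENUM_NAMES = {
  6408: "RGBA",           // 0x1908
  5121: "UNSIGNED_BYTE",  // 0x1401
};
console.log(GL_ENUM_NAMES[6408], GL_ENUM_NAMES[5121]); // → RGBA UNSIGNED_BYTE
```

So for this machine and FBO, the implementation-defined read pair is just RGBA/UNSIGNED_BYTE, which explains why the 16-bit reads are rejected.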

Flags: needinfo?(ashley_c_mort)

I have made a jsfiddle to help us: https://jsfiddle.net/prkxz4q7/2/

I get the following:

R8 -> RED / UNSIGNED_BYTE
RG8 -> RG / UNSIGNED_BYTE
RGB8 -> RGBA / UNSIGNED_BYTE
RGBA8 -> RGBA / UNSIGNED_BYTE
R16UI -> RED_INTEGER / UNSIGNED_SHORT
RG16UI -> RG_INTEGER / UNSIGNED_SHORT
RGB16UI -> - / -
RGBA16UI -> RGBA_INTEGER / UNSIGNED_SHORT

Is that what you get?

Flags: needinfo?(ashley_c_mort)

Yes, I get the same in Firefox and Chrome on that jsfiddle. I opened a bug a couple of days ago (https://bugzilla.mozilla.org/show_bug.cgi?id=1663214) with a very quick jsfiddle example, but I can make a better/more specific one if that would help.

rb_format -> IMPLEMENTATION_COLOR_READ_FORMAT / _TYPE
R8 -> RED / UNSIGNED_BYTE
RG8 -> RG / UNSIGNED_BYTE
RGB8 -> RGBA / UNSIGNED_BYTE
RGBA8 -> RGBA / UNSIGNED_BYTE
R16UI -> RED_INTEGER / UNSIGNED_SHORT
RG16UI -> RG_INTEGER / UNSIGNED_SHORT
RGB16UI -> - / -
RGBA16UI -> RGBA_INTEGER / UNSIGNED_SHORT

Flags: needinfo?(ashley_c_mort)

Based on that output, shouldn't I be able to do a readPixels call like this?
let pix = new Uint16Array(size);
gl.readPixels(0, 0, w, h, gl.RGBA_INTEGER, gl.UNSIGNED_SHORT, pix);

https://jsfiddle.net/cpm5fgzr/
WebGL warning: readPixels: Incompatible format or type.
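One thing worth double-checking independently of the format/type warning: the destination buffer must be sized for the channel count, so for a 4-channel RGBA_INTEGER read the `size` in the snippet above needs to be width * height * 4 elements. A minimal sketch (the helper is mine):

```javascript
// Required element count for a readPixels destination buffer:
// one element per channel per pixel (RGBA/RGBA_INTEGER reads write 4 per pixel).
function requiredElements(width, height, channels) {
  return width * height * channels;
}

const w = 16, h = 16;
const pix = new Uint16Array(requiredElements(w, h, 4)); // 1024 elements
```

An undersized buffer produces its own error, so ruling this out narrows the problem down to the format/type validation itself.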
