Bug 1010527 Comment 50 Edit History

(In reply to Clemens Eisserer from comment #49)
> To be honest, I don't understand the issue - WebRender is OpenGL, WebGL too - why is it required to do a CPU readback at all, when a VRAM -> VRAM blit should do it?

WebGL runs in a sandboxed process and thus needs a way to share textures with WebRender (WR) to allow zero-copy compositing. It's basically the same problem as hardware-accelerated video decoding.

There is a method on traditional X11 which allows this, but it proved too buggy, mostly because of driver issues (bug 942302), and was thus never enabled by default. AFAIK Chrome does use it, but they also maintain a long list of driver bug workarounds - too much for the FF team to handle.

Last year, texture sharing via DMABUF was finally implemented - a modern mechanism that makes it feasible to enable this by default. However, DMABUF sharing is very hard to do on GLX, which is why it's only implemented for the EGL backend. EGL is used on Wayland, and we're also moving the X11 backend over to it - see bug 1677203.

So while we enable things step-by-step by default, currently you need the following options:
 - WebRender enabled (`gfx.webrender.all`)
 - use either the Wayland or X11/EGL backend (enabled by the env var `MOZ_ENABLE_WAYLAND=1` or `MOZ_X11_EGL=1`, respectively)
 - make sure `widget.dmabuf-webgl.enabled` is enabled (it should be by default)
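
The env-var part of the setup above could be wired into a small launcher script - a sketch only; the backend auto-detection via `WAYLAND_DISPLAY` and the commented-out `firefox` invocation are illustrative, not official tooling. The two prefs still have to be flipped in about:config, since env vars don't cover those:

```shell
#!/bin/sh
# Sketch: pick an EGL-capable backend before starting Firefox.
# gfx.webrender.all and widget.dmabuf-webgl.enabled must still be
# set in about:config separately.
if [ -n "${WAYLAND_DISPLAY:-}" ]; then
    export MOZ_ENABLE_WAYLAND=1   # native Wayland backend (uses EGL)
else
    export MOZ_X11_EGL=1          # X11 backend with EGL instead of GLX
fi
echo "MOZ_ENABLE_WAYLAND=${MOZ_ENABLE_WAYLAND:-0} MOZ_X11_EGL=${MOZ_X11_EGL:-0}"
# exec firefox "$@"   # uncomment to actually launch
```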

I hope we'll be able to roll all of these things out over the next couple of releases.

Furthermore, the proprietary NVIDIA driver does not yet implement DMABUF - but there appears to be an internal driver version which does; their devs have already opened MRs based on it (https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests/587). So they apparently plan to roll it out soonish.