WebGL renderer info reports wrong GPU
Categories
(Core :: Graphics: CanvasWebGL, defect, P3)
Tracking

Tracking | Status
---|---
firefox-esr78 | unaffected
firefox-esr91 | affected
firefox93 | wontfix
firefox94 | wontfix
People
(Reporter: github, Unassigned)
References
(Regression)
Details
(Keywords: regression)
User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/93.0.4577.63 Safari/537.36 Edg/93.0.961.38
Steps to reproduce:
When detecting the GPU from a WebGL context in Firefox, the reported GPU is not the actual GPU. (In this case a GPU from 2013 is reported instead of one from 2019.)
To reproduce this, execute the following script:
const canvas = document.createElement('canvas');
let gl;
let debugInfo;
let vendor;
let renderer;
try {
  gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
} catch (e) {
}
if (gl) {
  debugInfo = gl.getExtension('WEBGL_debug_renderer_info');
  if (debugInfo) { // the extension may be unavailable
    vendor = gl.getParameter(debugInfo.UNMASKED_VENDOR_WEBGL);
    renderer = gl.getParameter(debugInfo.UNMASKED_RENDERER_WEBGL);
  }
}
console.log('vendor:', vendor);
console.log('renderer:', renderer);
Actual results:
Console output:
vendor: ATI Technologies Inc.
renderer: Radeon R9 200 Series
Expected results:
Console output (from chromium):
vendor: ATI Technologies Inc.
renderer: AMD Radeon Pro 5500M OpenGL Engine
Comment 1•3 years ago
The Bugbug bot thinks this bug should belong to the 'Core::Canvas: WebGL' component, and is moving the bug to that component. Please revert this change if you think the bot is wrong.
Output from another device:
Chromium: ANGLE (NVIDIA Corporation, Quadro T2000 with Max-Q Design/PCIe/SSE2, OpenGL 4.5.0 NVIDIA 460.91.03)
Firefox (92): GeForce GTX 980/PCIe/SSE2
Comment 3•3 years ago
This can be observed at https://webglreport.com/ as well.
Chrome: ANGLE (NVIDIA, NVIDIA GeForce GTX 1070 Ti Direct3D11 vs_5_0 ps_5_0, D3D11-30.0.14.7196)
Firefox: ANGLE (NVIDIA GeForce GTX 980 Direct3D11 vs_5_0 ps_5_0)
Comment 4•3 years ago
mozregression run:
Bug 1715690 - Generalize WebGL RENDERER into large buckets. r=lsalzman
- Minor reduction in unused flexibility of limits.
Differential Revision: https://phabricator.services.mozilla.com/D117385
Updated•3 years ago
Comment 6•3 years ago
This was a deliberate anti-fingerprinting change in bug 1715690.
Do you need to identify a user's GPU more precisely than this?
Comment 7•3 years ago
Worth noting that about:support continues to show the actual GPU info, for debugging purposes!
Updated•3 years ago
It is useful to know, at least roughly, from a WebGL context what GPU the user is running.
Use case: allocating (video) memory based on the hardware.
With the information reported for an ATI GPU this is quite tricky, if not impossible, since there is a huge variety of GPUs in the 'Radeon R9 200 Series'.
The actual GPU in question has 8GB of video memory, while the R9 200 Series has anything from 1GB to 8GB, and the real amount is unknown in the WebGL context.
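The allocation heuristic described above can be sketched as a pure function. The bucket names and budget numbers below are illustrative assumptions for this discussion, not values any browser guarantees:

```javascript
// Hypothetical sketch: pick a conservative VRAM budget (in MB) from the
// sanitized RENDERER string. The buckets and numbers are assumptions
// for illustration only -- no browser guarantees these mappings.
function conservativeVramBudgetMB(renderer) {
  const r = (renderer || '').toLowerCase();
  // Firefox's AMD bucket spans cards with 1GB to 8GB of VRAM, so only
  // a fraction of the series minimum is safe to assume.
  if (r.includes('radeon r9 200')) return 512;
  // Firefox's Nvidia bucket (GTX 980) implies at least 4GB on the
  // underlying hardware, so a larger budget is defensible.
  if (r.includes('geforce gtx 980')) return 2048;
  // Unknown renderer string: fall back to a very small budget.
  return 256;
}

console.log(conservativeVramBudgetMB('Radeon R9 200 Series'));      // 512
console.log(conservativeVramBudgetMB('GeForce GTX 980/PCIe/SSE2')); // 2048
```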
Comment 9•3 years ago
Do you currently make different choices based on the GPU device we tell you about?
Reporter
Comment 10•3 years ago
Yes, that's why this bug was reported: the information given by Firefox for certain GPUs is inaccurate or false, and the application (webpage) could perform better and handle more if it were told what the underlying hardware is capable of.
Comment 11•3 years ago
Apologies if I'm hijacking. Could the sanitizing perhaps be tied to the Fingerprinters tracking protection option under Settings -> Privacy & Security? I've noticed that the renderer is still sanitized even when the website doing the query is added as an exception (i.e. enhanced tracking protection disabled). A cursory glance at the code makes me guess the sanitizing is hard-coded for the renderer, not tied to any setting or pref.
Currently the Unmasked Renderer can be revealed using the pref webgl.sanitize-unmasked-renderer=false.
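To observe the effect of the sanitization (and of flipping that pref), a page can read both the regular RENDERER string and the "unmasked" one side by side. This is a small sketch, not code from the bug; the helper name is made up:

```javascript
// Sketch: return both the regular RENDERER string and the "unmasked"
// renderer reported via the WEBGL_debug_renderer_info extension, so the
// two can be compared before and after changing the pref.
function rendererStrings(gl) {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  return {
    renderer: gl.getParameter(gl.RENDERER),
    // The debug extension may be unavailable; report null in that case.
    unmaskedRenderer: ext ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) : null,
  };
}

// Usage in a page:
//   const gl = document.createElement('canvas').getContext('webgl');
//   if (gl) console.log(rendererStrings(gl));
```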
Comment 12•3 years ago
(In reply to nopor from comment #10)
Yes, that's why this bug was reported: the information given by Firefox for certain GPUs is inaccurate or false, and the application (webpage) could perform better and handle more if it were told what the underlying hardware is capable of.
Part of the reasoning behind this change is that, in practice, websites don't make perf decisions based on the RENDERER string, particularly since there are so many different possibilities. While there are always theoretical use cases, I particularly want to know about cases where people today actually rely on this data. Is there something you currently rely on here, or was this just a very surprising change to you?
Comment 13•3 years ago
(In reply to Gareth Perks [pexxie] from comment #11)
Apologies if I'm hijacking. Could the sanitizing perhaps be tied to the Fingerprinters tracking protection option under Settings -> Privacy & Security? I've noticed that the renderer is still sanitized even when the website doing the query is added as an exception (i.e. enhanced tracking protection disabled). A cursory glance at the code makes me guess the sanitizing is hard-coded for the renderer, not tied to any setting or pref.
Currently the Unmasked Renderer can be revealed using the pref webgl.sanitize-unmasked-renderer=false.
Historically, RENDERER was "Mozilla" for us, and similar in other browsers. It was the advent of having renderer sanitization that allowed us to change from the dummy value "Mozilla" to the useful-but-now-safer sanitized renderer string.
I don't expect to return the unmasked renderer to RENDERER for now.
Reporter
Comment 14•3 years ago
(In reply to Jeff Gilbert [:jgilbert] from comment #12)
(In reply to nopor from comment #10)
Yes, that's why this bug was reported: the information given by Firefox for certain GPUs is inaccurate or false, and the application (webpage) could perform better and handle more if it were told what the underlying hardware is capable of.
Part of the reasoning behind this change is that, in practice, websites don't make perf decisions based on the RENDERER string, particularly since there are so many different possibilities. While there are always theoretical use cases, I particularly want to know about cases where people today actually rely on this data. Is there something you currently rely on here, or was this just a very surprising change to you?
Yes, we are relying on the unmasked renderer string to determine how much memory we can safely allocate without crashing the application. The actual memory would help us as well, though that isn't exposed in any browser anyway.
The surprise here is/was that the unmasked renderer string is so different compared to other browsers. Defaulting to 'Radeon R9 200 Series' in most cases is, as I previously stated, very inaccurate and doesn't help in determining what the system is capable of, especially since the memory of the GPUs in that series varies from 1GB to 8GB.
So in theory we could only 'safely' assume that allocating about 512-768MB is okay.
Whereas with an Nvidia GPU, where most newer GPUs default to a GTX 980 (so at least 4GB of memory), the 'brackets' are defined in a better way, because they have a smallest denominator an application can work with. The AMD buckets in that case do not.
Comment 15•3 years ago
Set release status flags based on info from the regressing bug 1715690
Comment 16•3 years ago
The severity field is not set for this bug.
:jgilbert, could you have a look please?
For more information, please visit auto_nag documentation.
Updated•3 years ago
Comment 17•3 years ago
(In reply to nopor from comment #14)
Yes, we are relying on the unmasked renderer string to determine how much memory we can safely allocate without crashing the application. The actual memory would help us as well, though that isn't exposed in any browser anyway.
The surprise here is/was that the unmasked renderer string is so different compared to other browsers. Defaulting to 'Radeon R9 200 Series' in most cases is, as I previously stated, very inaccurate and doesn't help in determining what the system is capable of, especially since the memory of the GPUs in that series varies from 1GB to 8GB.
So in theory we could only 'safely' assume that allocating about 512-768MB is okay.
And what if the user indeed has a Radeon R9 200 Series, which according to you is unreliable for guessing memory size? This string was masked because it was assumed to be too unreliable to be useful, and you have provided good evidence for that. The proper solution may therefore be to introduce a reliable metric instead of sticking to something known to be flawed.
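One direction for such a "reliable metric", sketched here as an assumption rather than anything proposed in the bug, is to query capability limits that the WebGL context itself reports instead of parsing the RENDERER string. These limits don't reveal VRAM size, but they are real per-context values:

```javascript
// Sketch: read real, standard WebGL capability limits from a context.
// These enums (MAX_TEXTURE_SIZE, etc.) are part of the WebGL 1 API;
// the helper function itself is hypothetical.
function contextLimits(gl) {
  return {
    maxTextureSize: gl.getParameter(gl.MAX_TEXTURE_SIZE),
    maxRenderbufferSize: gl.getParameter(gl.MAX_RENDERBUFFER_SIZE),
    maxCombinedTextureUnits: gl.getParameter(gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS),
  };
}

// Usage in a page:
//   const gl = document.createElement('canvas').getContext('webgl');
//   if (gl) console.log(contextLimits(gl));
```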
Comment 18•2 years ago
The bug has a release status flag that shows some version of Firefox is affected, thus it will be considered confirmed.