Open Bug 1339842 Opened 8 years ago Updated 2 years ago

[webvr] WebVR and WebGL should force to discrete GPU

Categories

(Core :: WebVR, defect, P5)


Tracking


Tracking Status
firefox54 --- affected

People

(Reporter: larsberg, Unassigned)

Details

(Whiteboard: [gfx-noted])

Hardware: Windows 10 on a Razer Blade 14" with Nvidia 1060 mobile connected to an HTC Vive. By default, Firefox will load in integrated GPU mode for this device. Even when loading WebGL / WebVR content, we are not forced into discrete GPU mode. This is a rough experience for users, as they will see the content in-browser but it will just silently fail to render to the headset. The only indication as to what has gone wrong is a line about "rendering to the wrong device" in the text log files of the Steam VR client. Can we either force discrete or warn the user if we detect this situation?
Indeed, it appears that my confusion is because we only have a fix for this for WebGL on OSX. That is, whenever we create a WebGL context on OSX we do this: https://dxr.mozilla.org/mozilla-central/source/dom/canvas/WebGLContext.h#2035 And that creates a C++ object (reused from Chromium) that ensures we force discrete GPU mode: https://dxr.mozilla.org/mozilla-central/source/gfx/gl/ForceDiscreteGPUHelperCGL.h So I guess the question is whether we can do something similar on Windows. It seems a shame to do what we're doing right now, which is to tell the user to set Firefox to always use the discrete GPU.
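For context, the ForceDiscreteGPUHelperCGL trick boils down to holding a CGL pixel format that omits kCGLPFAAllowOfflineRenderers, which keeps the discrete GPU online for as long as the object is alive. A minimal sketch of that idea (not the actual helper, just the mechanism it relies on):

```cpp
// Sketch of the macOS approach: choosing a CGL pixel format *without*
// kCGLPFAAllowOfflineRenderers forces the discrete ("online") GPU to be
// brought up, and it stays up while the pixel format object is alive.
#include <OpenGL/OpenGL.h>

class ForceDiscreteGPU {
public:
  ForceDiscreteGPU() {
    // Empty attribute list: no kCGLPFAAllowOfflineRenderers, so the
    // discrete renderer must be made available.
    CGLPixelFormatAttribute attribs[] = { (CGLPixelFormatAttribute)0 };
    GLint numFormats = 0;
    CGLChoosePixelFormat(attribs, &mPixelFormat, &numFormats);
  }
  ~ForceDiscreteGPU() {
    if (mPixelFormat) {
      CGLReleasePixelFormat(mPixelFormat);  // lets the GPU switch back
    }
  }
private:
  CGLPixelFormatObj mPixelFormat = nullptr;
};
```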
Hrm, I don't see anything that we can do to dynamically switch it on Windows: http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf
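For what it's worth, the closest thing that document describes is a load-time hint rather than a dynamic switch: exporting NvOptimusEnablement from the executable (with an AMD PowerXpress counterpart) asks the driver to put the whole process on the discrete GPU from startup. A sketch of that mechanism, noting it would apply to all of Firefox, not just VR/WebGL content:

```cpp
// Sketch of the static, process-wide hints described in the Optimus document
// (and AMD's PowerXpress equivalent). These exports must live in the .exe
// itself and are read by the driver at process launch, so they cannot be
// toggled when a headset is plugged in later; they put the entire process on
// the discrete GPU.
extern "C" {
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```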
The dynamic GPU switching mechanism is different between the OSX and Windows platforms. On Mac OSX the hardware has an internal switching mechanism, so we can make the discrete GPU render to the screen. On Windows with NVidia Optimus, on the other hand, the discrete GPU doesn't connect to the screen. When we "switch" to the discrete GPU on these platforms, the driver renders on the discrete GPU and copies the result to the integrated GPU for output to the screen, so the output device is still the integrated GPU. It was quite confusing when we got an identical laptop and ran the SteamVR benchmark: the score was quite good, but the output device was still the Intel HD 530.
If this was possible, we would need to first qualify both GPUs at startup (check for blocklisting, etc.), not just act on it because it's available. There are some systems that explicitly disallow Firefox on the discrete GPU (unless you rename firefox.exe to something else). Some systems have the ability to force a particular GPU for a particular application, and we could inform users of this if we detect that they are on a dual-GPU system and currently using Intel. That leaves us with dual-AMD systems, where we would either have to maintain a database to determine whether they're on the integrated GPU or not, or do some other kind of check. Perhaps what Michael mentions in comment 3 could somehow be used for this.
Priority: -- → P5
Whiteboard: [gfx-noted]
Thanks for the clarifications! It sounds like we should probably detect when the user attempts to enable the headset while we're stuck in integrated GPU output mode, and inform them. I'd definitely be worried about just always starting up in discrete mode at launch on Windows if, say, VR is enabled (especially since that flag will be riding the train shortly). :kip, what are your thoughts?
Flags: needinfo?(kgilbert)
I'd add the "if GPU#1 is 8086 and GPU#2 exists" check, inform the user when they first try to use VR that they should probably switch, and point them to a SUMO page with the details. For example, if GPU#2 is Nvidia, the page would tell them to open the Nvidia Control Panel, go to Manage 3D Settings, click on the Program Settings tab, find Firefox, choose "High-performance NVIDIA Processor" as the preferred graphics processor, and restart Firefox. With a "don't show this again" preference, of course :)
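A rough sketch of how that check might look on Windows, assuming DXGI adapter enumeration and treating vendor ID 0x8086 as Intel; the function name and the wiring into our gfx info/blocklist code are hypothetical:

```cpp
// Hedged sketch: detect "the primary adapter is Intel (0x8086) and a second
// hardware GPU exists", i.e. the situation where we would show the SUMO prompt.
#include <dxgi.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

bool ShouldWarnAboutIntegratedGPU() {
  ComPtr<IDXGIFactory1> factory;
  if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) {
    return false;
  }

  std::vector<DXGI_ADAPTER_DESC1> descs;
  ComPtr<IDXGIAdapter1> adapter;
  for (UINT i = 0;
       factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
    DXGI_ADAPTER_DESC1 desc;
    if (SUCCEEDED(adapter->GetDesc1(&desc)) &&
        !(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)) {  // skip WARP/software
      descs.push_back(desc);
    }
  }

  // Adapter 0 is normally the one driving the desktop (and therefore Firefox).
  const UINT kIntelVendorId = 0x8086;
  return descs.size() >= 2 && descs[0].VendorId == kIntelVendorId;
}
```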
(In reply to Lars Bergstrom (:larsberg) from comment #5)
> Thanks for the clarifications! It sounds like we should probably detect
> when the user attempts to enable the headset while we're stuck in
> integrated GPU output mode, and inform them.
>
> I'd definitely be worried about just always starting up in discrete mode at
> launch on Windows if, say, VR is enabled (especially since that flag will be
> riding the train shortly).
>
> :kip, what are your thoughts?

Luckily most laptops will automatically switch to the discrete graphics card when a VR headset is plugged in, but there are exceptions. I think UI detecting the Intel/0x8086 GPU and informing users is a great idea. It could be combined with other indicators of VR-readiness status in a "traffic light" interface similar to SteamVR's. The only addition I would suggest is a way to remotely disable this check or to push updates to the list of GPUs that trigger it. We should be prepared for the event that Intel may start supporting VR APIs ;-)
Flags: needinfo?(kgilbert)
Should we ask NVIDIA if there is something better that we could be doing here to signal that we should be using the discrete GPU for VR? Or that we should be doing for dynamic switching? It appears that Edge is correctly doing this for Mixed Reality headsets (or so I'm hearing from initial reports), and most of the major negative user experiences stem from this:
- No output in the VR headset (due to Optimus)
- Poor performance in the headset (due to power management kicking in)

I've personally been running with the "High-performance NVIDIA Processor" set as preferred, but that both results in screen tearing on the laptop listed in this bug and causes Firefox to burn through the battery, so it's not ideal unless the machine is a desktop pretty much exclusively used for VR.
(In reply to Lars Bergstrom (:larsberg) from comment #8)
> Should we ask NVIDIA if there is something better that we could be doing
> here to signal that we should be using the discrete GPU for VR? Or that we
> should be doing for dynamic switching? It appears that Edge is correctly
> doing this for Mixed Reality headsets (or so I'm hearing from initial
> reports), and most of the major negative user experiences stem from this:
> - No output in the VR headset (due to Optimus)
> - Poor performance in the headset (due to power management kicking in)
>
> I've personally been running with the "High-performance NVIDIA Processor"
> set as preferred, but that both results in screen tearing on the laptop
> listed in this bug and causes Firefox to burn through the battery, so it's
> not ideal unless the machine is a desktop pretty much exclusively used for
> VR.

Yes, I think it would be useful to get help from NVidia here. There are some things we know we can fix on our end already (using the display adapter that the Oculus SDK requests we use). There are also things that will need help from NVidia (how can we request to disable power management throttling for the process rendering frames when we are submitting them to the VR rendering API in another process?).
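On the "use the display adapter the Oculus SDK requests" point: LibOVR returns an ovrGraphicsLuid from ovr_Create, and the usual pattern (as in the Oculus samples) is to match that LUID against DXGI adapters when creating the D3D11 device. A hedged sketch; the helper name is made up and the actual device creation is only indicated in the usage comment:

```cpp
// Sketch of picking the DXGI adapter matching the LUID LibOVR reports, so
// frames are rendered on the GPU the headset is actually attached to.
#include <OVR_CAPI.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstring>

using Microsoft::WRL::ComPtr;

ComPtr<IDXGIAdapter1> FindAdapterForHmd(const ovrGraphicsLuid& luid) {
  ComPtr<IDXGIFactory1> factory;
  if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) {
    return nullptr;
  }

  ComPtr<IDXGIAdapter1> adapter;
  for (UINT i = 0;
       factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
    DXGI_ADAPTER_DESC1 desc;
    if (SUCCEEDED(adapter->GetDesc1(&desc)) &&
        memcmp(&desc.AdapterLuid, &luid, sizeof(LUID)) == 0) {
      return adapter;  // the adapter the VR runtime wants us to render on
    }
  }
  return nullptr;
}

// Usage (sketch):
//   ovrSession session; ovrGraphicsLuid luid;
//   if (OVR_SUCCESS(ovr_Create(&session, &luid))) {
//     ComPtr<IDXGIAdapter1> adapter = FindAdapterForHmd(luid);
//     // ...pass adapter to D3D11CreateDevice with D3D_DRIVER_TYPE_UNKNOWN...
//   }
```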
I would also suggest we ask about the reasons for the default NVidia application profile for firefox.exe, which overrides the user's default and sets it to "Adaptive" power-saving mode. Perhaps we can make Firefox cooperate better with power management so we can get the performance when we really need it.
Severity: normal → S3
Component: Graphics → WebVR