WebGL contexts should not default to antialias: true on low-spec graphics hardware

Status: NEW
Assignee: Unassigned
Product: Core
Component: Canvas: WebGL
Reported: 6 years ago
Last modified: 4 years ago
Reporter: kael
Firefox Tracking Flags: Not tracked
Whiteboard: webgl-perf

Description (Reporter: kael, 6 years ago)

At present, both Firefox and Chrome default to antialias: true for WebGL contexts if the getContext() call doesn't specify a preference. I understand that this is probably in order to ensure that every user gets great-looking WebGL rendering across the board, regardless of hardware, but I think it's a really poor choice.

For my laptop (with Intel integrated graphics), this basically cripples my rendering performance in any WebGL-based demo or workload - in both browsers. If I modify the demos to force antialiasing off, my framerate triples or quadruples, because suddenly the Intel GPU isn't struggling under an enormous memory bandwidth deficit anymore.

I can't think of a perfect solution to this problem, but defaulting antialiasing to on for cards that can't handle it is a bad experience. You could probably start with 'if vendor is Intel, turn it off' to improve things for people with Intel GPUs, since those are universally slow, but it would probably help to default it off for low-spec NVIDIA and ATI GPUs as well (mobile ones, etc.). I also imagine that for users running Firefox on a laptop, defaulting antialiasing off would improve battery life.
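
For reference (not part of the original report), a minimal sketch of how an application opts out of antialiasing at context creation today; the canvas lookup is just a placeholder:

    // Explicitly request a WebGL context without antialiasing.
    var canvas = document.querySelector("canvas");
    var gl = canvas.getContext("webgl", { antialias: false });
    if (gl) {
      // getContextAttributes() reports what the browser actually granted.
      console.log("antialias granted:", gl.getContextAttributes().antialias);
    }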

Comment 1

Note that {antialias: true} is the default per spec as a context creation flag, and it means "try to get antialiasing, but I understand there's no guarantee I'll get it". The other option, {antialias: false}, means "absolutely do not antialias". There is no context creation flag that means "antialias whenever it is at all possible". So if we disabled antialiasing on machines where it's slow, content would have no way to insist on enabling it, which means we would break existing, legitimate content that currently works well.

Two possible approaches here:

 1) Either we leave that entirely up to the application: the application decides whether it wants antialiasing based on performance data, which it can obtain by running a benchmark itself (doable already today; see the sketch after this list) or by querying a browser API for it (which has yet to be spec'd). Filed bug 765755 for that.

 2) Or we decide that it is intolerable for 2x2 MSAA to have more than, say, a 50% performance overhead on a typical scene, call any implementation with a larger overhead broken, and blacklist antialiasing on those machines. It would still have to be decided how we maintain that blacklist. Blacklisting Intel chips is a good place to start, but there are also low-end, old-ish discrete chips, NVIDIA integrated graphics from a few years ago, etc. Also, newer and future Intel chips might have good antialiasing performance, so all in all this will be an expensive blacklist to maintain. Furthermore, antialiasing is only one of many performance issues on underpowered hardware. For these reasons I would prefer option 1).
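
A rough sketch of what option 1) could look like on the application side; the function names and the 16 ms frame budget are illustrative assumptions, not an existing or proposed browser API:

    // Benchmark the app's own draw loop with antialiasing enabled, and fall
    // back to a non-antialiased context if rendering is too slow. Context
    // attributes are fixed at creation time, so falling back means starting
    // over with a fresh canvas.
    function createContext(canvas, antialias) {
      return canvas.getContext("webgl", { antialias: antialias });
    }

    function averageFrameTime(gl, drawFrame, frames) {
      var start = performance.now();
      for (var i = 0; i < frames; i++) {
        drawFrame(gl);          // caller-supplied function that draws one frame
      }
      gl.finish();              // wait for the GPU so the timing is meaningful
      return (performance.now() - start) / frames;
    }

    function chooseContext(drawFrame) {
      var canvas = document.createElement("canvas");
      var gl = createContext(canvas, true);
      if (gl && averageFrameTime(gl, drawFrame, 60) <= 16) {
        return { canvas: canvas, gl: gl };   // antialiasing is affordable here
      }
      canvas = document.createElement("canvas");
      return { canvas: canvas, gl: createContext(canvas, false) };
    }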

Comment 2

I'd suggest having a pref -- internal only at first, but eventually exposed to the user -- where they can choose what their default is for quality/performance tradeoff options. We can then use the value of the pref when content says 'true', sort of like the global overrides in NVIDIA's control panel.

Comment 3

Yes, that would work.

Unfortunately we only have boolean prefs, so we can't simply have a 3-state pref like quality/default/speed.

We could have two prefs: webgl.prefer-quality and webgl.prefer-speed, both defaulting to false.

Note that for the specific case of disabling antialiasing, there is already a pref: webgl.msaa-level=0.
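
For reference, what that could look like in a user.js file (assuming the usual user_pref() syntax); webgl.msaa-level is the existing pref mentioned above, while the prefer-* pair is only a proposal in this discussion:

    user_pref("webgl.msaa-level", 0);         // existing pref: disables MSAA for WebGL
    user_pref("webgl.prefer-quality", false); // proposed above, does not exist yet
    user_pref("webgl.prefer-speed", false);   // proposed above, does not exist yet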

Comment 4

The problem with this is that it's program-specific. If you're rendering a static scene, then clearly it should use antialiasing wherever it's supported, because it will look better and taking longer to do so is not a problem.

Apps that care about performance should take into account that AA makes rendering slower even on high-performance dedicated hardware, not to mention the lower end, where it's crippling.

As the spec stands today, we have to default to AA if we support it at all (not specifying antialias is implicitly antialias: true). We could talk with the WG, but I believe the definition of what is 'slow' enough to warrant falling back to non-AA is too vague to be properly defined. The only one that can truly decide is the app itself.

I think this should be RESOLVED INVALID or WONTFIX, but we can discuss it a bit more before making that choice.

Comment 5 (Vladimir Vukicevic [:vlad])

I don't know; I think it should be perfectly valid for us to treat the lack of an explicit antialias: true as a "you figure it out" for the browser. If the app cared one way or the other, it should really specify it explicitly; since it doesn't, it obviously doesn't care (but at least might prefer to get AA).

Comment 6

(In reply to Vladimir Vukicevic [:vlad] [:vladv] from comment #5)
> I don't know; I think it should be perfectly valid for us to treat the lack
> of an explicit antialias: true as a "you figure it out" for the browser.  If
> the app cared one way or the other, it should really specify it explicitly;
> since it doesn't, it obviously doesn't care (but at least might prefer to
> get AA).

If we want to do that, we need to loosen that part of the spec. As it stands, I believe antialias: undefined is required to produce the same result as antialias: true.
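
A tiny illustration of that requirement (canvasA and canvasB are hypothetical, separate canvas elements, since a canvas keeps its first context):

    // Per the current spec, these two calls must behave identically:
    // omitting the attribute is treated the same as passing antialias: true.
    var glDefault  = canvasA.getContext("webgl");
    var glExplicit = canvasB.getContext("webgl", { antialias: true });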

Comment 7 (Reporter, 6 years ago)

To be clear, this bug is predicated on the idea that a default of antialias:true is bad for the *user*, not that it's in violation of the spec or bad for the browser author. I think the spec may be this way for a good reason, but it produces nasty consequences for both developers (who now need to be aware of this corner case in the WebGL spec - I didn't know about it, and most of the other devs I've talked to didn't either) and end users (who will get subpar performance and worse battery usage/heat generation on their laptops in apps that accept the default).

If it's really unacceptable to change the default value, you could solve this problem with developer evangelism, I guess - if I had known about the default, it would never have affected my applications. Alternatively, a survey of WebGL apps out in the field to see whether they configure their WebGL contexts could prove that I'm wrong - I'm not exactly working off data here. If 99% of WebGL apps actually specify antialias: true or antialias: false, then changing the default wouldn't do anything.

Comment 8 (Vladimir Vukicevic [:vlad])

One problem is that I don't think apps have enough knowledge to decide what to do with AA, short of just providing an on/off toggle to the user. I'm suggesting we move that on/off toggle to somewhere in the browser chrome (in Content prefs, 3D preferences, or something?), and use that setting when nothing is specified.

At one point, all of the getContext flags were merely hints, partially for this reason: the browser should be free to choose some defaults that make sense in cases like this, IMO. I still think that it would not be in violation of the spirit of the spec to disable AA by default on lower-end hardware; if rendering grinds to a halt for an option that content didn't explicitly specify, we're not really doing anyone any favors by being pedantic about the interpretation.

Comment 9

I think developer evangelism is the way to go. We already document this in the spec, though the wording could perhaps be improved.

If the problem is that people just aren't checking the spec, then our only hope is either to get people to read it or to flood the market with very explicit, detailed tutorials, so as to at least push cargo-culting in the right direction.

Comment 10

(In reply to Vladimir Vukicevic [:vlad] [:vladv] from comment #8)
> One problem is that I don't think apps have enough knowledge to decide
> what to do with AA, short of just providing an on/off toggle to the user.
> I'm suggesting we move that on/off toggle to somewhere in the browser
> chrome (in Content prefs, 3D preferences, or something?), and use that
> setting when nothing is specified.
> 
> At one point, all of the getContext flags were merely hints, partially
> for this reason: the browser should be free to choose some defaults that
> make sense in cases like this, IMO. I still think that it would not be in
> violation of the spirit of the spec to disable AA by default on lower-end
> hardware; if rendering grinds to a halt for an option that content didn't
> explicitly specify, we're not really doing anyone any favors by being
> pedantic about the interpretation.

The context creation attributes are hints, but the hints have strictly specified defaults. Previously, some of the defaults were not defined, which caused at least bug 745880 and led to a change in the spec: hints now have required, specified defaults.

The danger we run here is that if we loosen this requirement for the 'antialias' hint, we will get bugs like 'X demo looks better in browser Y'.

Comment 11 (Vladimir Vukicevic [:vlad])

If we do it right, it'll be "X demo looks better in browser Y, but runs so much faster in Firefox!"

Comment 12

(In reply to Vladimir Vukicevic [:vlad] [:vladv] from comment #11)
> If we do it right, it'll be "X demo looks better in browser Y, but runs so
> much faster in Firefox!"

Except for the cases which don't care about framerate.

I still think that putting together any sort of heuristic for this is a much deeper hole than simply doing evangelism about it, among other things. I think all we would need to do is publish some good basic tutorials that do everything very correctly and explicitly.

We could also put together a page detailing hints geared towards performant WebGL, of which this could be one point.
OS: Windows 7 → All
Hardware: x86_64 → All
Whiteboard: webgl-perf
Version: 15 Branch → unspecified