Closed Bug 1469711 Opened 3 years ago Closed 3 years ago

Add YUV 10/12 bits to RGB 8 bits conversion in D3D11 compositor

Categories

(Core :: Graphics, enhancement, P3)

Tracking

RESOLVED FIXED
mozilla64
Tracking Status
firefox64 --- fixed

People

(Reporter: jya, Assigned: jya)

References

(Blocks 1 open bug)

Details

(Whiteboard: [gfx-noted])

Attachments

(3 files)

We have a 10/12-bit rendering path on macOS and Linux (OpenGL and basic compositor): the compositor converts 10/12-bit YUV directly into 8-bit RGB through an OpenGL shader.

We have no such feature in the D3D11 compositor path.

While this obviously won't provide proper HDR support, it would at least allow playing such files on Windows, as we already can on macOS and Linux.
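As a rough illustration of the conversion this bug requests (not the actual shader, which runs per-pixel on GPU textures), here is the math for turning 10-bit studio-range BT.709 YUV into 8-bit RGB. The function name and structure are illustrative; the coefficients are the standard BT.709 ones.

```python
# Illustrative sketch: scale 10-bit studio-range YUV samples to normalized
# values, then convert to 8-bit RGB with BT.709 coefficients.

def yuv10_to_rgb8(y, u, v):
    # Studio-range 10-bit: luma spans 64..940, chroma spans 64..960
    # and is centered on 512. Normalize luma to 0..1, chroma to -0.5..0.5.
    yf = (y - 64) / 876.0
    uf = (u - 512) / 896.0
    vf = (v - 512) / 896.0
    # BT.709 YCbCr -> RGB (after range expansion).
    r = yf + 1.5748 * vf
    g = yf - 0.18732 * uf - 0.46812 * vf
    b = yf + 1.8556 * uf
    clamp = lambda x: max(0, min(255, round(x * 255)))
    return clamp(r), clamp(g), clamp(b)

# Mid-gray: Y at the midpoint of 64..940, neutral chroma.
print(yuv10_to_rgb8(502, 512, 512))  # -> (128, 128, 128)
```

The real shader instead samples the 10/12-bit planes from 16-bit textures and rescales with a per-bit-depth coefficient, but the range offsets and matrix are the same idea.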
Sure, why not, Jean-Yves. Can you provide me with a video that the rest of our pipeline supports?
Assignee: nobody → bas
Status: NEW → ASSIGNED
In bug 1215089 there will be a sample for every type that exists. It's also the bug that added support for those on Linux/macOS.

To make sure you hit the code path I'm referring to in this bug (conversion of a software YUV buffer into RGB), you'll need to disable the WMF VP9 decoder by setting the pref media.wmf.vp9.enabled to false.

This is a must if you're using an NVIDIA 9xx/10xx-series or newer GPU, or a 7th-gen Intel CPU (i5/i7-7xxx).

The VP9 WMF decoder returns a D3D11/DXVA surface, and we haven't done anything to support more than 8 bits there; I'm guessing that would be a lot of work.
Priority: -- → P3
Whiteboard: [gfx-noted]
Blocks: 1486288
This change is for D3D11 with Advanced Layers enabled.
Assignee: bas → jyavenard
The D3D11 compositor now supports 10- and 12-bit images.

Depends on D6498
See Also: → 1493198
Comment on attachment 9010982 [details]
Bug 1469711 - P3. Allow HDR vp9 decoding with D3D11 compositor

Bryce Seager van Dyk (:bryce) has approved the revision.
Attachment #9010982 - Flags: review+
Comment on attachment 9010943 [details]
Bug 1469711 - P1. Add 10/12 bits YUV support to D3D11 compositor.

Matt Woodrow (:mattwoodrow) has approved the revision.
Attachment #9010943 - Flags: review+
Comment on attachment 9010980 [details]
Bug 1469711 - P2. Add 10/12 bits YUV support to D3D11 legacy compositor.

Matt Woodrow (:mattwoodrow) has approved the revision.
Attachment #9010980 - Flags: review+
Pushed by jyavenard@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/78ea76d2f7f3
P1. Add 10/12 bits YUV support to D3D11 compositor. r=mattwoodrow
https://hg.mozilla.org/integration/autoland/rev/417a98e079bb
P2. Add 10/12 bits YUV support to D3D11 legacy compositor. r=mattwoodrow
https://hg.mozilla.org/integration/autoland/rev/08af19baa52a
P3. Allow HDR vp9 decoding with D3D11 compositor r=bryce
https://hg.mozilla.org/mozilla-central/rev/78ea76d2f7f3
https://hg.mozilla.org/mozilla-central/rev/417a98e079bb
https://hg.mozilla.org/mozilla-central/rev/08af19baa52a
Status: ASSIGNED → RESOLVED
Closed: 3 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla64
jya correct me if I am wrong, but this code will always apply the full range conversion from studio range, even when the input is already full range.

  yuv = yuv * fCoefficient - float3(0.06275, 0.50196, 0.50196);

So while this will work for the most common "standard" case, it will result in a wrong conversion if people encode full range h264/vp9 (which often happens for screen captures). Should a separate bug be opened?
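To illustrate the concern above (a hypothetical sketch, not the shader itself): unconditionally applying the studio-range offset and expansion to full-range input shifts and clips both ends of the signal.

```python
# Hypothetical sketch of the two interpretations of an 8-bit luma sample.

def luma_studio(y):
    # Studio (limited) range: 16..235 maps to 0..1 -- what the shader
    # always assumes.
    return (y - 16) / 219.0

def luma_full(y):
    # Full (PC) range: 0..255 maps to 0..1.
    return y / 255.0

# Full-range black (Y=0) treated as studio range goes negative and gets
# crushed; full-range white (Y=255) overshoots 1.0 and gets clipped.
print(luma_studio(0))    # ~ -0.073, crushed to 0 after clamping
print(luma_studio(255))  # ~ 1.091, clipped to 1.0
```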
Flags: needinfo?(jyavenard)
(In reply to Vittorio Giovara from comment #11)
> jya correct me if I am wrong, but this code will always apply the full range
> conversion from studio range, even when the input is already full range.
> 
>   yuv = yuv * fCoefficient - float3(0.06275, 0.50196, 0.50196);
> 
> So while this will work for the most common "standard" case, it will result
> in a wrong conversion if people encode full range h264/vp9 (which often
> happens for screen captures). Should a separate bug be opened?

Our colour handling is wrong to start with, and we have no proper support for studio vs. PC range.
In most cases we can't even detect what the graphics card is actually using: there doesn't appear to be a universal API to determine it, and AMD, Intel and NVIDIA AFAIK use different mechanisms.

For NVIDIA you force PC vs. studio range in their driver settings.

Considering it's your area of expertise, I'd love to work with you to define the proper behaviour. Provide me with the right matrices to use, etc.

WebRender uses floating-point values between 0 and 1 to describe a colour. I don't even know if you can apply PC vs. studio range there.

I'm available to talk on Mozilla's IRC in #media
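For what it's worth, the studio-to-full expansion is just a linear remap, so it can be expressed purely on normalized 0..1 floats. A hypothetical sketch (the function name is illustrative, not WebRender API):

```python
# Sketch: studio-to-full-range remap expressed entirely on 0..1 floats,
# showing the range handling is representable in a normalized color space.

def studio_to_full_normalized(y_norm):
    # 16/255 and 235/255 are the 8-bit studio-range limits expressed in
    # normalized units.
    lo, hi = 16.0 / 255.0, 235.0 / 255.0
    return min(1.0, max(0.0, (y_norm - lo) / (hi - lo)))
```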
Flags: needinfo?(jyavenard)