VP9/H264 colorspace BT.2020 is not respected

Status: NEW
Assignee: Unassigned
Product: Core
Component: Graphics
Priority: P1
Severity: normal
Reported: a year ago
Last modified: 14 days ago

People

(Reporter: Vittorio Giovara, Unassigned)

Tracking

Version: 49 Branch
Points: ---

Firefox Tracking Flags

(Not tracked)

Details

Attachments

(3 attachments)

(Reporter)

Description

a year ago
Created attachment 8794979 [details]
bt2020.webm

User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:49.0) Gecko/20100101 Firefox/49.0
Build ID: 20160916101415

Steps to reproduce:

1. Create a VP9 webm video in the BT.601 colorspace and tag it accordingly:
ffmpeg -loglevel verbose -f lavfi -r 1 -i testsrc=s=1280x720 -vf scale=out_color_matrix=bt601,format=yuv420p -colorspace bt470bg -c:v libvpx-vp9 -lossless 1 -t 10 -y bt601.webm

2. Create the same VP9 webm video, but in the BT.2020 colorspace, and tag it accordingly:
ffmpeg -loglevel verbose -f lavfi -r 1 -i testsrc=s=1280x720 -vf scale=out_color_matrix=bt2020,format=yuv420p -colorspace bt2020 -c:v libvpx-vp9 -lossless 1 -t 10 -y bt2020.webm


Actual results:

Firefox always uses the BT.601 colorspace for VP9 videos, so the BT.2020-tagged video is rendered darker than it should be.


Expected results:

The VP9 colorspace property should be respected, i.e. the tagged BT.601 video should look the same as the tagged BT.2020 one.
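The mismatch can be reproduced numerically. Below is a minimal Python sketch (illustrative only, not Gecko code) that builds the Y'CbCr encode/decode equations from the luma coefficients Kr/Kb, encodes a pixel with the BT.2020 NCL matrix, and then decodes it with the BT.601 matrix, as Firefox currently does:

```python
# Illustrative sketch (plain Python, not Gecko code): derive Y'CbCr
# encode/decode from the luma coefficients Kr/Kb, then encode a pixel as
# BT.2020 NCL and decode it with the BT.601 matrix.

def ycbcr_matrix(kr, kb):
    kg = 1.0 - kr - kb

    def encode(r, g, b):
        y = kr * r + kg * g + kb * b
        cb = (b - y) / (2.0 * (1.0 - kb))
        cr = (r - y) / (2.0 * (1.0 - kr))
        return y, cb, cr

    def decode(y, cb, cr):
        r = y + 2.0 * (1.0 - kr) * cr
        b = y + 2.0 * (1.0 - kb) * cb
        g = (y - kr * r - kb * b) / kg
        return r, g, b

    return encode, decode

enc2020, dec2020 = ycbcr_matrix(0.2627, 0.0593)  # BT.2020 NCL
enc601,  dec601  = ycbcr_matrix(0.299,  0.114)   # BT.601

ycc = enc2020(0.2, 0.7, 0.3)  # a greenish pixel, encoded as BT.2020
# decoding with the wrong matrix shifts the result away from the original
print("wrong (BT.601 decode): ", [round(c, 4) for c in dec601(*ycc)])
print("right (BT.2020 decode):", [round(c, 4) for c in dec2020(*ycc)])
```

Decoding with the wrong matrix visibly shifts the result away from the original (0.2, 0.7, 0.3), which is the class of error visible in the attached bt2020.webm.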
Component: Untriaged → Audio/Video: Playback
Product: Firefox → Core
Priority: -- → P1
Depends on: 1210357
(Reporter)

Comment 1

a year ago
Typo in the command line: it should be "-colorspace bt2020_ncl".

Comment 2

Sotaro, 
Do you think this bug can be closed since bug 1210357 has been fixed?
Flags: needinfo?(sotaro.ikeda.g)

Comment 3

(In reply to Blake Wu [:bwu][:blakewu] from comment #2)
> Sotaro, 
> Do you think this bug can be closed since bug 1210357 has been fixed?

No, we cannot close this bug yet. We need to add additional color matrix handling, as in Bug 1210357 and Bug 1306521.

I am going to work on it.
Assignee: nobody → sotaro.ikeda.g
Status: UNCONFIRMED → ASSIGNED
Ever confirmed: true
Flags: needinfo?(sotaro.ikeda.g)

Comment 4

Great to have you work on this. Thanks!
Depends on: 1329383
Component: Audio/Video: Playback → Graphics
See Also: → bug 1304330
Blocks: 1215089

Comment 5

:sotaro, do you plan to work on this soon, or should we get someone else to look into it?
Flags: needinfo?(sotaro.ikeda.g)

Comment 6

I do not plan to work on it soon, since I am busy.
Flags: needinfo?(sotaro.ikeda.g)

Comment 7

Hi Bruce,
Are you available to do this?
Flags: needinfo?(brsun)

Comment 8

6 months ago
Thanks to :jerry for the information: there is an existing online tool[1] that shows the coefficients for converting between RGB and YCbCr according to the different specs. I can try to dig into this bug in the near future.

[1] https://jdashg.github.io/misc/colors/from-coeffs.html
Flags: needinfo?(brsun)

Comment 9

6 months ago
I'll study bug 1210357 first before doing any modification.
Assignee: sotaro.ikeda.g → brsun

Comment 10

5 months ago
Created attachment 8897330 [details] [diff] [review]
bug1305510_bt_2020_color_space.wip.patch

WIP patch. Note: there are still many items missing from this patch, e.g. the matrix for BT2020_CL, BT.2020 support in libyuv, correct color space information for IMFYCbCrImage, etc.


Hi Sotaro,

Would you mind taking a glance at this baby-step patch and sharing your comments, if any?
Attachment #8897330 - Flags: feedback?(sotaro.ikeda.g)

Comment 11

Comment on attachment 8897330 [details] [diff] [review]
bug1305510_bt_2020_color_space.wip.patch

Review of attachment 8897330 [details] [diff] [review]:
-----------------------------------------------------------------

:brsun, is there a reason not to handle software color conversion? It is used when layer is not hw accelerated.

::: gfx/layers/D3D11YCbCrImage.h
@@ +21,5 @@
>  {
>  public:
>    explicit D3D11YCbCrRecycleAllocator(KnowsCompositor* aAllocator,
> +                                      ID3D11Device* aDevice,
> +                                      YUVColorSpace aYUVColorSpace)

Is there a reason to add the YUV color space here?

In the SharedPlanarYCbCrImage case, the YUV color space is checked only during CreateOrRecycle(), as in the following:

https://dxr.mozilla.org/mozilla-central/source/gfx/layers/ipc/SharedPlanarYCbCrImage.cpp#192

https://dxr.mozilla.org/mozilla-central/source/gfx/layers/client/TextureClientRecycleAllocator.cpp#100

::: gfx/layers/ImageContainer.h
@@ +558,5 @@
>    }
>  
>  #ifdef XP_WIN
>    D3D11YCbCrRecycleAllocator* GetD3D11YCbCrRecycleAllocator(
> +    KnowsCompositor* aAllocator, YUVColorSpace aYUVColorSpace);

Is there a reason to add YUVColorSpace as argument?

::: gfx/thebes/gfxUtils.cpp
@@ +1150,3 @@
>  
>  /* static */ const float*
>  gfxUtils::YuvToRgbMatrix4x3RowMajor(YUVColorSpace aYUVColorSpace)

Changes around here conflict with Bug 1322746. You might want to get feedback from :jgilbert.
Attachment #8897330 - Flags: feedback?(sotaro.ikeda.g)

Comment 12

5 months ago
> :brsun, is there a reason not to handle software color conversion? It is
> used when layer is not hw accelerated.

This would require some modification in libyuv, and I am still investigating how to do it properly.

> ::: gfx/layers/ImageContainer.h
> @@ +558,5 @@
> >    }
> >  
> >  #ifdef XP_WIN
> >    D3D11YCbCrRecycleAllocator* GetD3D11YCbCrRecycleAllocator(
> > +    KnowsCompositor* aAllocator, YUVColorSpace aYUVColorSpace);
> 
> Is there a reason to add YUVColorSpace as argument?

Because I have to pass YUVColorSpace through TextureData[1] while creating the TextureHost on the compositor side. Another approach is to add a YUVColorSpace argument to the Allocate() function[2][3], but that might violate the design of the function, since YUVColorSpace is required only for YCbCr textures. By storing YUVColorSpace in D3D11YCbCrRecycleAllocator, we can pass it to TextureData without polluting the argument list of Allocate().

[1] https://dxr.mozilla.org/mozilla-central/source/gfx/layers/D3D11YCbCrImage.cpp#300
[2] https://dxr.mozilla.org/mozilla-central/source/gfx/layers/D3D11YCbCrImage.cpp#268
[3] https://dxr.mozilla.org/mozilla-central/source/gfx/layers/client/TextureClientRecycleAllocator.h#112

Comment 13

5 months ago
In the WMF path, I always get MF_E_ATTRIBUTENOTFOUND as the HRESULT when querying MF_MT_YUV_MATRIX or MF_MT_TRANSFER_FUNCTION information, either from:
 - IMFMediaType from MFTDecoder::GetOutputMediaType(), or
 - IMFSample from MFTDecoder::Output().

Comment 14

5 months ago
Unassigning myself since I am not going to work on this bug.
Assignee: brsun → nobody
Status: ASSIGNED → NEW
(Reporter)

Comment 15

2 months ago
Hi, is there any chance this will be picked up again in the future, or will this be implemented as part of libplacebo?

Comment 16

18 days ago
Created attachment 8940090 [details] [diff] [review]
8 bit-depth_BT2020_SWC

Hi Sotaro, Jeff,

I tried to pick this up and found that the color depth of the attached file "bt2020.webm" is 8-bit. So I gave up on writing a 16-bit software conversion in libyuv and instead rely on the existing conversion workflow with an updated coefficient matrix (from Jeff's site [1]).

Though AFAIK, bt.2020 content should be either 10-bit or 12-bit per channel, and we should apply an EOTF (PQ or HLG) instead of the conventional SDR gamma curve when the content is displayed. I am still trying to figure this out.

Would you mind giving me some advice on this patch?
Do we need a conversion for 8-bit bt.2020 content? There seems to be no related work in libyuv.

My next step is to add the bt.2020 coefficient matrix in gfx/thebes/gfxUtils.cpp like [2], so that we can do this conversion in a shader. Is this the right way to go?

[1] https://jdashg.github.io/misc/colors/from-coeffs.html
[2] https://searchfox.org/mozilla-central/rev/652fbd6270de0d3ec424d2b88f8375ff546f949f/gfx/thebes/gfxUtils.cpp#1120
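For reference, the coefficients such a matrix needs can be derived directly from the Kr/Kb luma constants. A small Python sketch (illustrative only; the names below are not the gfxUtils API) for 8-bit limited-range input, following the same derivation the existing BT.601/BT.709 entries use:

```python
# Sketch: derive the 8-bit limited-range YUV -> RGB coefficients for a
# given pair of luma constants (Kr, Kb), including BT.2020 NCL. The dict
# keys are illustrative names, not Gecko identifiers.

def yuv_to_rgb_coeffs(kr, kb):
    kg = 1.0 - kr - kb
    y_scale = 255.0 / 219.0   # expand limited-range luma [16, 235]
    uv_scale = 255.0 / 224.0  # expand limited-range chroma [16, 240]
    return {
        "y":    y_scale,
        "v_r":  2.0 * (1.0 - kr) * uv_scale,
        "u_g": -2.0 * kb * (1.0 - kb) / kg * uv_scale,
        "v_g": -2.0 * kr * (1.0 - kr) / kg * uv_scale,
        "u_b":  2.0 * (1.0 - kb) * uv_scale,
    }

for name, (kr, kb) in {"bt601": (0.299, 0.114),
                       "bt709": (0.2126, 0.0722),
                       "bt2020": (0.2627, 0.0593)}.items():
    c = yuv_to_rgb_coeffs(kr, kb)
    print(name, {k: round(v, 4) for k, v in c.items()})
```

For BT.2020 this yields the familiar 1.1644 / 1.6787 / 2.1418 coefficients, matching what the coefficient tool in [1] shows.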
Attachment #8940090 - Flags: feedback?(sotaro.ikeda.g)
Attachment #8940090 - Flags: feedback?(jgilbert)

Comment 17

Yeah, 2020 technically includes a different gamma curve. This is a post-process after converting with the color-space matrix, and can't be included in the matrix because, well, curves are non-linear.

You almost certainly want to reuse GLBlitHelper[1] rather than implementing the shaders yourself.

The issue with gamma is that right now we pretend that everything is gamma=1, which works sorta mostly ok. We can hack the 2020 to work by converting to RGB and scaling by the 2020 curves, and then /de-scale/ back, knowing that we'll be treating it like it's gamma=2.2.

This is pretty iffy without doing a full sRGB audit of our stack, which I don't believe we've done.

[1]: https://dxr.mozilla.org/mozilla-central/source/gfx/gl/GLBlitHelper.h?q=GLBlitHelper&redirect_type=direct#120

Comment 18

Oh wait, 2020 has the same gamma curve as 601 and 709, though it does give us better precision on the coeffs. It's rec2100 that introduces HLG and PQ for HDR. (as well as ICtCp, which I think we can skip for now?)

2020 is indeed 10 and 12 bit, so I'm not really sure what 8-bit 2020 strictly means. 8-bit 2020 is worse than 8-bit 709, so it shouldn't be something we encourage. We could treat it as 8-bit with the 2020 matrix, but I think we should try to standardize on rejecting it if we can.

Comment 19

So constant-luma means we're taking in y'u'v' data, converting to linear yuv, /then/ transforming to rgb and converting back to r'g'b' (srgb).

But for both CL and NCL, the 2020 primaries are different enough from 709's that we need to convert from the encoding primaries to our monitor primaries with another affine matrix. (sRGB uses the rec709 primaries, so we don't really need this transform for 709 content.) If we don't do the primary transform, we'll get washed-out colors, because 100% green rec2020 maps well outside the rec709 gamut. If we clamp after the primary transformation, that will fix this (though it will crush highs).
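As a sketch of the primary transform described above: the 3x3 matrix below is the published BT.2020-to-BT.709 linear-light conversion (the values given in ITU-R BT.2087), applied after the YUV-to-RGB matrix and after linearization.

```python
# Published linear-light RGB conversion from BT.2020 primaries to
# BT.709 primaries (ITU-R BT.2087). Rows sum to ~1, so white maps to white.
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def convert(rgb, m=BT2020_TO_BT709):
    # Plain matrix-vector product on a linear-light RGB triple.
    return [sum(row[i] * rgb[i] for i in range(3)) for row in m]

# 100% green in BT.2020 lands well outside the BT.709 gamut:
# negative red/blue and >1.0 green, hence the need to clamp afterwards.
print([round(c, 4) for c in convert([0.0, 1.0, 0.0])])  # → [-0.5876, 1.1329, -0.1006]
```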
(Reporter)

Comment 20

15 days ago
(In reply to Jeff Gilbert [:jgilbert] from comment #19)
> So constant-luma means we're taking in y'u'v' data, converting to linear
> yuv, /then/ transforming to rgb and converting back to r'g'b' (srgb).

Is that so? I was under the impression that constant luma meant taking in y'u'v' data, converting to r'g'b', and then linearizing to rgb (which can then be rendered to screen with srgb or something else).

+cc tdaede

Comment 21

Vittorio, what you describe is the "normal" NCL case that's used for rec.709 etc.

So I actually think my description to jgilbert earlier was a bit wrong; I had missed how the chroma components are computed. The Y signal is computed from linear RGB and then converted to Y', and U' and V' are calculated from Y', R', and B'. To do the inverse, you first recover R' and B', then linearize those to get linear Y, R, and B, then solve for linear G to get RGB, and finally convert back to R'G'B' (assuming you want a gamma-corrected output and not linear).

See also https://github.com/haasn/libplacebo/blob/master/src/shaders/colorspace.c#L59
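The CL decode order described above can be sketched as follows (Python for illustration; a pure power gamma of 2.4 is an intentional simplification of the exact BT.2020 transfer function, while the piecewise chroma divisors are the constants from BT.2020's constant-luminance definition):

```python
# Sketch of the constant-luminance (CL) decode order described above.
# Simplification: pure power gamma 2.4 instead of the exact BT.2020 OETF.

KR, KB, KG = 0.2627, 0.0593, 0.6780  # BT.2020 luma coefficients

def eotf(v):   # gamma-encoded -> linear (simplified)
    return v ** 2.4

def oetf(v):   # linear -> gamma-encoded (simplified)
    return v ** (1.0 / 2.4)

def decode_cl(yp, cb, cr):
    # 1. Recover B' and R' from Y' and the chroma channels
    #    (piecewise divisors from BT.2020's CL definition).
    bp = yp + (1.9404 * cb if cb <= 0.0 else 1.5816 * cb)
    rp = yp + (1.7184 * cr if cr <= 0.0 else 0.9936 * cr)
    # 2. Linearize: in CL, Y' is the gamma-encoded *linear* luminance.
    y, r, b = eotf(yp), eotf(rp), eotf(bp)
    # 3. Solve the linear-light luma equation for G.
    g = (y - KR * r - KB * b) / KG
    # 4. Re-encode for a gamma-corrected (non-linear) output.
    return oetf(r), oetf(g), oetf(b)

# A neutral gray round-trips: chroma is zero, so R' = G' = B' = Y'.
print(decode_cl(0.5, 0.0, 0.0))
```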