Closed Bug 904890 Opened 6 years ago Closed 6 years ago

Fast video via a video memory d3d9 texture client

Categories

(Core :: Graphics: Layers, defect)

x86_64
Windows XP
defect

Tracking


RESOLVED FIXED
mozilla31

People

(Reporter: nrc, Assigned: mattwoodrow)

References

Details

Attachments

(5 files, 4 obsolete files)

For DXVA video on OMTC we need a d3d9 texture client and host which uses video memory rather than systemmem. Interestingly, this is going to have to work with the d3d11 compositor.
See Bug 875247 for an existing but apparently slightly buggy d3d11 version. We might not need that if we can just use the d3d9 version everywhere.
Just a note that we are probably better off doing this with a new texture client rather than the deprecated ones, since we are already halfway to using the new flavour on Windows.
Adding dependency on new-texture for d3d11.
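The backend check this bug is building toward can be sketched as a small helper. This is a minimal, self-contained illustration: the enum values mirror mozilla::layers::LayersBackend, but CanUseDXVA and its exact logic are illustrative assumptions, not the actual Gecko API.

```cpp
#include <cassert>

// Illustrative stand-in for mozilla::layers::LayersBackend.
enum class LayersBackend {
  LAYERS_NONE,
  LAYERS_CLIENT,  // OMTC: real compositing happens in the compositor
  LAYERS_D3D9,
  LAYERS_D3D10,
  LAYERS_D3D11,
};

// Sketch of the DXVA eligibility test discussed in this bug: accept the
// direct D3D9/D3D10 layer managers, or OMTC when the remote compositor
// is D3D11.
bool CanUseDXVA(LayersBackend aLayers, LayersBackend aCompositor) {
  if (aLayers == LayersBackend::LAYERS_D3D9 ||
      aLayers == LayersBackend::LAYERS_D3D10) {
    return true;
  }
  return aLayers == LayersBackend::LAYERS_CLIENT &&
         aCompositor == LayersBackend::LAYERS_D3D11;
}
```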
Assignee: ncameron → matt.woodrow
Depends on: 938591
Attachment #8346360 - Flags: review?(nical.bugzilla)
Attachment #8346361 - Flags: review?(nical.bugzilla)
The textures being produced from the hardware video decoder don't have locking enabled. Looks like we manually synchronize on the client side before sending them to ensure that nothing races.
Attachment #8346362 - Flags: review?(bas)
Attachment #8346362 - Flags: review?(bas) → review+
Comment on attachment 8346363 [details] [diff] [review]
Part 4: Enable hardware accelerated video decoding for OMTC+D3D11

Review of attachment 8346363 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/media/wmf/WMFReader.cpp
@@ +124,5 @@
>  
>    if (layerManager->GetBackendType() != LayersBackend::LAYERS_D3D9 &&
> +      layerManager->GetBackendType() != LayersBackend::LAYERS_D3D10 &&
> +      !(layerManager->GetBackendType() == LayersBackend::LAYERS_CLIENT &&
> +        layerManager->AsShadowForwarder()->GetCompositorBackendType() == LayersBackend::LAYERS_D3D11)) {

We have other code using DXVA: WMFVideoDecoder in content/media/fmp4/wmf.

For this, we extract the LayersBackend type in MP4Reader::InitLayersBackendType() and pass it through PlatformDecoderModule::CreateH264Decoder() (called by MP4Reader::ReadMetadata()) so that WMFVideoDecoder::InitializeDXVA(mozilla::layers::LayersBackend aLayersBackend) receives it and can decide whether to use DXVA.

We should also extract and pass the compositor backend type through this call chain too. Please let me know if that's a dumb idea. I'll write a patch to do that.
Attachment #8346363 - Flags: feedback+
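The call chain comment 8 describes can be sketched as follows. The class and method names mirror the comment (MP4Reader, PlatformDecoderModule, WMFVideoDecoder), but the bodies, members, and return types here are illustrative stand-ins, not the real fmp4 code.

```cpp
#include <cassert>

// Illustrative subset of mozilla::layers::LayersBackend.
enum class LayersBackend { LAYERS_NONE, LAYERS_D3D9, LAYERS_D3D11 };

struct WMFVideoDecoder {
  bool mUseDXVA = false;
  // The decoder receives the backend and decides whether to use DXVA.
  void InitializeDXVA(LayersBackend aBackend) {
    mUseDXVA = aBackend == LayersBackend::LAYERS_D3D9 ||
               aBackend == LayersBackend::LAYERS_D3D11;
  }
};

struct PlatformDecoderModule {
  // The factory forwards the backend to the decoder it creates.
  static WMFVideoDecoder CreateH264Decoder(LayersBackend aBackend) {
    WMFVideoDecoder decoder;
    decoder.InitializeDXVA(aBackend);
    return decoder;
  }
};

struct MP4Reader {
  LayersBackend mLayersBackend = LayersBackend::LAYERS_NONE;
  // The reader captures the backend once...
  void InitLayersBackendType(LayersBackend aBackend) {
    mLayersBackend = aBackend;
  }
  // ...and threads it through when metadata is read.
  WMFVideoDecoder ReadMetadata() {
    return PlatformDecoderModule::CreateH264Decoder(mLayersBackend);
  }
};
```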
Needinfo myself so Bugzilla nags me to follow up on comment 8.
Flags: needinfo?(cpearce)
(In reply to Chris Pearce (:cpearce) from comment #8)
 
> We have other code using DXVA: WMFVideoDecoder in content/media/fmp4/wmf.
> 
> For this, we extract the LayersBackend type in
> MP4Reader::InitLayersBackendType() and pass it through
> PlatformDecoderModule::CreateH264Decoder() (called by
> MP4Reader::ReadMetadata()) so that
> WMFVideoDecoder::InitializeDXVA(mozilla::layers::LayersBackend
> aLayersBackend) receives it and can decide whether to use DXVA.
> 
> We should also extract and pass the compositor backend type through this
> call chain too. Please let me know if that's a dumb idea. I'll write a patch
> to do that.

Not sure if this is what you mean, but I think we can just pass the compositor backend in place of the layers backend when the real layers backend is LAYERS_CLIENT. There's no point passing around two values when all we really care about is the 'effective' backend (which maybe we should have a LayerManager API for).
If you think there's zero chance of the two cases (the layer manager's LayersBackend == LAYERS_D3D11 vs. GetCompositorBackendType() == LAYERS_D3D11) being confused and D3D9 surfaces not being handled correctly, then I'm cool with just passing the 'effective' backend type.

Can we handle D3D9 surfaces in the D3D11 layer manager backend? Or does (and will) the layer manager never use a D3D11 backend, only the compositor?
(In reply to Chris Pearce (:cpearce) from comment #11)
> If you think that there's zero chance of the two cases layer manager's
> LayersBackend==LAYERS_D3D11 and GetCompositorBackendType()==LAYERS_D3D11
> being confused and D3D9 surfaces not being handled correctly, then I'm cool
> with just passing the 'effective' backend type.
> 
> Can we handle D3D9 surfaces in the D3D11 layer manager backend? Or does (and
> will) the layer manager never use a D3D11 backend, only the compositor?

There is no d3d11 layer manager - only d3d10 (and there is no d3d10 compositor).
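The 'effective' backend idea from the preceding comments can be sketched like this: the base class answers with its own backend, while an OMTC client manager answers with the remote compositor's backend. This is a simplified illustration under assumed names; the real Layers.h API has more surface area.

```cpp
#include <cassert>

// Illustrative subset of mozilla::layers::LayersBackend.
enum class LayersBackend { LAYERS_NONE, LAYERS_CLIENT, LAYERS_D3D9, LAYERS_D3D11 };

class LayerManager {
public:
  virtual ~LayerManager() = default;
  virtual LayersBackend GetBackendType() = 0;
  // Backend that will actually composite this layer tree; by default the
  // manager composites itself.
  virtual LayersBackend GetEffectiveBackendType() { return GetBackendType(); }
};

class ClientLayerManager : public LayerManager {
public:
  explicit ClientLayerManager(LayersBackend aCompositorBackend)
    : mCompositorBackend(aCompositorBackend) {}
  LayersBackend GetBackendType() override {
    return LayersBackend::LAYERS_CLIENT;
  }
  // With OMTC, the compositing is remote, so report the compositor's backend.
  LayersBackend GetEffectiveBackendType() override {
    return mCompositorBackend;
  }
private:
  LayersBackend mCompositorBackend;
};
```

Callers such as the DXVA check can then consult one value instead of two.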
Comment on attachment 8346360 [details] [diff] [review]
Part 1: Add a TextureClient for shared d3d9 surfaces

Review of attachment 8346360 [details] [diff] [review]:
-----------------------------------------------------------------

r=me with comments fixed

::: gfx/layers/d3d9/TextureD3D9.h
@@ +138,5 @@
>    SurfaceInitMode mInitMode;
>    bool mInitialized;
>  };
>  
> +class TextureClientD3D9Shared : public TextureClient

I'd prefer that we stick to the following convention: <type>TextureClient<backend> like SharedTextureClientD3D9 (deprecated textures don't always do that but the new ones do)

@@ +166,5 @@
> +    mHandle = aSharedHandle;
> +    mDesc = aDesc;
> +  }
> +
> +  virtual gfx::IntSize GetSize() const

MOZ_OVERRIDE
Attachment #8346360 - Flags: review?(nical.bugzilla) → review+
Comment on attachment 8346361 [details] [diff] [review]
Part 2: Implement ISharedImage for D3D9SurfaceImage

Review of attachment 8346361 [details] [diff] [review]:
-----------------------------------------------------------------

::: gfx/layers/D3D9SurfaceImage.h
@@ +31,5 @@
>  
> +  D3D9SurfaceImage();
> +  virtual ~D3D9SurfaceImage();
> +
> +  virtual ISharedImage* AsSharedImage() { return this; }

nit: MOZ_OVERRIDE

@@ +60,5 @@
>    // is complete, whereupon the texture is safe to use.
>    void EnsureSynchronized();
>  
>    gfxIntSize mSize;
>    RefPtr<IDirect3DTexture9> mTexture;

It would be even better if accessing this texture had to go through the TextureClient. Otherwise the TextureClient can't ensure that the memory is managed properly (which may not be critical in this particular case, but it's good practice).
Attachment #8346361 - Flags: review?(nical.bugzilla) → review+
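The ownership pattern suggested in the review above can be sketched as follows. All names and types here are illustrative (FakeD3D9Texture stands in for IDirect3DTexture9, and the real classes use Mozilla's RefPtr rather than std::shared_ptr); the point is only that the image holds a TextureClient and every texture access goes through it.

```cpp
#include <cassert>
#include <memory>

// Opaque stand-in for IDirect3DTexture9; the real type lives in d3d9.h.
struct FakeD3D9Texture { int mAccesses = 0; };

class SharedTextureClientD3D9 {
public:
  explicit SharedTextureClientD3D9(std::shared_ptr<FakeD3D9Texture> aTexture)
    : mTexture(std::move(aTexture)) {}
  // All texture access funnels through here, so the client can record
  // accesses, enforce locking, and manage lifetime.
  FakeD3D9Texture* GetD3D9Texture() {
    ++mTexture->mAccesses;
    return mTexture.get();
  }
private:
  std::shared_ptr<FakeD3D9Texture> mTexture;
};

class D3D9SurfaceImage {
public:
  explicit D3D9SurfaceImage(std::shared_ptr<FakeD3D9Texture> aTexture)
    : mTextureClient(
          std::make_shared<SharedTextureClientD3D9>(std::move(aTexture))) {}
  // No raw mTexture member: callers must go through the TextureClient.
  SharedTextureClientD3D9* GetTextureClient() { return mTextureClient.get(); }
private:
  std::shared_ptr<SharedTextureClientD3D9> mTextureClient;
};
```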
Comment on attachment 8346363 [details] [diff] [review]
Part 4: Enable hardware accelerated video decoding for OMTC+D3D11

Review of attachment 8346363 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/media/wmf/WMFReader.cpp
@@ +124,5 @@
>  
>    if (layerManager->GetBackendType() != LayersBackend::LAYERS_D3D9 &&
> +      layerManager->GetBackendType() != LayersBackend::LAYERS_D3D10 &&
> +      !(layerManager->GetBackendType() == LayersBackend::LAYERS_CLIENT &&
> +        layerManager->AsShadowForwarder()->GetCompositorBackendType() == LayersBackend::LAYERS_D3D11)) {

No d3d9 implementation? :-(
Attachment #8346363 - Flags: review?(ncameron) → review+
(In reply to Nick Cameron [:nrc] from comment #15)
> Comment on attachment 8346363 [details] [diff] [review]
> Part 4: Enable hardware accelerated video decoding for OMTC+D3D11
> 
> Review of attachment 8346363 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> ::: content/media/wmf/WMFReader.cpp
> @@ +124,5 @@
> >  
> >    if (layerManager->GetBackendType() != LayersBackend::LAYERS_D3D9 &&
> > +      layerManager->GetBackendType() != LayersBackend::LAYERS_D3D10 &&
> > +      !(layerManager->GetBackendType() == LayersBackend::LAYERS_CLIENT &&
> > +        layerManager->AsShadowForwarder()->GetCompositorBackendType() == LayersBackend::LAYERS_D3D11)) {
> 
> No d3d9 implementation? :-(

Not until someone writes new-textures for d3d9!
Carrying forward r=nical
Attachment #8346360 - Attachment is obsolete: true
Attachment #8346361 - Attachment is obsolete: true
Attachment #8362838 - Flags: review+
Carrying forward r=Bas
Attachment #8346362 - Attachment is obsolete: true
Attachment #8362840 - Flags: review+
Attachment #8346363 - Attachment is obsolete: true
Attachment #8362843 - Flags: review?(cpearce)
Attachment #8362844 - Flags: review?(nical.bugzilla)
Depends on: 957560
With the above patch queue and new-textures enabled, HW accelerated video decoding works on both d3d11 and d3d9 compositors.
The patches don't break things without new-textures enabled, but we hit the readback path D3D9SurfaceImage::DeprecatedGetAsSurface.

According to the comments in content/media this will perform worse than just decoding in software to begin with.

So if we're planning to ship d3d9/11 without new-textures enabled, then landing this should probably wait until we flip the pref. Otherwise it can land ASAP.
Attachment #8362843 - Flags: review?(cpearce) → review+
Comment on attachment 8362844 [details] [diff] [review]
Part 5: Add a d3d9 texture host

Review of attachment 8362844 [details] [diff] [review]:
-----------------------------------------------------------------

r=me with the surface format change below.

::: gfx/layers/d3d9/TextureD3D9.cpp
@@ +79,5 @@
> +      break;
> +    }
> +    case SurfaceDescriptor::TSurfaceDescriptorD3D10: {
> +      result = new DXGITextureHostD3D9(aFlags, aDesc);
> +      break;

Whoops! That was my bad.

@@ +1678,5 @@
> +    HRESULT hr = deviceManager->device()->CreateTexture(mSize.width,
> +                                                        mSize.height,
> +                                                        1,
> +                                                        D3DUSAGE_RENDERTARGET,
> +                                                        D3DFMT_X8R8G8B8,

Please use SurfaceFormatToD3D9Format(mFormat) here, instead of hardcoding XRGB.
Attachment #8362844 - Flags: review?(nical.bugzilla) → review+
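The fix the review asks for amounts to mapping the surface's own format to a D3DFORMAT instead of hardcoding D3DFMT_X8R8G8B8. A minimal sketch of such a SurfaceFormatToD3D9Format helper is below; the enum and constants are local stand-ins (constant values as in d3d9types.h), and the real helper in the gfx code may cover more formats.

```cpp
#include <cassert>

// Illustrative subset of gfx::SurfaceFormat.
enum class SurfaceFormat { B8G8R8A8, B8G8R8X8, A8, UNKNOWN };

// Stand-ins for the D3D9 format constants from d3d9types.h.
using D3DFORMAT = unsigned int;
constexpr D3DFORMAT D3DFMT_UNKNOWN  = 0;
constexpr D3DFORMAT D3DFMT_A8R8G8B8 = 21;
constexpr D3DFORMAT D3DFMT_X8R8G8B8 = 22;
constexpr D3DFORMAT D3DFMT_A8       = 28;

// Sketch: pick the texture format from the surface format, so an alpha
// surface gets an ARGB texture rather than a hardcoded XRGB one.
D3DFORMAT SurfaceFormatToD3D9Format(SurfaceFormat aFormat) {
  switch (aFormat) {
    case SurfaceFormat::B8G8R8A8: return D3DFMT_A8R8G8B8;
    case SurfaceFormat::B8G8R8X8: return D3DFMT_X8R8G8B8;
    case SurfaceFormat::A8:       return D3DFMT_A8;
    default:                      return D3DFMT_UNKNOWN;
  }
}
```

The CreateTexture call in the patch would then pass SurfaceFormatToD3D9Format(mFormat) as its format argument.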
Flags: needinfo?(cpearce)
Comment on attachment 8362842 [details] [diff] [review]
Part 3: Add a layers API to expose the 'effective' backend

Review of attachment 8362842 [details] [diff] [review]:
-----------------------------------------------------------------

r+ with that

::: gfx/layers/Layers.h
@@ +422,5 @@
> +   * Type of layers backend that will be used to composite this layer tree.
> +   * When compositing is done remotely, then this returns the layers type
> +   * of the compositor.
> +   */
> +  virtual LayersBackend GetEffectiveBackendType() { return GetBackendType(); }

I think "GetCompositorBackendType" would be a better name.
Attachment #8362842 - Flags: review?(roc) → review+
Depends on: 993619