Add AImageReader with MediaCodec support on Android
Categories: Core :: Graphics, enhancement, P3
People: Reporter: sotaro; Assigned: jnicol
References: Blocks 5 open bugs
Attachments: 8 obsolete files
We could get each video frame by using AImageReader, which is a wrapper around BufferItemConsumer. Chromium has used it since Android P, though it seems possible to enable it from Android O. In Chromium, ImageReaderGLOwner creates it.
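As a rough illustration of the idea, here is a minimal sketch of creating an AImageReader with the NDK (API 26+) and handing its ANativeWindow to a MediaCodec as the decoder's output surface. The helper name and the chosen usage/maxImages values are illustrative assumptions, not Gecko or Chromium code; error handling is trimmed.

```cpp
#include <media/NdkImageReader.h>
#include <media/NdkMediaCodec.h>
#include <android/hardware_buffer.h>

// Create a reader whose surface a MediaCodec decoder renders into.
AImageReader* CreateReaderForCodec(int32_t width, int32_t height,
                                   AMediaCodec* codec, AMediaFormat* format) {
  AImageReader* reader = nullptr;
  // AIMAGE_FORMAT_PRIVATE lets the producer (MediaCodec) choose an opaque
  // format; maxImages bounds how many frames may be acquired at once.
  media_status_t status = AImageReader_newWithUsage(
      width, height, AIMAGE_FORMAT_PRIVATE,
      AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE, /*maxImages=*/3, &reader);
  if (status != AMEDIA_OK) {
    return nullptr;
  }

  ANativeWindow* window = nullptr;
  AImageReader_getWindow(reader, &window);  // window is owned by the reader

  // Configure the decoder to render decoded frames into the reader's surface.
  AMediaCodec_configure(codec, format, window, /*crypto=*/nullptr, /*flags=*/0);
  return reader;
}
```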
Comment 1•5 years ago (Reporter)
Comment from bug 1639280 comment 32:

Maybe Chromium uses ImageReader only for Android P and later because Image::getHardwareBuffer() is available since that version. The document does mention that some use cases are not supported, and MediaCodec is one of them. However, the Android source code suggests that the HardwareBuffer is created using GraphicBuffer and should be compatible with MediaCodec.
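For reference, a minimal sketch of the getHardwareBuffer() path mentioned above, using the NDK equivalents (available since API 26 / Android O): acquire the latest decoded frame from an AImageReader and read its AHardwareBuffer. The helper name is an assumption for illustration; error handling is trimmed.

```cpp
#include <media/NdkImage.h>
#include <media/NdkImageReader.h>
#include <android/hardware_buffer.h>

// Returns the acquired AImage (or nullptr on failure). The caller samples the
// AHardwareBuffer (e.g. by binding it as an EGLImage) and must call
// AImage_delete() once rendering has finished, which returns the buffer slot
// to the reader.
AImage* AcquireLatestFrame(AImageReader* reader, AHardwareBuffer** outBuffer) {
  AImage* image = nullptr;
  if (AImageReader_acquireLatestImage(reader, &image) != AMEDIA_OK) {
    return nullptr;
  }
  // Available since API 26; the buffer stays valid while the AImage is held.
  if (AImage_getHardwareBuffer(image, outBuffer) != AMEDIA_OK) {
    AImage_delete(image);
    return nullptr;
  }
  return image;
}
```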
Comment 2•5 years ago (Reporter)
Hmm, the source code has no comment about MediaCodec, and Chromium's media source seems to use AImageReader. A limitation might exist; this needs more investigation.
- http://androidxref.com/9.0.0_r3/xref/frameworks/av/media/ndk/include/media/NdkImage.h#763
- https://source.chromium.org/chromium/chromium/src/+/master:media/gpu/android/video_frame_factory_impl.cc;l=103
- https://source.chromium.org/chromium/chromium/src/+/master:gpu/command_buffer/service/texture_owner.cc;l=43
- https://source.chromium.org/chromium/chromium/src/+/master:media/gpu/android/video_frame_factory_impl.cc;l=36
Comment 12•3 years ago
Sorry, there was a problem with the detection of inactive users. I'm reverting the change.
Comment 13•9 months ago (Reporter)
When I implemented the WIP patch, it worked well locally, but it caused out-of-file-descriptor crashes on some devices, and many CI test failures. The CI failures seemed related to the GeckoSurface implementation: at the time, GeckoSurface inherited from android Surface. Bug 1706656 changed it to no longer inherit from android Surface.
File descriptor usage seemed to increase when allocating a GeckoSurface for video decoding: passing an AHardwareBuffer from the content process to the GPU process/parent process, and passing an android Fence fd from the parent side back to the content process, both consume file descriptors. It therefore seems necessary to move the GeckoSurface allocation into the GPU process, to avoid delivering AHardwareBuffers over cross-process IPC, similar to hardware video decoding on Windows.
Gecko also needs a mechanism to limit the number of dequeued video buffers. Some hardware exhibited rendering problems when too many video buffers were dequeued from the video decoder. In Chromium, ImageReaderGLOwner handles this, and NumRequiredMaxImages() decides the maximum number of dequeued images.
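The buffer-limiting idea can be sketched as a simple counter that refuses to hand out more than a fixed number of in-flight images, mirroring the bound that Chromium's ImageReaderGLOwner enforces via maxImages. The names here (ImageBudget, kMaxDequeuedImages-style constructor argument) are illustrative, not Gecko or Chromium API.

```cpp
#include <cassert>

// Tracks how many decoded images are currently dequeued ("in flight") and
// refuses new acquisitions past a fixed budget.
class ImageBudget {
 public:
  explicit ImageBudget(int maxImages) : mMax(maxImages) {}

  // Try to reserve a slot for one more dequeued image. Returns false when the
  // budget is exhausted, signalling the decoder to wait.
  bool TryAcquire() {
    if (mInFlight >= mMax) {
      return false;
    }
    ++mInFlight;
    return true;
  }

  // Called when a frame has been released back to the decoder.
  void Release() {
    assert(mInFlight > 0);
    --mInFlight;
  }

  int InFlight() const { return mInFlight; }

 private:
  const int mMax;
  int mInFlight = 0;
};
```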
Comment 14•9 months ago (Reporter)
It seems necessary to move the codec proxy into the GPU process, and to change how java::CodecProxy::ReleaseOutput() is called, even when AImageReader is used.
Gecko calls CodecProxy::ReleaseOutput() from RemoteVideoDecoder in the content process. It enqueues a video frame to the SurfaceTexture when the frame is chosen to be forwarded to the GPU process via ImageContainer, so the SurfaceTexture can hold multiple enqueued video frames. That caused a problem with synchronizing video rendering in WebRender, since a SurfaceTexture can only render its front buffer (video frame). As a result, RenderAndroidSurfaceTextureHost became somewhat complex, and the frame-synchronization problem still exists, since there are cases where video frames are dropped at WebRenderImageHost.
If we want to support WebCodecs on Android, we do not want to enqueue multiple video frames to the SurfaceTexture; otherwise, we could not select the target video frame correctly.
Chromium actually does this: a video frame is pushed to the SurfaceTexture by calling CodecOutputBuffer::ReleaseToSurface().
There is a time gap from CodecOutputBuffer::ReleaseToSurface() until the video frame is available in the SurfaceTexture; CodecBufferWaitCoordinator::WaitForFrameAvailable() performs that wait.
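The wait described above can be modeled with a condition variable: the frame-available callback signals, and the consumer blocks with a timeout until a frame is pending. This is a minimal sketch of the pattern only; the class name FrameWaiter is illustrative and not Chromium's CodecBufferWaitCoordinator API.

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

class FrameWaiter {
 public:
  // Producer side: called from the frame-available callback once the frame
  // actually reaches the SurfaceTexture.
  void SignalFrameAvailable() {
    std::lock_guard<std::mutex> lock(mMutex);
    ++mPendingFrames;
    mCond.notify_one();
  }

  // Consumer side: block until a frame arrives or the timeout expires.
  // Returns false on timeout, covering the gap after ReleaseToSurface().
  bool WaitForFrameAvailable(std::chrono::milliseconds timeout) {
    std::unique_lock<std::mutex> lock(mMutex);
    bool ok = mCond.wait_for(lock, timeout,
                             [this] { return mPendingFrames > 0; });
    if (ok) {
      --mPendingFrames;
    }
    return ok;
  }

 private:
  std::mutex mMutex;
  std::condition_variable mCond;
  int mPendingFrames = 0;
};
```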
Comment 15•9 days ago
Just a note that our rewrite to use the C++ NDK APIs, and consequently run the Android decoding operations from the GPU process, has landed in https://bugzilla.mozilla.org/show_bug.cgi?id=1934009 and related bugs. It is enabled by default.