Closed
Bug 1033903
Opened 10 years ago
Closed 10 years ago
Use GraphicBuffer with android::MediaCodec to reduce the overhead of memcpy video data.
Categories: Core :: Audio/Video, defect
People: Reporter: brsun, Assigned: brsun
Attachments (1 file, 9 obsolete files):
27.76 KB, patch, brsun: review+
This is a follow-up to bug 904177. GraphicBuffer needs to be used with android::MediaCodec to reduce the overhead of memcpying video data.
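For reference, the zero-copy path roughly looks like the following (a sketch based on the MediaCodecProxy and GonkNativeWindow calls used in the patches below; variable names such as |track| and |bufferInfo| are illustrative):

sp<GraphicBuffer> graphicBuffer;
if (track.mNativeWindow != nullptr &&
    track.mCodec->getOutputGraphicBufferFromIndex(bufferInfo.mIndex, &graphicBuffer) == OK &&
    graphicBuffer != nullptr) {
  // Wrap the decoder-owned GraphicBuffer in a TextureClient instead of
  // memcpying the YUV planes into a VideoData::YCbCrBuffer.
  RefPtr<mozilla::layers::TextureClient> textureClient =
    track.mNativeWindow->getTextureClientFromBuffer(graphicBuffer.get());
  // VideoData::Create() can then take the TextureClient directly, so the frame
  // reaches the compositor without copying pixel data.
}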
Comment 1•10 years ago
There seem to be three options for getting a GraphicBuffer from MediaCodec. I wrote about them in Bug 1009420 Comment 14.
Comment 2•10 years ago
(In reply to Sotaro Ikeda [:sotaro PTO July/25 - Aug/3] from comment #1)
> There seem three option to get GraphicBuffer from MediaCodec. I wrote about
> it at Bug 1009420 Comment 14.
And also Bug 1009420 Comment 15.
Updated•10 years ago
feature-b2g: --- → 2.1
Comment 3•10 years ago
depends on bug 1009410.
Updated•10 years ago
Assignee: nobody → brsun
Target Milestone: --- → 2.1 S3 (29aug)
Comment 4•10 years ago
Postponing to the next version, because it will be turned off by default anyway. Marvin, if you have a strong preference to land it in 2.1, please feel free to change the feature-b2g flag back to 2.1. Thanks.
feature-b2g: 2.1 → 2.2?
Assignee
Comment 5•10 years ago
WIP: Use GraphicBuffer in MediaCodecReader. Need to investigate why video jitters.
Assignee
Comment 6•10 years ago
The reason video jitters in wip.v1 (attachment 8500915 [details] [diff] [review]) is that each TextureClient was released at the wrong time; that has been fixed in this patch.
Hi Sotaro and Blake, may I have your feedback regarding the usage of MediaCodec and GraphicBuffer?
Attachment #8500915 -
Attachment is obsolete: true
Attachment #8504010 -
Flags: feedback?(sotaro.ikeda.g)
Attachment #8504010 -
Flags: feedback?(bwu)
Comment 7•10 years ago
Comment on attachment 8504010 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.wip.v2.patch
Review of attachment 8504010 [details] [diff] [review]:
-----------------------------------------------------------------
Basically good! But Fence handling seems to be lacking in the patch.
::: content/media/omx/MediaCodecReader.cpp
@@ +372,5 @@
> }
> }
>
> void
> +MediaCodecReader::DispatchTextureClientRecycleTask(mozilla::layers::TextureClient* aClient)
Who uses this function?
@@ +781,5 @@
> + }
> +}
> +
> +void
> +MediaCodecReader::RecycleTextureClient(mozilla::layers::TextureClient* aClient)
This function does not seem to handle android::Fence. On master flame-kk, when I touch the display during YouTube video playback, the video rendering flickers. It seems to be caused by the missing fence handling.
If this function can take a long time, the actual work needs to be done on a different thread. This function can be called on the ImageBridge thread, and we must not block the ImageBridge thread for long.
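A minimal sketch of that suggestion, assuming the DispatchTextureClientRecycleTask() helper added by this patch is wired up as the recycle callback:

/* static */ void
MediaCodecReader::TextureClientRecycleCallback(mozilla::layers::TextureClient* aClient,
                                               void* aClosure)
{
  MediaCodecReader* reader = static_cast<MediaCodecReader*>(aClosure);
  if (reader != nullptr) {
    // This callback can run on the ImageBridge thread, so only post a task
    // here; the (potentially slow) releaseOutputBuffer() call and fence wait
    // then happen on the video task queue instead.
    reader->DispatchTextureClientRecycleTask(aClient);
  }
}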
@@ +793,5 @@
> + mTextureClientIndexes.erase(it);
> + }
> +}
> +
> +void MediaCodecReader::RecycleAllTextureClients()
Normally our code should not release buffers here. This code should only be there to prevent an application crash on a release ROM. Even when buffers are returned this way, they could still be in use somewhere in Gecko. On a debug build, if there is still a buffer that has not been released yet, the app should be aborted to detect the problem.
Attachment #8504010 -
Flags: feedback?(sotaro.ikeda.g)
Comment 8•10 years ago
> @@ +793,5 @@
> > + mTextureClientIndexes.erase(it);
> > + }
> > +}
> > +
> > +void MediaCodecReader::RecycleAllTextureClients()
>
> Normally our code should not release buffer here. This code should be just
> to prevent application crash on release rom. Even when buffers are returned
> by this way, the buffers could be used somewhere in gecko. On debug build,
> if there is sill a buffer that is not released yet, the app should be
> aborted to detect the problem.
I added similar code to OmxDecoder. It is there just to prevent an application crash on a release ROM.
Comment 9•10 years ago
Comment on attachment 8504010 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.wip.v2.patch
Review of attachment 8504010 [details] [diff] [review]:
-----------------------------------------------------------------
It looks good to me overall. Previously I was not sure whether a fence is required, but Sotaro has pointed out that it is. Good to know!
::: content/media/omx/MediaCodecReader.cpp
@@ +776,5 @@
> + void* aClosure)
> +{
> + MediaCodecReader* reader = static_cast<MediaCodecReader*>(aClosure);
> + if (reader != nullptr) {
> + reader->RecycleTextureClient(aClient);
Do you intend to use DispatchTextureClientRecycleTask here, rather than calling RecycleTextureClient directly? If so, it could avoid the problem Sotaro mentioned, where a long-running RecycleTextureClient blocks other work.
Updated•10 years ago
Attachment #8504010 -
Flags: feedback?(bwu)
Assignee
Comment 10•10 years ago
After trying and testing over the last few days, I found some strange things that I haven't figured out yet:
- If I defer |cancelBuffer()| and |releaseOutputBuffer()| to a thread other than the one that calls |RecycleCallback()|, the |TextureClient*| carried in |RecycleCallback()| may not arrive in the same order as |dequeueOutputBuffer()|. This is really weird. (e.g. dequeue buffers no.0, no.1, no.2, no.3; recycle buffers no.1, no.2, no.0, no.3)
- When the above situation happens, |releaseOutputBuffer()| always blocks, no matter how we release the |TextureClient*|s:
  = released in the same order as |RecycleCallback()| (e.g. blocks on buffer no.1)
  = released in the same order as |dequeueOutputBuffer()| (e.g. blocks on buffer no.0, no matter whether no.1 and no.2 have been recycled)
I am going to finish the basic integration of |GraphicBuffer| in this bug, and separate the effort of deferring |cancelBuffer()| and |releaseOutputBuffer()| into a follow-up bug.
Assignee
Comment 12•10 years ago
(In reply to Sotaro Ikeda [:sotaro] from comment #11)
> brsun, which patch cause Comment 10?
Oops, I've rolled back my code. I can regenerate the patch and upload it later.
Assignee
Comment 13•10 years ago
Use FenceHandle for ANativeWindow::cancelBuffer().
BTW, the video flickering issue still exists...
Attachment #8511812 -
Flags: feedback?(sotaro.ikeda.g)
Assignee
Comment 14•10 years ago
Comment on attachment 8504010 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.wip.v2.patch
># HG changeset patch
># User Bruce Sun <brsun@mozilla.com>
># Date 1413196713 -28800
># Mon Oct 13 18:38:33 2014 +0800
># Node ID 7289b2188166951bfb00b42ef1f7a88aa376da68
># Parent c46a43412994bea324b9c5f49d832b3836efb79c
>Bug 1033903: Support GraphicBuffer in MediaCodecReader.
>
>diff --git a/content/media/omx/MediaCodecReader.cpp b/content/media/omx/MediaCodecReader.cpp
>--- a/content/media/omx/MediaCodecReader.cpp
>+++ b/content/media/omx/MediaCodecReader.cpp
>@@ -6,16 +6,18 @@
>
> #include "MediaCodecReader.h"
>
> #include <OMX_IVCommon.h>
>
> #include <gui/Surface.h>
> #include <ICrypto.h>
>
>+#include "GonkNativeWindow.h"
>+
> #include <stagefright/foundation/ABuffer.h>
> #include <stagefright/foundation/ADebug.h>
> #include <stagefright/foundation/ALooper.h>
> #include <stagefright/foundation/AMessage.h>
> #include <stagefright/MediaBuffer.h>
> #include <stagefright/MediaCodec.h>
> #include <stagefright/MediaDefs.h>
> #include <stagefright/MediaExtractor.h>
>@@ -125,18 +127,19 @@ MediaCodecReader::TrackInputCopier::Copy
> aCodecBuffer->setRange(0, aSourceBuffer->range_length());
> memcpy(aCodecBuffer->data(),
> (uint8_t*)aSourceBuffer->data() + aSourceBuffer->range_offset(),
> aSourceBuffer->range_length());
>
> return true;
> }
>
>-MediaCodecReader::Track::Track()
>- : mSourceIsStopped(true)
>+MediaCodecReader::Track::Track(Type type)
>+ : mType(type)
>+ , mSourceIsStopped(true)
> , mDurationLock("MediaCodecReader::Track::mDurationLock")
> , mDurationUs(INT64_C(0))
> , mInputIndex(sInvalidInputIndex)
> , mInputEndOfStream(false)
> , mOutputEndOfStream(false)
> , mSeekTimeUs(sInvalidTimestampUs)
> , mFlushed(false)
> , mDiscontinuity(false)
>@@ -168,21 +171,23 @@ MediaCodecReader::VorbisInputCopier::Cop
> aSourceBuffer->range_length());
> memcpy(aCodecBuffer->data() + aSourceBuffer->range_length(),
> &numPageSamples, sizeof(numPageSamples));
>
> return true;
> }
>
> MediaCodecReader::AudioTrack::AudioTrack()
>+ : Track(kAudio)
> {
> }
>
> MediaCodecReader::VideoTrack::VideoTrack()
>- : mWidth(0)
>+ : Track(kVideo)
>+ , mWidth(0)
> , mHeight(0)
> , mStride(0)
> , mSliceHeight(0)
> , mColorFormat(0)
> , mRotation(0)
> {
> }
>
>@@ -274,23 +279,24 @@ MediaCodecReader::ProcessCachedDataTask:
> nsRefPtr<ReferenceKeeperRunnable<MediaCodecReader>> runnable(
> new ReferenceKeeperRunnable<MediaCodecReader>(mReader));
> mReader = nullptr;
> NS_DispatchToMainThread(runnable.get());
> }
>
> MediaCodecReader::MediaCodecReader(AbstractMediaDecoder* aDecoder)
> : MediaOmxCommonReader(aDecoder)
>+ , mExtractor(nullptr)
>+ , mIsWaitingResources(false)
>+ , mTextureClientIndexesLock("MediaCodecReader::mTextureClientIndexesLock")
> , mColorConverterBufferSize(0)
>- , mExtractor(nullptr)
> , mParserMonitor("MediaCodecReader::mParserMonitor")
> , mParseDataFromCache(true)
> , mNextParserPosition(INT64_C(0))
> , mParsedDataLength(INT64_C(0))
>- , mIsWaitingResources(false)
> {
> mHandler = new MessageHandler(this);
> mVideoListener = new VideoResourceListener(this);
> }
>
> MediaCodecReader::~MediaCodecReader()
> {
> MOZ_ASSERT(NS_IsMainThread(), "Should be on main thread.");
>@@ -362,16 +368,28 @@ MediaCodecReader::DispatchVideoTask(int6
> NS_NewRunnableMethodWithArg<int64_t>(this,
> &MediaCodecReader::DecodeVideoFrameTask,
> aTimeThreshold);
> mVideoTrack.mTaskQueue->Dispatch(task);
> }
> }
>
> void
>+MediaCodecReader::DispatchTextureClientRecycleTask(mozilla::layers::TextureClient* aClient)
>+{
>+ if (mVideoTrack.mTaskQueue) {
>+ RefPtr<nsIRunnable> task =
>+ NS_NewRunnableMethodWithArg<mozilla::layers::TextureClient*>(this,
>+ &MediaCodecReader::RecycleTextureClient,
>+ aClient);
>+ mVideoTrack.mTaskQueue->Dispatch(task);
>+ }
>+}
>+
>+void
> MediaCodecReader::RequestAudioData()
> {
> MOZ_ASSERT(GetTaskQueue()->IsCurrentThreadIn());
> MOZ_ASSERT(HasAudio());
> if (CheckAudioResources()) {
> DispatchAudioTask();
> }
> }
>@@ -740,23 +758,60 @@ MediaCodecReader::ResetDecode()
> {
> if (CheckAudioResources()) {
> mAudioTrack.mTaskQueue->Flush();
> FlushCodecData(mAudioTrack);
> mAudioTrack.mDiscontinuity = true;
> }
> if (CheckVideoResources()) {
> mVideoTrack.mTaskQueue->Flush();
>+ RecycleAllTextureClients();
> FlushCodecData(mVideoTrack);
> mVideoTrack.mDiscontinuity = true;
> }
>
> return MediaDecoderReader::ResetDecode();
> }
>
>+void
>+MediaCodecReader::TextureClientRecycleCallback(mozilla::layers::TextureClient* aClient,
>+ void* aClosure)
>+{
>+ MediaCodecReader* reader = static_cast<MediaCodecReader*>(aClosure);
>+ if (reader != nullptr) {
>+ reader->RecycleTextureClient(aClient);
>+ }
>+}
>+
>+void
>+MediaCodecReader::RecycleTextureClient(mozilla::layers::TextureClient* aClient)
>+{
>+ MutexAutoLock al(mTextureClientIndexesLock);
>+ std::map<mozilla::layers::TextureClient*, size_t>::iterator it = mTextureClientIndexes.find(aClient);
>+ if (it != mTextureClientIndexes.end()) {
>+ if (mVideoTrack.mCodec != nullptr) {
>+ mVideoTrack.mCodec->releaseOutputBuffer((*it).second);
>+ }
>+ mTextureClientIndexes.erase(it);
>+ }
>+}
>+
>+void MediaCodecReader::RecycleAllTextureClients()
>+{
>+ MutexAutoLock al(mTextureClientIndexesLock);
>+ if (mVideoTrack.mCodec != nullptr) {
>+ for (std::map<mozilla::layers::TextureClient*, size_t>::iterator it = mTextureClientIndexes.begin();
>+ it != mTextureClientIndexes.end();
>+ ++it) {
>+ mVideoTrack.mCodec->releaseOutputBuffer((*it).second);
>+ }
>+ }
>+ mTextureClientIndexes.clear();
>+}
>+
> bool
> MediaCodecReader::DecodeVideoFrameSync(int64_t aTimeThreshold)
> {
> if (mVideoTrack.mCodec == nullptr || !mVideoTrack.mCodec->allocated() ||
> mVideoTrack.mOutputEndOfStream) {
> return false;
> }
>
>@@ -790,99 +845,123 @@ MediaCodecReader::DecodeVideoFrameSync(i
> return false;
> }
> } else {
> return false;
> }
> }
>
> bool result = false;
>- if (bufferInfo.mBuffer != nullptr && bufferInfo.mSize > 0 &&
>- bufferInfo.mBuffer->data() != nullptr) {
>- uint8_t* yuv420p_buffer = bufferInfo.mBuffer->data();
>- int32_t stride = mVideoTrack.mStride;
>- int32_t slice_height = mVideoTrack.mSliceHeight;
>-
>- // Converts to OMX_COLOR_FormatYUV420Planar
>- if (mVideoTrack.mColorFormat != OMX_COLOR_FormatYUV420Planar) {
>- ARect crop;
>- crop.top = 0;
>- crop.bottom = mVideoTrack.mHeight;
>- crop.left = 0;
>- crop.right = mVideoTrack.mWidth;
>-
>- yuv420p_buffer = GetColorConverterBuffer(mVideoTrack.mWidth,
>- mVideoTrack.mHeight);
>- if (mColorConverter.convertDecoderOutputToI420(
>- bufferInfo.mBuffer->data(), mVideoTrack.mWidth, mVideoTrack.mHeight,
>- crop, yuv420p_buffer) != OK) {
>- mVideoTrack.mCodec->releaseOutputBuffer(bufferInfo.mIndex);
>- NS_WARNING("Unable to convert color format");
>- return false;
>- }
>-
>- stride = mVideoTrack.mWidth;
>- slice_height = mVideoTrack.mHeight;
>- }
>-
>- size_t yuv420p_y_size = stride * slice_height;
>- size_t yuv420p_u_size = ((stride + 1) / 2) * ((slice_height + 1) / 2);
>- uint8_t* yuv420p_y = yuv420p_buffer;
>- uint8_t* yuv420p_u = yuv420p_y + yuv420p_y_size;
>- uint8_t* yuv420p_v = yuv420p_u + yuv420p_u_size;
>-
>+ VideoData *v = nullptr;
>+ RefPtr<mozilla::layers::TextureClient> textureClient;
>+ if (bufferInfo.mBuffer != nullptr) {
> // This is the approximate byte position in the stream.
> int64_t pos = mDecoder->GetResource()->Tell();
>
>- VideoData::YCbCrBuffer b;
>- b.mPlanes[0].mData = yuv420p_y;
>- b.mPlanes[0].mWidth = mVideoTrack.mWidth;
>- b.mPlanes[0].mHeight = mVideoTrack.mHeight;
>- b.mPlanes[0].mStride = stride;
>- b.mPlanes[0].mOffset = 0;
>- b.mPlanes[0].mSkip = 0;
>+ sp<GraphicBuffer> graphicBuffer;
>+ if (mVideoTrack.mNativeWindow != nullptr &&
>+ mVideoTrack.mCodec->getOutputGraphicBufferFromIndex(bufferInfo.mIndex, &graphicBuffer) == OK &&
>+ graphicBuffer != nullptr) {
>+ textureClient = mVideoTrack.mNativeWindow->getTextureClientFromBuffer(graphicBuffer.get());
>+ v = VideoData::Create(mInfo.mVideo,
>+ mDecoder->GetImageContainer(),
>+ pos,
>+ bufferInfo.mTimeUs,
>+ 1, // We don't know the duration.
>+ textureClient,
>+ bufferInfo.mFlags & MediaCodec::BUFFER_FLAG_SYNCFRAME,
>+ -1,
>+ mVideoTrack.mRelativePictureRect);
>+ } else if (bufferInfo.mSize > 0 &&
>+ bufferInfo.mBuffer->data() != nullptr) {
>+ uint8_t* yuv420p_buffer = bufferInfo.mBuffer->data();
>+ int32_t stride = mVideoTrack.mStride;
>+ int32_t slice_height = mVideoTrack.mSliceHeight;
>
>- b.mPlanes[1].mData = yuv420p_u;
>- b.mPlanes[1].mWidth = (mVideoTrack.mWidth + 1) / 2;
>- b.mPlanes[1].mHeight = (mVideoTrack.mHeight + 1) / 2;
>- b.mPlanes[1].mStride = (stride + 1) / 2;
>- b.mPlanes[1].mOffset = 0;
>- b.mPlanes[1].mSkip = 0;
>+ // Converts to OMX_COLOR_FormatYUV420Planar
>+ if (mVideoTrack.mColorFormat != OMX_COLOR_FormatYUV420Planar) {
>+ ARect crop;
>+ crop.top = 0;
>+ crop.bottom = mVideoTrack.mHeight;
>+ crop.left = 0;
>+ crop.right = mVideoTrack.mWidth;
>
>- b.mPlanes[2].mData = yuv420p_v;
>- b.mPlanes[2].mWidth =(mVideoTrack.mWidth + 1) / 2;
>- b.mPlanes[2].mHeight = (mVideoTrack.mHeight + 1) / 2;
>- b.mPlanes[2].mStride = (stride + 1) / 2;
>- b.mPlanes[2].mOffset = 0;
>- b.mPlanes[2].mSkip = 0;
>+ yuv420p_buffer = GetColorConverterBuffer(mVideoTrack.mWidth,
>+ mVideoTrack.mHeight);
>+ if (mColorConverter.convertDecoderOutputToI420(
>+ bufferInfo.mBuffer->data(), mVideoTrack.mWidth, mVideoTrack.mHeight,
>+ crop, yuv420p_buffer) != OK) {
>+ mVideoTrack.mCodec->releaseOutputBuffer(bufferInfo.mIndex);
>+ NS_WARNING("Unable to convert color format");
>+ return false;
>+ }
>
>- VideoData *v = VideoData::Create(
>- mInfo.mVideo,
>- mDecoder->GetImageContainer(),
>- pos,
>- bufferInfo.mTimeUs,
>- 1, // We don't know the duration.
>- b,
>- bufferInfo.mFlags & MediaCodec::BUFFER_FLAG_SYNCFRAME,
>- -1,
>- mVideoTrack.mRelativePictureRect);
>+ stride = mVideoTrack.mWidth;
>+ slice_height = mVideoTrack.mHeight;
>+ }
>+
>+ size_t yuv420p_y_size = stride * slice_height;
>+ size_t yuv420p_u_size = ((stride + 1) / 2) * ((slice_height + 1) / 2);
>+ uint8_t* yuv420p_y = yuv420p_buffer;
>+ uint8_t* yuv420p_u = yuv420p_y + yuv420p_y_size;
>+ uint8_t* yuv420p_v = yuv420p_u + yuv420p_u_size;
>+
>+ VideoData::YCbCrBuffer b;
>+ b.mPlanes[0].mData = yuv420p_y;
>+ b.mPlanes[0].mWidth = mVideoTrack.mWidth;
>+ b.mPlanes[0].mHeight = mVideoTrack.mHeight;
>+ b.mPlanes[0].mStride = stride;
>+ b.mPlanes[0].mOffset = 0;
>+ b.mPlanes[0].mSkip = 0;
>+
>+ b.mPlanes[1].mData = yuv420p_u;
>+ b.mPlanes[1].mWidth = (mVideoTrack.mWidth + 1) / 2;
>+ b.mPlanes[1].mHeight = (mVideoTrack.mHeight + 1) / 2;
>+ b.mPlanes[1].mStride = (stride + 1) / 2;
>+ b.mPlanes[1].mOffset = 0;
>+ b.mPlanes[1].mSkip = 0;
>+
>+ b.mPlanes[2].mData = yuv420p_v;
>+ b.mPlanes[2].mWidth =(mVideoTrack.mWidth + 1) / 2;
>+ b.mPlanes[2].mHeight = (mVideoTrack.mHeight + 1) / 2;
>+ b.mPlanes[2].mStride = (stride + 1) / 2;
>+ b.mPlanes[2].mOffset = 0;
>+ b.mPlanes[2].mSkip = 0;
>+
>+ v = VideoData::Create(mInfo.mVideo,
>+ mDecoder->GetImageContainer(),
>+ pos,
>+ bufferInfo.mTimeUs,
>+ 1, // We don't know the duration.
>+ b,
>+ bufferInfo.mFlags & MediaCodec::BUFFER_FLAG_SYNCFRAME,
>+ -1,
>+ mVideoTrack.mRelativePictureRect);
>+ }
>
> if (v) {
> result = true;
> VideoQueue().Push(v);
> } else {
> NS_WARNING("Unable to create VideoData");
> }
> }
>
> if ((bufferInfo.mFlags & MediaCodec::BUFFER_FLAG_EOS) ||
> (status == ERROR_END_OF_STREAM)) {
> VideoQueue().Finish();
> }
>- mVideoTrack.mCodec->releaseOutputBuffer(bufferInfo.mIndex);
>+
>+ if (v != nullptr && textureClient != nullptr && result) {
>+ MutexAutoLock al(mTextureClientIndexesLock);
>+ mTextureClientIndexes[textureClient.get()] = bufferInfo.mIndex;
>+ textureClient->SetRecycleCallback(MediaCodecReader::TextureClientRecycleCallback, this);
>+ } else {
>+ mVideoTrack.mCodec->releaseOutputBuffer(bufferInfo.mIndex);
>+ }
>
> return result;
> }
>
> nsresult
> MediaCodecReader::Seek(int64_t aTime,
> int64_t aStartTime,
> int64_t aEndTime,
>@@ -1196,79 +1275,92 @@ MediaCodecReader::CreateMediaCodec(sp<AL
> }
>
> if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_VORBIS)) {
> aTrack.mInputCopier = new VorbisInputCopier;
> } else {
> aTrack.mInputCopier = new TrackInputCopier;
> }
>
>+ uint32_t capability = MediaCodecProxy::kEmptyCapability;
>+ if (aTrack.mType == Track::kVideo &&
>+ aTrack.mCodec->getCapability(&capability) == OK &&
>+ (capability & MediaCodecProxy::kCanExposeGraphicBuffer) == MediaCodecProxy::kCanExposeGraphicBuffer) {
>+ aTrack.mNativeWindow = new GonkNativeWindow();
>+ }
>+
> if (!aAsync) {
> // Pending configure() and start() to codecReserved() if the creation
> // should be asynchronous.
> if (!aTrack.mCodec->allocated() || !ConfigureMediaCodec(aTrack)){
> NS_WARNING("Couldn't create and configure MediaCodec synchronously");
>- aTrack.mCodec = nullptr;
>+ DestroyMediaCodec(aTrack);
> return false;
> }
> }
> }
>
> return true;
> }
>
> bool
> MediaCodecReader::ConfigureMediaCodec(Track& aTrack)
> {
> if (aTrack.mSource != nullptr && aTrack.mCodec != nullptr) {
> if (!aTrack.mCodec->allocated()) {
> return false;
> }
>
>+ sp<Surface> surface;
>+ if (aTrack.mNativeWindow != nullptr) {
>+ surface = new Surface(aTrack.mNativeWindow->getBufferQueue());
>+ }
>+
> sp<MetaData> sourceFormat = aTrack.mSource->getFormat();
> sp<AMessage> codecFormat;
> convertMetaDataToMessage(sourceFormat, &codecFormat);
>
> bool allpass = true;
>- if (allpass && aTrack.mCodec->configure(codecFormat, nullptr, nullptr, 0) != OK) {
>+ if (allpass && aTrack.mCodec->configure(codecFormat, surface, nullptr, 0) != OK) {
> NS_WARNING("Couldn't configure MediaCodec");
> allpass = false;
> }
> if (allpass && aTrack.mCodec->start() != OK) {
> NS_WARNING("Couldn't start MediaCodec");
> allpass = false;
> }
> if (allpass && aTrack.mCodec->getInputBuffers(&aTrack.mInputBuffers) != OK) {
> NS_WARNING("Couldn't get input buffers from MediaCodec");
> allpass = false;
> }
> if (allpass && aTrack.mCodec->getOutputBuffers(&aTrack.mOutputBuffers) != OK) {
> NS_WARNING("Couldn't get output buffers from MediaCodec");
> allpass = false;
> }
> if (!allpass) {
>- aTrack.mCodec = nullptr;
>+ DestroyMediaCodec(aTrack);
> return false;
> }
> }
>
> return true;
> }
>
> void
> MediaCodecReader::DestroyMediaCodecs()
> {
>- DestroyMediaCodecs(mAudioTrack);
>- DestroyMediaCodecs(mVideoTrack);
>+ DestroyMediaCodec(mAudioTrack);
>+ DestroyMediaCodec(mVideoTrack);
> }
>
> void
>-MediaCodecReader::DestroyMediaCodecs(Track& aTrack)
>+MediaCodecReader::DestroyMediaCodec(Track& aTrack)
> {
> aTrack.mCodec = nullptr;
>+ aTrack.mNativeWindow = nullptr;
> }
>
> bool
> MediaCodecReader::TriggerIncrementalParser()
> {
> if (mMetaData == nullptr) {
> return false;
> }
>@@ -1654,17 +1746,16 @@ MediaCodecReader::GetCodecOutputData(Tra
> break;
> } else {
> aTrack.mCodec->releaseOutputBuffer(info.mIndex);
> }
> } else if (status == INFO_OUTPUT_BUFFERS_CHANGED) {
> // Update output buffers of MediaCodec.
> if (aTrack.mCodec->getOutputBuffers(&aTrack.mOutputBuffers) != OK) {
> NS_WARNING("Couldn't get output buffers from MediaCodec");
>- aTrack.mCodec = nullptr;
> return UNKNOWN_ERROR;
> }
> }
>
> if (TimeStamp::Now() > aTimeout) {
> // Don't let this loop run for too long. Try it again later.
> return -EAGAIN;
> }
>@@ -1770,32 +1861,32 @@ MediaCodecReader::onMessageReceived(cons
> }
> }
>
> // Called on Binder thread.
> void
> MediaCodecReader::codecReserved(Track& aTrack)
> {
> if (!ConfigureMediaCodec(aTrack)) {
>- DestroyMediaCodecs(aTrack);
>+ DestroyMediaCodec(aTrack);
> return;
> }
>
> if (mHandler != nullptr) {
> // post kNotifyCodecReserved to MediaCodecReader::mLooper thread.
> sp<AMessage> notify = new AMessage(kNotifyCodecReserved, mHandler->id());
> notify->post();
> }
> }
>
> // Called on Binder thread.
> void
> MediaCodecReader::codecCanceled(Track& aTrack)
> {
>- DestroyMediaCodecs(aTrack);
>+ DestroyMediaCodec(aTrack);
>
> if (mHandler != nullptr) {
> // post kNotifyCodecCanceled to MediaCodecReader::mLooper thread.
> sp<AMessage> notify = new AMessage(kNotifyCodecCanceled, mHandler->id());
> notify->post();
> }
> }
>
>diff --git a/content/media/omx/MediaCodecReader.h b/content/media/omx/MediaCodecReader.h
>--- a/content/media/omx/MediaCodecReader.h
>+++ b/content/media/omx/MediaCodecReader.h
>@@ -16,31 +16,39 @@
> #include <mozilla/Monitor.h>
>
> #include "MediaData.h"
>
> #include "I420ColorConverterHelper.h"
> #include "MediaCodecProxy.h"
> #include "MediaOmxCommonReader.h"
>
>+#include <map>
>+
> namespace android {
> struct ALooper;
> struct AMessage;
>
> class MOZ_EXPORT MediaExtractor;
> class MOZ_EXPORT MetaData;
> class MOZ_EXPORT MediaBuffer;
> struct MOZ_EXPORT MediaSource;
>+
>+class GonkNativeWindow;
> } // namespace android
>
> namespace mozilla {
>
> class MediaTaskQueue;
> class MP3FrameParser;
>
>+namespace layers {
>+class TextureClient;
>+} // namespace mozilla::layers
>+
> class MediaCodecReader : public MediaOmxCommonReader
> {
> public:
> MediaCodecReader(AbstractMediaDecoder* aDecoder);
> virtual ~MediaCodecReader();
>
> // Initializes the reader, returns NS_OK on success, or NS_ERROR_FAILURE
> // on failure.
>@@ -102,24 +110,34 @@ protected:
> struct TrackInputCopier
> {
> virtual bool Copy(android::MediaBuffer* aSourceBuffer,
> android::sp<android::ABuffer> aCodecBuffer);
> };
>
> struct Track
> {
>- Track();
>+ enum Type
>+ {
>+ kUnknown = 0,
>+ kAudio,
>+ kVideo,
>+ };
>+
>+ Track(Type type=kUnknown);
>+
>+ const Type mType;
>
> // pipeline parameters
> android::sp<android::MediaSource> mSource;
> bool mSourceIsStopped;
> android::sp<android::MediaCodecProxy> mCodec;
> android::Vector<android::sp<android::ABuffer> > mInputBuffers;
> android::Vector<android::sp<android::ABuffer> > mOutputBuffers;
>+ android::sp<android::GonkNativeWindow> mNativeWindow;
>
> // pipeline copier
> nsAutoPtr<TrackInputCopier> mInputCopier;
>
> // media parameters
> Mutex mDurationLock; // mDurationUs might be read or updated from multiple
> // threads.
> int64_t mDurationUs;
>@@ -365,26 +383,27 @@ private:
>
> bool CreateMediaCodecs();
> static bool CreateMediaCodec(android::sp<android::ALooper>& aLooper,
> Track& aTrack,
> bool aAsync,
> android::wp<android::MediaCodecProxy::CodecResourceListener> aListener);
> static bool ConfigureMediaCodec(Track& aTrack);
> void DestroyMediaCodecs();
>- static void DestroyMediaCodecs(Track& aTrack);
>+ static void DestroyMediaCodec(Track& aTrack);
>
> bool CreateTaskQueues();
> void ShutdownTaskQueues();
> bool DecodeVideoFrameTask(int64_t aTimeThreshold);
> bool DecodeVideoFrameSync(int64_t aTimeThreshold);
> bool DecodeAudioDataTask();
> bool DecodeAudioDataSync();
> void DispatchVideoTask(int64_t aTimeThreshold);
> void DispatchAudioTask();
>+ void DispatchTextureClientRecycleTask(mozilla::layers::TextureClient* aClient);
> inline bool CheckVideoResources() {
> return (HasVideo() && mVideoTrack.mSource != nullptr &&
> mVideoTrack.mTaskQueue);
> }
>
> inline bool CheckAudioResources() {
> return (HasAudio() && mAudioTrack.mSource != nullptr &&
> mAudioTrack.mTaskQueue);
>@@ -408,22 +427,29 @@ private:
> void ClearColorConverterBuffer();
>
> int64_t ProcessCachedData(int64_t aOffset,
> nsRefPtr<SignalObject> aSignal);
> bool ParseDataSegment(const char* aBuffer,
> uint32_t aLength,
> int64_t aOffset);
>
>+ static void TextureClientRecycleCallback(mozilla::layers::TextureClient* aClient, void* aClosure);
>+ void RecycleTextureClient(mozilla::layers::TextureClient* aClient);
>+ void RecycleAllTextureClients();
>+
> android::sp<MessageHandler> mHandler;
> android::sp<VideoResourceListener> mVideoListener;
>
> android::sp<android::ALooper> mLooper;
> android::sp<android::MetaData> mMetaData;
>
>+ Mutex mTextureClientIndexesLock;
>+ std::map<mozilla::layers::TextureClient*, size_t> mTextureClientIndexes;
>+
> // media tracks
> AudioTrack mAudioTrack;
> VideoTrack mVideoTrack;
> AudioTrack mAudioOffloadTrack; // only Track::mSource is valid
>
> // color converter
> android::I420ColorConverterHelper mColorConverter;
> nsAutoArrayPtr<uint8_t> mColorConverterBuffer;
Attachment #8504010 -
Attachment is obsolete: true
Flags: needinfo?(brsun)
Assignee
Comment 15•10 years ago
This is the patch that defers cancelBuffer() and releaseOutputBuffer() onto another thread.
releaseOutputBuffer() sometimes blocks, as described in comment 10.
Attachment #8511813 -
Flags: feedback?(sotaro.ikeda.g)
Assignee
Comment 16•10 years ago
(In reply to Bruce Sun [:brsun] from comment #14)
Well... please ignore comment 14.
I just wanted to mark attachment 8504010 [details] [diff] [review] as obsolete, but ended up polluting the comments instead.
Comment 17•10 years ago
The current way of sending the fence to ANativeWindow (GonkBufferQueue) seems to have a problem. It appears to work only for OMXCodec.
With OMXCodec, the MediaBuffer is rendered via OMXCodec's owner. Therefore, when a MediaBuffer carries the kKeyRendered metadata, OMXCodec assumes the MediaBuffer was rendered by the client and that the GraphicBuffer is owned by ANativeWindow. OmxDecoder's ANativeWindow::cancelBuffer() call moves the buffer ownership to ANativeWindow, and ANativeWindow::cancelBuffer() is then not called again in OMXCodec::signalBufferReturned(MediaBuffer *buffer), which runs when the MediaBuffer's reference count reaches 0. So the ownership status stays consistent between ANativeWindow and OMXCodec.
http://androidxref.com/4.4.4_r1/xref/frameworks/av/media/libstagefright/OMXCodec.cpp#3907
But with MediaCodec, all buffers are rendered via MediaCodec, and no such trick seems to exist. The current code has MediaCodecReader call ANativeWindow::cancelBuffer() and then MediaCodec::releaseOutputBuffer(). In the MediaCodec::releaseOutputBuffer() case, the buffer is not returned to ANativeWindow.
MediaCodec::releaseOutputBuffer()
  -> new AMessage(kWhatReleaseOutputBuffer, id())
  -> MediaCodec::PostAndAwaitResponse()
  // scheduler
  -> MediaCodec::onReleaseOutputBuffer()
     // The buffer is re-used by MediaCodec; it is not returned to ANativeWindow.
  -> AMessage::post()
Therefore the fence is not handled at all for re-used buffers. By the way, when MediaCodec::renderOutputBufferAndRelease() is called, the buffer is sent to ANativeWindow by queueBuffer().
It seems that we need to find a way to handle the fence correctly.
Comment 18•10 years ago
In the current code, the buffer ownership status between ACodec and ANativeWindow seems to become inconsistent.
Updated•10 years ago
Attachment #8511812 -
Flags: feedback?(sotaro.ikeda.g)
Updated•10 years ago
Attachment #8511813 -
Flags: feedback?(sotaro.ikeda.g)
Comment 19•10 years ago
(In reply to Bruce Sun [:brsun] from comment #10)
> After trying and testing during these days, I found some strange things that
> I haven't figured out yet:
> - If I defer |cancelBuffer()| and |releaseOutputBuffer()| to another thread
> than the one calls |RecycleCallback()|, the carried |TextureClient*| in
> |RecycleCallback()| may not in the same order as |dequeueOutputBuffer()|.
> This is really weird. (ex. dequeue no.0, no.1, no.2, no.3 buffers, recycle
> no.1, no.2, no.0, no.3 buffers)
> - When the above situation happens, |releaseOutputBuffer()| always blocks
> no matter how we release |TextureClient*|:
> = release in the same order of |RecycleCallback()| (ex. blocks on no.1
> buffer)
> = release in the same order of |dequeueOutputBuffer()| (ex. blocks on no.0
> buffer no matter no.1 and no.2 have been recycled or not)
>
> I am going to finish the basic integration of |GraphicBuffer| on this bug,
> and to separate the effort of deferring |cancelBuffer()| and
> |releaseOutputBuffer()| into another follow-up bug.
I feel that there should be no problem when the RecycleCallback() and dequeueOutputBuffer() buffer orders are different. Do you know why the problem happens? Comment 10 does not seem to explain why the blocking happens.
Assignee
Comment 20•10 years ago
(In reply to Sotaro Ikeda [:sotaro] from comment #19)
> I feel that there seems no problem when the RecycleCallback() and
> dequeueOutputBuffer()'s buffer order becomes different. Do you know why the
> problem happens? Comment 10 seems not explain about why block happens.
You are right. I haven't dug into the root cause of the blocking yet.
I was just curious why the orders are different, and I want to divide this bug into finer-grained pieces.
Comment 21•10 years ago
One possibility is the following:
- When a video has just started playback and the first frame is posted to the ImageContainer, the connection between ImageClientSingle and ImageHost is often not yet constructed. That might affect it.
https://github.com/sotaroikeda/firefox-diagrams/blob/master/gfx/gfx_ImageBridge_FirefoxOS_2_1.pdf
Comment 22•10 years ago
To fix the fence handling, modifying ACodec or GonkBufferQueue could be an option. The simplest solution seems better.
Assignee
Comment 23•10 years ago
Integrate GraphicBuffer into MediaCodecReader.
There are known issues which should be dealt with:
- Defer releaseOutputBuffer() to another thread.
- Handle fence properly.
Attachment #8511812 -
Attachment is obsolete: true
Attachment #8511813 -
Attachment is obsolete: true
Attachment #8514104 -
Flags: review?(sotaro.ikeda.g)
Attachment #8514104 -
Flags: review?(cpearce)
Assignee
Comment 24•10 years ago
(In reply to Bruce Sun [:brsun] from comment #23)
> There are known issues which should be dealt with:
Two corresponding follow-up bugs have been created:
> - Defer releaseOutputBuffer() to another thread.
bug 1091466
> - Handle fence properly.
bug 1091467
Comment 25•10 years ago
Comment on attachment 8514104 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.v4.patch
Review of attachment 8514104 [details] [diff] [review]:
-----------------------------------------------------------------
Overall it looks fine, but we should not be adding code that uses std::map (unfortunately); we need to use nsTHashtable...
::: dom/media/omx/MediaCodecReader.cpp
@@ +762,5 @@
> +void
> +MediaCodecReader::TextureClientRecycleCallback(mozilla::layers::TextureClient* aClient,
> + void* aClosure)
> +{
> + MediaCodecReader* reader = static_cast<MediaCodecReader*>(aClosure);
How do you ensure that this MediaCodecReader pointer remains valid? Can the MediaCodecReader be destroyed while this callback retains the MediaCodecReader pointer?
Do you need to AddRef the MediaCodecReader while the callback is still active?
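One possible shape for that, sketched under the assumption that MediaCodecReader keeps using the thread-safe refcounting it inherits from MediaDecoderReader (not necessarily what the final patch will do):

// When registering the callback, take a reference so |this| stays alive for
// as long as the callback can still fire:
textureClient->SetRecycleCallback(MediaCodecReader::TextureClientRecycleCallback, this);
AddRef();

// In the recycle path, after the output buffer has been returned to MediaCodec
// and the entry removed from the map, balance it:
Release();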
@@ +797,5 @@
> +
> +void
> +MediaCodecReader::RecycleAllTextureClients()
> +{
> + std::map<mozilla::layers::TextureClient*, size_t> indexes;
You should also be using nsTHashtable, or one of its subclasses. Probably what you want is:
nsDataHashtable<nsPtrHashKey<mozilla::layers::TextureClient>, size_t>
::: dom/media/omx/MediaCodecReader.h
@@ +22,5 @@
> #include "I420ColorConverterHelper.h"
> #include "MediaCodecProxy.h"
> #include "MediaOmxCommonReader.h"
>
> +#include <map>
You should be using nsTHashtable or its derivatives here.
@@ +422,1 @@
> CodecBufferInfo& aBuffer,
Indentation is incorrect here.
@@ +444,5 @@
> android::sp<android::ALooper> mLooper;
> android::sp<android::MetaData> mMetaData;
>
> + Mutex mTextureClientIndexesLock;
> + std::map<mozilla::layers::TextureClient*, size_t> mTextureClientIndexes;
You should also be using nsTHashtable, or one of its subclasses. Probably what you want is:
nsDataHashtable<nsPtrHashKey<mozilla::layers::TextureClient>, size_t>
You might want to typedef this for convenience.
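For illustration, a sketch of how the hashtable version could look (assuming the usual nsDataHashtable API; the typedef name is made up):

#include "nsDataHashtable.h"

typedef nsDataHashtable<nsPtrHashKey<mozilla::layers::TextureClient>, size_t>
  TextureClientIndexMap;

TextureClientIndexMap mTextureClientIndexes;

// Remember which output buffer index a TextureClient wraps.
mTextureClientIndexes.Put(textureClient, bufferInfo.mIndex);

// Look the index up (and drop the entry) when the client is recycled.
size_t index;
if (mTextureClientIndexes.Get(aClient, &index)) {
  mVideoTrack.mCodec->releaseOutputBuffer(index);
  mTextureClientIndexes.Remove(aClient);
}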
Attachment #8514104 -
Flags: review?(cpearce) → review-
Assignee
Comment 26•10 years ago
A refined patch based on comment 25.
Attachment #8514104 -
Attachment is obsolete: true
Attachment #8514104 -
Flags: review?(sotaro.ikeda.g)
Attachment #8518003 -
Flags: review?(sotaro.ikeda.g)
Attachment #8518003 -
Flags: review?(cpearce)
Assignee
Comment 27•10 years ago
Sorry, this one should be the refined patch instead.
Attachment #8518003 -
Attachment is obsolete: true
Attachment #8518003 -
Flags: review?(sotaro.ikeda.g)
Attachment #8518003 -
Flags: review?(cpearce)
Attachment #8518007 -
Flags: review?(sotaro.ikeda.g)
Attachment #8518007 -
Flags: review?(cpearce)
Comment 28•10 years ago
Comment on attachment 8518007 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.v5.fix.patch
Review of attachment 8518007 [details] [diff] [review]:
-----------------------------------------------------------------
::: dom/media/omx/MediaCodecReader.h
@@ +121,5 @@
> {
> + enum Type
> + {
> + kUnknown = 0,
> + kAudio,
You could use MediaData::Type here instead, to avoid re-creating this enum?
Also, kUnknown is unused.
(And yes, other code (including code I wrote!) also re-defines track types; that should change in the future.)
Attachment #8518007 -
Flags: review?(cpearce) → review+
Comment 29•10 years ago
Comment on attachment 8518007 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.v5.fix.patch
Review of attachment 8518007 [details] [diff] [review]:
-----------------------------------------------------------------
When we use gralloc buffers directly, the gralloc buffers are also used directly outside of MediaCodecReader. From the patch it is not clear that, when RecycleAllTextureClients() is called, all gralloc usage outside of MediaCodecReader has already been released. And I prefer to limit calls to RecycleAllTextureClients() as much as possible.
::: dom/media/omx/MediaCodecReader.cpp
@@ +1644,5 @@
> status_t
> MediaCodecReader::FlushCodecData(Track& aTrack)
> {
> + if (aTrack.mType == Track::kVideo) {
> + RecycleAllTextureClients();
Does this ensure that the recycled TextureClients are no longer used outside of MediaCodecReader? If a TextureClient is being rendered to the display, the video rendering could be corrupted when recycled TextureClients are reused for decoding new video frames.
In OmxDecoder, the same thing is done only in OmxDecoder::ReleaseMediaResources(), just to prevent a crash in release builds as a last resort.
If we want the TextureClients to be recycled safely, the following functions need to be called first; see the sketch after the list. From the patch, that was not clear.
- MediaCodecReader::ResetDecode()
(MediaDecoderReader::ResetDecode();)
- VideoFrameContainer::ClearCurrentFrame()
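Concretely, the safe order in the teardown path would be something like the following (a sketch; the surrounding function and member names are assumed from the patch):

// Make sure nothing outside MediaCodecReader still holds a gralloc-backed
// frame before handing the buffers back to MediaCodec.
VideoFrameContainer* videoframe = mDecoder->GetVideoFrameContainer();
if (videoframe) {
  videoframe->ClearCurrentFrame();
}
// ResetDecode() has likewise flushed the video queue and the codec, so this
// should now be a no-op except as a safety net.
ReleaseAllTextureClients();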
Updated•10 years ago
Attachment #8518007 -
Flags: review?(sotaro.ikeda.g)
Updated•10 years ago
Flags: needinfo?(brsun)
Assignee
Comment 30•10 years ago
Hi Sotaro,
Thanks for reminding me. Concerns are addressed in this patch.
Attachment #8518007 -
Attachment is obsolete: true
Flags: needinfo?(brsun)
Attachment #8520446 -
Flags: review?(sotaro.ikeda.g)
Assignee
Comment 31•10 years ago
Some corrections on TODO comments.
Attachment #8520446 -
Attachment is obsolete: true
Attachment #8520446 -
Flags: review?(sotaro.ikeda.g)
Attachment #8520461 -
Flags: review?(sotaro.ikeda.g)
Comment 32•10 years ago
Comment on attachment 8520461 [details] [diff] [review]
bug1033903_graphic_buffer_media_codec_reader.v6.fix.patch
Review of attachment 8520461 [details] [diff] [review]:
-----------------------------------------------------------------
This is getting better :-) review+ if the following comment is addressed.
::: dom/media/omx/MediaCodecReader.cpp
@@ +1088,5 @@
> VideoFrameContainer* videoframe = mDecoder->GetVideoFrameContainer();
> if (videoframe) {
> videoframe->ClearCurrentFrame();
> }
> + ReleaseAllTextureClients();
On a debug build, if ReleaseAllTextureClients() actually releases a TextureClient, it should abort (assert), and on a release build it should output an error message, like OmxDecoder::ReleaseMediaResources() does. This is because videoframe->ClearCurrentFrame() and ResetDecode() should already have released all gralloc buffers outside of MediaCodecReader; otherwise the problem could cause other issues such as video rendering tearing.
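A sketch of the kind of check being asked for, written against the std::map version quoted earlier (the v5+ patches presumably use the hashtable suggested in comment 25, so the container calls would differ):

void
MediaCodecReader::ReleaseAllTextureClients()
{
  MutexAutoLock al(mTextureClientIndexesLock);
  if (mTextureClientIndexes.empty()) {
    return;
  }
  // ResetDecode() / ClearCurrentFrame() should already have returned every
  // buffer, so reaching this point indicates a bug elsewhere.
  MOZ_ASSERT(false, "TextureClients were not released through the normal path");
  NS_WARNING("ReleaseAllTextureClients: releasing leftover buffers as a last resort");
  for (std::map<mozilla::layers::TextureClient*, size_t>::iterator it =
         mTextureClientIndexes.begin();
       it != mTextureClientIndexes.end(); ++it) {
    if (mVideoTrack.mCodec != nullptr) {
      mVideoTrack.mCodec->releaseOutputBuffer(it->second);
    }
  }
  mTextureClientIndexes.clear();
}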
Attachment #8520461 -
Flags: review?(sotaro.ikeda.g) → review+
Assignee
Comment 33•10 years ago
TBPL results:
- media.omx.async.enabled pref. off: https://tbpl.mozilla.org/?tree=Try&rev=1fdc8a5d14a1
- media.omx.async.enabled pref. on: https://tbpl.mozilla.org/?tree=Try&rev=d724601d89bd
Attachment #8520461 -
Attachment is obsolete: true
Attachment #8522163 -
Flags: review+
Assignee
Updated•10 years ago
Keywords: checkin-needed
Comment 34•10 years ago
Keywords: checkin-needed
Comment 35•10 years ago
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED