Closed Bug 941296 Opened 6 years ago Closed 5 years ago

PlatformDecoderModule for OSX

Categories

(Core :: Audio/Video, defect)

x86_64
macOS
defect

Tracking


RESOLVED FIXED
mozilla34

People

(Reporter: cpearce, Assigned: rillian)

References

Details

Attachments

(9 files, 21 obsolete files)

2.62 KB, patch
cpearce
: review+
ted
: review+
Details | Diff | Splinter Review
41.33 KB, patch
rillian
: review+
Details | Diff | Splinter Review
16.37 KB, patch
rillian
: review+
Details | Diff | Splinter Review
1.61 KB, patch
cpearce
: review+
Details | Diff | Splinter Review
2.13 KB, patch
cpearce
: review+
Details | Diff | Splinter Review
1.06 KB, patch
ted
: review+
Details | Diff | Splinter Review
823 bytes, patch
cpearce
: review+
Details | Diff | Splinter Review
3.41 KB, patch
cpearce
: review+
Details | Diff | Splinter Review
18.58 KB, application/zip
Details
We need a PlatformDecoderModule for Mac OS X before we can play back fragmented MP4 there. We are also planning to add non-fragmented MP4 support to our parser, so we'll get regular MP4 support too once the Mac OS X PlatformDecoderModule exists.
Assignee: nobody → giles
Depends on: 962385
See Also: → 851290
Thanks to cpearce's help, I found the bug I was stuck on. Work in progress at https://github.com/rillian/firefox/tree/fmp4 if anyone's interested in following.
Snapshot. Requires special support for the avcC box, but I get output frames on https://people.mozilla.org/~rgiles/2014/bruce.html
Comment on attachment 8420478 [details] [diff] [review]
WIP osx PlatformDecoderModule, h.264 only

Review of attachment 8420478 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/media/fmp4/osx/OSXVTDecoder.cpp
@@ +206,5 @@
> +                              &kCFTypeDictionaryKeyCallBacks,
> +                              &kCFTypeDictionaryValueCallBacks);
> +  NS_WARNING("HACK: loading AVC Configuration from external file");
> +  const char* avc_filename = "avcC.dat";
> +  FILE *avc_file = fopen(avc_filename, "rb");

Note: the avcC is in the extra_data field of the VideoDecoderConfig with kft's patches from bug 908503. You might want to rebase on top of kft's patches, since they're pretty much ready to land now and will almost certainly land before you do.

@@ +299,5 @@
> +  }
> +  return NS_OK;
> +}
> +
> +static const char* track_type_name(mp4_demuxer::TrackType type)

We should really just make TrackTypeToStr(TrackType aTrack) from MP4Reader.cpp visible here somehow.
Updated patch. It pulls the decoder config from the stream, so it works on general files now. A problem with cropping to the frame dimensions results in noise at the bottom of some videos.
Attachment #8420478 - Attachment is obsolete: true
Depends on: 1019291
Ralph, do you expect to finish this work before the uplift? That'd be really great!
Flags: needinfo?(cpearce)
Blocks: 799318
Wrong ni?, sorry for bugspam.
Flags: needinfo?(cpearce) → needinfo?(giles)
The PlatformDecoderModule is not the only piece in the puzzle for H.264 on OSX.
Thanks! What's left after that, then? (MSE doesn't count because that's site specific.)
(In reply to Florian Bender from comment #5)
> Ralph, do you expect to finish this work before the uplift? That'd be really
> great!

Unfortunately not. It will have to wait until 33.

As Anthony mentioned, this is the platform decoder module for OS X, which is called from the new demuxer. We will need to fix the remaining bugs in both before we can enable it by default.
Flags: needinfo?(giles)
Rebase on top of demuxer change to stagefright and bug 1019291 changes to Annex B generation.
Attachment #8421448 - Attachment is obsolete: true
Updated WIP patch.

- Removed some merge cruft.
- Switched to the 'Apple' prefix (from 'OSX') to match the mp3 decoder and pref.
- Used the AutoCFRelease wrapper class to auto-release CoreFoundation objects.
Attachment #8444699 - Attachment is obsolete: true
Attached patch Enable fmp4 in the mac build (obsolete) — Splinter Review
Split out configure change to actually enable fmp4 when building on Mac.
Updated patch:

- Adds VideoToolbox stub so we build on 10.7.
- Dynamically load VideoToolbox so we can run on 10.7 & 10.6.
- Improve AutoCFRelease.
Attachment #8446227 - Attachment is obsolete: true
Updated patch to split out the AAC decoder stub for easier review. Tested on 10.7 and 10.8. Unfortunately it looks like I need to dlopen CoreMedia as well: on 10.6 Firefox fails to launch with:

 12:43:21     INFO -  dyld: Library not loaded: /System/Library/Frameworks/CoreMedia.framework/Versions/A/CoreMedia
Attachment #8451847 - Attachment is obsolete: true
Split out AAC decoder stub. Not functional.
Updated build system patch, split out for easier review.
Attachment #8446230 - Attachment is obsolete: true
Comment on attachment 8452632 [details] [diff] [review]
Part 1: OS X PlatformDecoderModule h.264 v6

Greenish on try: https://tbpl.mozilla.org/?tree=Try&rev=3f9711efe70b

Successfully runs non-functionally on 10.6 and 10.7.
Functional on 10.8. Crashes on 10.9, but ready for preliminary review while I look at that.
Attachment #8452632 - Flags: review?(cpearce)
Attachment #8452634 - Flags: review?(cpearce)
Comment on attachment 8452634 [details] [diff] [review]
Part 3: Enable fmp4 with APPLEMEDIA v6

Adding Ted for build peer review.
Attachment #8452634 - Flags: review?(ted)
Comment on attachment 8452634 [details] [diff] [review]
Part 3: Enable fmp4 with APPLEMEDIA v6

Review of attachment 8452634 [details] [diff] [review]:
-----------------------------------------------------------------

::: configure.in
@@ +5161,5 @@
> +if test "$MOZ_WIDGET_TOOLKIT" = "cocoa"; then
> +  MOZ_APPLEMEDIA=1
> +fi
> +
> +MOZ_ARG_DISABLE_BOOL(apple-media,

I know you're just moving this from down below, but is there a compelling need for this configure argument? Unless it adds a build-time requirement I think it ought to just be a default.
Attachment #8452634 - Flags: review?(ted) → review+
Thanks. This is platform-specific code so we need MOZ_APPLEMEDIA to conditionalize calling it, but I'm fine with removing the configure switch for it.

It was added in bug 914479. Edwin? Any objection to removing --disable-apple-media?
Flags: needinfo?(edwin)
(In reply to Ralph Giles (:rillian) from comment #20)
> It was added in bug 914479. Edwin? Any objection to removing
> --disable-apple-media?

Sounds good.
Flags: needinfo?(edwin)
Updated patch to include dynamic loading support for the CoreMedia framework, which isn't present on 10.6. We still depend on the system headers at compile time, since we build on 10.7.
Attachment #8452632 - Attachment is obsolete: true
Attachment #8452632 - Flags: review?(cpearce)
Attachment #8453389 - Flags: review?(cpearce)
Bump WIP AAC decoder patch. Not functional.
Attachment #8452633 - Attachment is obsolete: true
Update enable patch to remove --disable-apple-media per Ted's comment.
Attachment #8452634 - Attachment is obsolete: true
Attachment #8452634 - Flags: review?(cpearce)
Attachment #8453393 - Flags: review?(cpearce)
Attachment #8453393 - Flags: review?(ted)
Comment on attachment 8453389 [details] [diff] [review]
Part 1: OS X PlatformDecoderModule h.264 v7

Review of attachment 8453389 [details] [diff] [review]:
-----------------------------------------------------------------

Looking promising. A few big ticket items, but mostly nits.

::: content/media/fmp4/MP4Decoder.cpp
@@ +92,5 @@
>           IsVistaOrLater() ||
>  #endif
>           IsFFmpegAvailable() ||
> +#ifdef MOZ_APPLEMEDIA
> +         true ||

You should check the pref here.

::: content/media/fmp4/apple/AppleCMLinker.cpp
@@ +25,5 @@
> +
> +void* AppleCMLinker::sLink = nullptr;
> +
> +#define LINK_FUNC(func) typeof(func) func;
> +#include "AppleCMFunctions.h"

This is much easier than how I did this for Windows Media Foundation.

@@ +68,5 @@
> +{
> +  LOG("Unlinking CoreMedia framework.");
> +  if (sLink) {
> +    dlclose(sLink);
> +    sLink = nullptr;

What if more than one AppleDecoderModule is active? Will the first AppleDecoderModule shutting down unlink the library and mean the rest no longer work?

Note that one PDM is initialized for every MP4Reader, i.e. each media element, and it's shut down when the state machine is shut down. So there are multiple PDMs, each of which will call this function when it shuts down, so the first one to shut down will null out sLink, right?

Same thing is happening in the AppleVTLinker, right?

::: content/media/fmp4/apple/AppleDecoderModule.cpp
@@ +16,5 @@
> +extern PlatformDecoderModule* CreateBlankDecoderModule();
> +
> +bool AppleDecoderModule::sIsEnabled = false;
> +
> +AppleDecoderModule::AppleDecoderModule()

Do you need the ctor and dtor?

@@ +45,5 @@
> +  sIsEnabled = AppleVTLinker::Link();
> +}
> +
> +nsresult
> +AppleDecoderModule::Startup()

This function doesn't seem to do anything?

@@ +50,5 @@
> +{
> +  if (!sIsEnabled) {
> +    return NS_ERROR_FAILURE;
> +  }
> +  NS_WARNING("AppleDecoderModule::Startup()");

Is starting this puppy up really so dangerous that you need to warn about it? How about an NSPR log module?

@@ +59,5 @@
> +AppleDecoderModule::Shutdown()
> +{
> +  MOZ_ASSERT(NS_IsMainThread(), "Must be on main thread.");
> +
> +  if (!sIsEnabled) {

If some AppleDecoderModules are created, and the user flips the pref, you'll leak on shutdown right? It's a minor thing, but it's probably best to not check the flag here.

@@ +71,5 @@
> +}
> +
> +MediaDataDecoder*
> +AppleDecoderModule::CreateH264Decoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> +                                    mozilla::layers::LayersBackend aLayersBackend,

Nit: indentation off by two.

::: content/media/fmp4/apple/AppleUtils.h
@@ +11,5 @@
> +#include "nsError.h"
> +
> +namespace mozilla {
> +
> +//typedef uint32_t nsresult;

Dead code.

@@ +14,5 @@
> +
> +//typedef uint32_t nsresult;
> +
> +struct AppleUtils {
> +    // Helper to retrieve properties from AudioFileStream objects.

Should be 2 space indent here?

@@ +36,5 @@
> +
> +// Wrapper class to call CFRelease on reference types
> +// when they go out of scope.
> +template <class T>
> +class AutoCFRelease {

Can you specialize nsAutoRefTraits and use nsAutoRef, like AudioStream.h does for cubeb_stream? Or are there too many types that need this for it to be convenient?

::: content/media/fmp4/apple/AppleVTDecoder.cpp
@@ +29,5 @@
> +
> +namespace mozilla {
> +
> +AppleVTDecoder::AppleVTDecoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> +                           MediaTaskQueue* aVideoTaskQueue,

Indentation.

@@ +55,5 @@
> +
> +nsresult
> +AppleVTDecoder::Init()
> +{
> +  NS_WARNING(__func__);

NSPR Log, and below.

@@ +165,5 @@
> +// FIXME: probably better to queue this as a new task
> +// to avoid this thread-unsafe public member function.
> +nsresult
> +AppleVTDecoder::OutputFrame(CVPixelBufferRef aImage,
> +                          mp4_demuxer::MP4Sample* aSample)

Indentation off by two again.

@@ +210,5 @@
> +  buffer.mPlanes[2].mSkip = 1;
> +
> +  CVReturn rv = CVPixelBufferLockBaseAddress(aImage, kCVPixelBufferLock_ReadOnly);
> +  MOZ_ASSERT(rv == kCVReturnSuccess, "error locking pixel data");
> +  memcpy(frame, CVPixelBufferGetBaseAddressOfPlane(aImage, 0), width*height);

Here you copy the frame into another buffer, and then you call VideoData::Create() below, which copies it again as it either uploads to the GPU or to another CPU memory buffer surface. What you should do is lock the frame, set buffer.mPlanes[0].mData to CVPixelBufferGetBaseAddressOfPlane(aImage, 0), etc., call VideoData::Create(), then unlock the pixel buffer. That avoids one unnecessary copy.

@@ +234,5 @@
> +                      aSample->is_sync_point,
> +                      aSample->composition_timestamp,
> +                      visible);
> +  mCallback->Output(data.forget());
> +  return NS_OK;

Do you need to delete aSample here?

@@ +243,5 @@
> +TimingInfoFromSample(mp4_demuxer::MP4Sample* aSample)
> +{
> +  CMSampleTimingInfo timestamp;
> +
> +  // FIXME: check units here.

Yes, please check the units here.

@@ +244,5 @@
> +{
> +  CMSampleTimingInfo timestamp;
> +
> +  // FIXME: check units here.
> +  const int32_t msec_per_sec = 1000000;

Use USECS_PER_S from VideoUtils.h.

@@ +249,5 @@
> +
> +  timestamp.duration = CMTimeMake(aSample->duration, msec_per_sec);
> +  timestamp.presentationTimeStamp =
> +    CMTimeMake(aSample->composition_timestamp, msec_per_sec);
> +  // No DTS value available from libstagefright.

Do you need a dts? We could get libstagefright to output it.

@@ +282,5 @@
> +  NS_ASSERTION(rv == noErr, "Couldn't create CMBlockBuffer");
> +  CMSampleTimingInfo timestamp = TimingInfoFromSample(aSample);
> +  rv = CMSampleBufferCreate(NULL, block, true, 0, 0, mFormat, 1, 1, &timestamp, 0, NULL, sample.receive());
> +  NS_ASSERTION(rv == noErr, "Couldn't create CMSampleBuffer");
> +  rv = VTDecompressionSessionDecodeFrame(mSession, sample, 0, aSample, &flags);

So your PlatformCallback is called when this has enough input data to produce an output frame? What happens when multiple input frames are required to produce one output frame? It looks to me like the nsAutoPtr in the RunnableMethod that calls this function will delete the MP4Sample when the call completes; then, when we're called back with more input later, we'll produce an output sample and the callback will be called with a pointer to either a now-deleted MP4Sample, or the one that was last input, whose timestamp doesn't correspond to the output sample. Either of those would be bad.

I've been thinking that we should make MP4Sample refcounted, and this just adds weight to the argument in favour of doing that.

@@ +302,5 @@
> +  AutoCFRelease<CFMutableDictionaryRef> extensions =
> +    CFDictionaryCreateMutable(NULL, 0,
> +                              &kCFTypeDictionaryKeyCallBacks,
> +                              &kCFTypeDictionaryValueCallBacks);
> +#if 1

Remove "#if 1"

@@ +336,5 @@
> +                                      mConfig.display_width,
> +                                      mConfig.display_height,
> +                                      extensions,
> +                                      &mFormat);
> +  // FIXME: propagate errors to caller.

Yes, please propagate the errors to the caller. ;)

::: content/media/fmp4/apple/AppleVTDecoder.h
@@ +23,5 @@
> +
> +class AppleVTDecoder : public MediaDataDecoder {
> +public:
> +  AppleVTDecoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> +               MediaTaskQueue* aVideoTaskQueue,

Indentation off by two here.
Attachment #8453389 - Flags: review?(cpearce) → review-
Attachment #8453393 - Flags: review?(cpearce) → review+
Attachment #8453393 - Flags: review?(ted) → review+
Blocks: 1037553
Depends on: 1037689
Thanks for the prompt review. Your comments were really helpful. Here's an updated patch addressing the issues, and rebased on top of other m-c changes.

Detailed response:

> ::: content/media/fmp4/MP4Decoder.cpp
> 
> You should check the pref here.

Ok. I also checked platform support, to match what the FFmpeg code does.

> ::: content/media/fmp4/apple/AppleCMLinker.cpp
> > +
> > +#define LINK_FUNC(func) typeof(func) func;
> > +#include "AppleCMFunctions.h"
> 
> This is much easier than how I did this for Windows Media Foundation.

Yes! It is very clever code from Edwin.

> > +  LOG("Unlinking CoreMedia framework.");
> > +  if (sLink) {
> > +    dlclose(sLink);
> > +    sLink = nullptr;
> 
> What if there are more than one AppleDecoderModule active? Will the first AppleDecoderModule shutting down unlink the library and mean the rest no longer work?

You're right. We use Unlink() during error cleanup, but should never call it once linking succeeds. I've removed the external calls and made the method private to prevent a recurrence.

The initial check for sLink in Link() should prevent Unlink() being
called more than once.

Ah, but layout/build/nsLayoutStatics.cpp calls FFmpegRuntimeLinker::Unlink().
I wonder why I don't get leak complaints on shutdown?

> Same thing is happening in the AppleVTLinker, right?

Yes, the code is identical except for names.

> > +AppleDecoderModule::AppleDecoderModule()
> 
> Do you need the ctor and dtor?

PlatformDecoderModule::Create() calls 'new AppleDecoderModule()'
so I believe so. I have an empty ctor and an empty virtual dtor
just like WMFDecoderModule. Have you thought of a better way
to do this?

If I remove mBlankDecoder (not needed once AAC decoding works)
all the state would be static, so I could add an IsEnabled()
method to use instead of Startup() (see below) and drop the ctor/dtor.

> > +nsresult
> > +AppleDecoderModule::Startup()
> 
> This function doesn't seem to do anything?

Like with WMFDecoderModule, it checks whether ::Init() succeeded,
which PlatformDecoderModule::Create() uses to validate the instance
before returning.

I've added a comment.

> > +  NS_WARNING("AppleDecoderModule::Startup()");
> 
> Is starting this puppy up really so dangerous that you need to warn about it? How about an NSPR log module?

Sorry. I've removed the tracing code or converted it to LOG() calls.

> > +AppleDecoderModule::Shutdown()
> > +{
> > +  MOZ_ASSERT(NS_IsMainThread(), "Must be on main thread.");
> > +
> > +  if (!sIsEnabled) {
> 
> If some AppleDecoderModules are created, and the user flips the pref, you'll leak on shutdown right? It's a minor thing, but it's probably best to not check the flag here.

Right. I've removed the check.

> > +AppleDecoderModule::CreateH264Decoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> > +                                    mozilla::layers::LayersBackend aLayersBackend,
> 
> Nit: indentation off by two.

Fixed.

> ::: content/media/fmp4/apple/AppleUtils.h
> > +
> > +//typedef uint32_t nsresult;
> 
> Dead code.

Fixed.

> > +struct AppleUtils {
> > +    // Helper to retrieve properties from AudioFileStream objects.
> 
> Should be 2 space indent here?

Yep. Fixed.

> > +// Wrapper class to call CFRelease on reference types
> > +// when they go out of scope.
> > +template <class T>
> > +class AutoCFRelease {
> 
> Can you specialize nsAutoRefTraits and use nsAutoRef, like AudioStream.h does for cubeb_stream? Or are there two many types that need to use this for it to be convenient?

nsAutoRefTraits is very cool, but we have five types so far, which is a lot of duplicate declarations one has to remember to update. I also like the more transparent naming of a custom template class. This is a C API, so there's no way to infer which types need the CFRelease() call.

Hmm...unless I could *define* AutoCFRelease<> in terms of an
nsAutoRefTraits nsAutoRef. Let me look at that.

> > +AppleVTDecoder::AppleVTDecoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> > +                           MediaTaskQueue* aVideoTaskQueue,
> 
> Indentation.

Fixed.

> > +AppleVTDecoder::OutputFrame(CVPixelBufferRef aImage,
> > +                          mp4_demuxer::MP4Sample* aSample)
> 
> Indentation off by two again.

Fixed.

> > +  memcpy(frame, CVPixelBufferGetBaseAddressOfPlane(aImage, 0), width*height);
> 
> Here you copy the frame into another buffer, and then you call VideoData::Create() below, which will copy it again as it either uploads to the GPU, or to another CPU memory buffer surface. What you should do is lock the frame, and then set buffer.mPlanes[0].mData to the VPixelBufferGetBaseAddressOfPlane(aImage, 0), etc, then call VideoData::Create(), then unlock the pixel buffer. Then you avoid one unnecessary copy.

Much better, thanks!

> > +  mCallback->Output(data.forget());
> > +  return NS_OK;
> 
> Do you need to delete aSample here?

No. This is called from a runnable with an nsAutoPtr argument,
which frees it after we return.

> Yes, please check the units here.

Confirmed MP4Sample values should be in microseconds; the CMTime timescale converts that to seconds. Filed bug 1037689 to add a comment to the stagefright binding.

> > +  const int32_t msec_per_sec = 1000000;
> 
> Use USECS_PER_S from VideoUtils.h.

Done.

> > +    CMTimeMake(aSample->composition_timestamp, msec_per_sec);
> > +  // No DTS value available from libstagefright.
> 
> Do you need a dts? We could get libstagefright to output it.

It seems to work passing the PTS for the DTS. Getting a real DTS would be nicer. I tried passing kCMTimeInvalid, but that fails.

I filed bug 1037553 to discuss this.

> So your PlatformCallback is called when this has enough input data to produce an output frame? What happens when multiple input frames are required to produce an output frame? It looks to me like the nsAutoPtr in the RunnableMethod that calls this function will delete the MP4Sample when this function call completes, and then when we're called back with more input later, we'll produce an output sample and the callback will be called with a pointer to either a now deleted MP4Sample, or the one that was last input and which doesn't have a timestamp corresponding to the sample output? Either of which are bad things to happen.
> 
> I've been thinking that we should make MP4Sample refcounted, and this just adds weight to the argument in favour of doing that.

The current code _copies_ the compressed data before submitting the frame
to Apple's API, so there's no lifetime issue.

The docs say it's possible to construct a lazy CMBlockBuffer which reads data through callbacks. That would allow us to avoid the copy, but then we'd need a way to keep the sample alive.

Making it refcounted is the easiest way to do that, but not necessary;
this class could take ownership itself and take care of freeing the
appropriate one from the output callback.

In any case, I'd like to land this version first before optimizing.

> Remove "#if 1"

Done.

> > +  // FIXME: propagate errors to caller.
> 
> Yes, please propagate the errors to the caller. ;)

Done.

> > +  AppleVTDecoder(const mp4_demuxer::VideoDecoderConfig& aConfig,
> > +               MediaTaskQueue* aVideoTaskQueue,
> 
> Indentation off by two here.

Fixed.
Attachment #8453389 - Attachment is obsolete: true
Attachment #8454867 - Flags: review?(cpearce)
Comment on attachment 8454867 [details] [diff] [review]
Part 1: OS X PlatformDecoderModule h.264 v8

Review of attachment 8454867 [details] [diff] [review]:
-----------------------------------------------------------------

I'm still not convinced about the handling of MP4Sample*s, and AFAICT you're still leaking the loaded libraries.

::: content/media/fmp4/apple/AppleVTDecoder.cpp
@@ +109,5 @@
> +}
> +nsresult
> +AppleVTDecoder::Flush()
> +{
> +  return NS_OK;

Do you need to do something here? Otherwise you'll get incorrect video frames output after seeking.

@@ +115,5 @@
> +
> +nsresult
> +AppleVTDecoder::Drain()
> +{
> +  return NS_OK;

Do you need to do something here to ensure the decoder outputs all the decoded frames it has pending?

@@ +278,5 @@
> +  NS_ASSERTION(rv == noErr, "Couldn't create CMBlockBuffer");
> +  CMSampleTimingInfo timestamp = TimingInfoFromSample(aSample);
> +  rv = CMSampleBufferCreate(NULL, block, true, 0, 0, mFormat, 1, 1, &timestamp, 0, NULL, sample.receive());
> +  NS_ASSERTION(rv == noErr, "Couldn't create CMSampleBuffer");
> +  rv = VTDecompressionSessionDecodeFrame(mSession, sample, 0, aSample, &flags);

You're passing aSample into VTDecompressionSessionDecodeFrame here, and MacOSX is passing the pointer back to you in PlatformCallback() (synchronously right?). Is there exactly one PlatformCallback() every time you call VTDecompressionSessionDecodeFrame()? I doubt it, because sometimes multiple inputs are required for 1 or more outputs. Or does PlatformCallback() get called with no image and the MP4Sample* passed back and we hit your NS_WARNING("VideoToolbox decoder returned no data") path and exit the callback without output?

OR does MacOSX hold onto the MP4Sample pointers until all the data required to decode the frame corresponding to the CMBlockBuffer here has been input and *then* it calls the callback, which would result in 0 or more calls to your PlatformCallback() per every VTDecompressionSessionDecodeFrame call. If that's the case, the MP4Sample pointers may be stale when the PlatformCallback is called. Note you're dereferencing the MP4Sample pointer in AppleVTDecoder::OutputFrame(), so it's not just enough to copy the MP4Sample->data and assume everything is OK here.

::: content/media/fmp4/apple/AppleVTLinker.cpp
@@ +60,5 @@
> +  return false;
> +}
> +
> +/* static */ void
> +AppleVTLinker::Unlink()

I couldn't find where this was called (except in the failure case), so it'll leak, right? i.e. the memory required to keep the libraries loaded won't be released until Firefox closes. You probably need to call Unlink() in your PDM::Shutdown() function, keep a manual refcount of the number of times dlopen/dlclose have been called, and null sLink when the refcount reaches 0.
Attachment #8454867 - Flags: review?(cpearce) → review-
(In reply to Chris Pearce (:cpearce) from comment #27)

> OR does MacOSX hold onto the MP4Sample pointers until all the data required
> to decode the frame corresponding to the CMBlockBuffer here has been input
> and *then* it calls the callback, which would result in 0 or more calls to
> your PlatformCallback() per every VTDecompressionSessionDecodeFrame call. If
> that's the case, the MP4Sample pointers may be stale when the
> PlatformCallback is called. Note you're dereferencing the MP4Sample pointer
> in AppleVTDecoder::OutputFrame(), so it's not just enough to copy the
> MP4Sample->data and assume everything is OK here.

Ah, I see what you mean now. Yes, we are potentially referring to the sample object after it's been released by the runnable. I guess I'll copy the data for now until we can make MP4Sample refcounted.

> I couldn't find where this was called (except in the failure case) so it'll
> leak right? i.e. the memory required to keep the libraries in memory won't
> be released until Firefox closes. You probably need to call Unlink() in your
> PDM::Shutdown() function, and keep a manual refcount on the number of times
> dlopen/dlclose have been called, and null sLink when the refcount is 0.

Do I need a Monitor to protect access to a refcount inside the Linker object, or am I guaranteed the PlatformDecoderModule methods will all be called on the same thread?
(In reply to Ralph Giles (:rillian) from comment #28)
> (In reply to Chris Pearce (:cpearce) from comment #27)

> > I couldn't find where this was called (except in the failure case) so it'll
> > leak right? i.e. the memory required to keep the libraries in memory won't
> > be released until Firefox closes. You probably need to call Unlink() in your
> > PDM::Shutdown() function, and keep a manual refcount on the number of times
> > dlopen/dlclose have been called, and null sLink when the refcount is 0.
> 
> Do I need a Monitor to protect access to a refcount inside the Linker
> object, or am I guaranteed the PlatformDecoderModule methods will all be
> called on the same thread?

It's a hassle making a static reference to a monitor threadsafe, so just assert that addref/release are called on the same thread for now.
Updated patch:

- Copy frame metadata into a new FrameRef object and pass that through our callbacks. Fixes access to MP4Sample fields after our Runnable releases them.

- Keep a static, manual refcount on the framework runtime linkers, and call their Unlink() methods from AppleDecoderModule::Shutdown().

- Clear pending frames from AppleVTDecoder::Flush() and ::Drain().

Flush() and Drain() will now block until currently queued decoding completes, but the docs don't say whether this clears pending prediction state as well. I haven't seen any artifacts in testing so far, but then I didn't see any without the change either.
Attachment #8454867 - Attachment is obsolete: true
Attachment #8456569 - Flags: review?(cpearce)
Comment on attachment 8456569 [details] [diff] [review]
Part 1: OS X PlatformDecoderModule h.264 v9

Review of attachment 8456569 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/media/fmp4/apple/AppleVTDecoder.cpp
@@ +110,5 @@
> +
> +nsresult
> +AppleVTDecoder::Flush()
> +{
> +  return Drain();

I don't think this is right, but I can't find any Apple docs to say which function we should call here... 

I did find https://developer.apple.com/library/prerelease/ios/samplecode/VideoTimeline/Listings/VideoTimeLine_AAPLViewController_m.html#//apple_ref/doc/uid/TP40014525-VideoTimeLine_AAPLViewController_m-DontLinkElementID_9 and http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/commit/?id=d9db0c2d4aea614b92602f898f3ebdef1b8e3ae8 and https://code.google.com/p/ossbuild/source/browse/trunk/Main/GStreamer/Source/gst-plugins-bad/sys/applemedia/vtdec.c?r=929 however.

Flush() is supposed to cause every frame in the decoder to be dropped, so that the decoder is ready to accept new input after a seek. But the gst-plugins-bad applemedia source code makes me think that VTDecompressionSessionWaitForAsynchronousFrames() waits for callbacks from the VTDecompressionSessionDecodeFrame() call when it was called with the kVTDecodeFrame_EnableAsynchronousDecompression flag set. But I think you're operating in synchronous mode here.

If you're seeking and you're not seeing visual artifacts, then we should do nothing here and wait for someone to file a bug proving we need to do something. Or if you are seeing artifacts, then maybe for Flush() you want VTDecompressionSessionInvalidate()?

@@ +116,5 @@
> +
> +nsresult
> +AppleVTDecoder::Drain()
> +{
> +  OSStatus rv = VTDecompressionSessionWaitForAsynchronousFrames(mSession);

I'd guess that VTDecompressionSessionWaitForAsynchronousFrames only has an effect if you set the kVTDecodeFrame_EnableAsynchronousDecompression flag when you call VTDecompressionSessionDecodeFrame()?

Do samples come out in PTS order or DTS order? It looks like you should be setting the kVTDecodeFrame_EnableTemporalProcessing flag to enforce the former, and if you did that, would you need to call VTDecompressionSessionFinishDelayedFrames() here to drain the pipeline?

You might even need to call VTDecompressionSessionFinishDelayedFrames() here anyway in sync mode, to flush any samples still being retained.

Also, note that Drain() isn't called yet, since I never got around to fixing bug 973710. D'oh! I'll need to get around to that...
From the comment on MediaDataDecoder::Flush() I thought it was still helpful to block until in-progress decoding finished, since any output still propagating to our callback would be discarded. VTDecompressionSessionWaitForAsynchronousFrames() calls VTDecompressionSessionFinishDelayedFrames(); I chose the former so we wouldn't have to change it if we later enable async decoding, and it sounded like it did no harm.

I think we should do what the current patch does, unless blocking while in-progress callbacks are discarded isn't helpful, and wait for someone to file a bug if it doesn't work. At that point the only way I see to interrupt is to Invalidate and re-create the session, as you say.

I'll check the output frame order. I didn't see any artefacts without kVTDecodeFrame_EnableTemporalProcessing, so I got the impression it was for when the _input_ was in PTS order. Looking at the header docs, it does sound like we should be setting it.
Comment on attachment 8456569 [details] [diff] [review]
Part 1: OS X PlatformDecoderModule h.264 v9

Review of attachment 8456569 [details] [diff] [review]:
-----------------------------------------------------------------

OK. It would be good if we could make the flush faster by dropping samples, but if the only way to do it is to tear down the decoder and recreate it, then that may not be faster.
Attachment #8456569 - Flags: review?(cpearce) → review+
Frames go in and come out in decode order, so I think the compositor is discarding frames my code returns out of order. And I think EnableTemporalProcessing means we can feed frames in in _presentation_ order and VideoToolbox will reorder them before decode, not that it will reorder _decode_-order frames into presentation order.

If I add a 3-frame reorder buffer in OutputFrame, like :edwin had in his original FFmpegH264Decoder patch from bug 941298, I get complete playback on test.mp4 with all decoded frames presented. So we need to parse the max reorder depth out of the SPS header and use that to determine the depth of the reorder queue.
Separate the re-ordering and SPS parsing out into a separate bug and get back to it after the AAC stuff.
Port AppleMP3Decoder to the platform decoder module. Seems to work on goal.html, test.mp4 from the dash suite and the few vimeo pages I tried. 

Doesn't work on bruce.html.
Attachment #8453391 - Attachment is obsolete: true
Attachment #8460580 - Flags: review?(edwin)
Comment on attachment 8460580 [details] [diff] [review]
Part 2: OS X PlatformDecoderModule AAC v10

Review of attachment 8460580 [details] [diff] [review]:
-----------------------------------------------------------------

lgtm. just tiny style nits.

::: content/media/fmp4/apple/AppleATDecoder.cpp
@@ +153,5 @@
> +}
> +
> +struct PassthroughUserData {
> +    AppleATDecoder* mDecoder;
> +    UInt32 mNumPackets;

nit: 2 space indent

@@ +241,5 @@
> +                                                  &decBuffer,
> +                                                  packets.get());
> +
> +    if (rv && rv != kNeedMoreData) {
> +      LOG("Error decoding audio stream: %x\n", rv);

"%#x", in case of radix-ambiguous (radically ambiguous?) values of rv.

@@ +246,5 @@
> +      break;
> +    }
> +    LOG("%d frames decoded", numFrames);
> +
> +    // If we decoded zero frames then AudiOConverterFillComplexBuffer is out

nit: "AudiO"

@@ +260,5 @@
> +    int64_t time = FramesToUsecs(mCurrentAudioFrame, rate).value();
> +    int64_t duration = FramesToUsecs(numFrames, rate).value();
> +
> +    LOG("pushed audio at time %lfs; duration %lfs\n",
> +         (double)time / USECS_PER_S, (double)duration / USECS_PER_S);

nit: paren alignment
Attachment #8460580 - Flags: review?(edwin) → review+
Thanks, Edwin. Updated patch to address nits, carrying forward r=edwin.

Chris, if you have time, I'd appreciate it if you could review this AAC decoder patch for interaction with the fmp4 code as well.
Attachment #8460580 - Attachment is obsolete: true
Attachment #8460990 - Flags: review?(cpearce)
Comment on attachment 8460990 [details] [diff] [review]
Part 2: OS X PlatformDecoderModule AAC v11

Review of attachment 8460990 [details] [diff] [review]:
-----------------------------------------------------------------

(discussed patch over the water-cooler...)
Attachment #8460990 - Flags: review?(cpearce)
Rebase h.264 decoder patch after changes from bug 1041860.

Carrying forward r=cpearce.
Attachment #8456569 - Attachment is obsolete: true
Attachment #8461896 - Flags: review+
Update AAC decoder patch based on conversation with :cpearce yesterday.

- Call InputExhausted() from Input() if we haven't yet produced output.
- Dispatch decoding to mTaskQueue.
- Implement Flush and Drain.
- Declare our input as ADTS; doesn't seem to matter.
- Fix leaks.
Attachment #8460990 - Attachment is obsolete: true
Attachment #8461901 - Flags: review?(cpearce)
Attachment #8461896 - Attachment description: Part 1: OS X PlatformDecoderModule h.264 v11 → Part 1: OS X PlatformDecoderModule h.264 v12
Comment on attachment 8461901 [details] [diff] [review]
Part 2: OS X PlatformDecoderModule AAC v12

Review of attachment 8461901 [details] [diff] [review]:
-----------------------------------------------------------------

Ship it!

Now, try exposing the MP4Reader in normal <video> on MacOSX, and see what tests fail. We want H.264/AAC on MacOSX in <video> ASAP. :)

::: content/media/fmp4/apple/AppleATDecoder.cpp
@@ +226,5 @@
> +  // Descriptions for _decompressed_ audio packets. ignored.
> +  nsAutoArrayPtr<AudioStreamPacketDescription>
> +      packets(new AudioStreamPacketDescription[MAX_AUDIO_FRAMES]);
> +
> +  // This API insists on having MP3 packets spoon-fed to it from a callback.

s/MP3//

Right?
Attachment #8461901 - Flags: review?(cpearce) → review+
Update patch to remove MP3 reference. Carrying forward r=edwin,cpearce

Thanks, Chris!
Attachment #8461901 - Attachment is obsolete: true
Attachment #8462104 - Flags: review+
Blocks: 1043695
Blocks: 1043696
Blocks: 1043710
Unfortunately it looks like I have to stub out the CoreMedia headers. Since we build on 10.7, where the headers are available, and previous debug builds on try had passed, I thought we could use the system copies.

It appears the tryserver's opt build (and the opt build ONLY) uses '-isysroot /Developer/SDKs/MacOSX10.6.sdk', which blocks access to those headers. I'll verify on my own 10.7 machine with the release mozconfig.

Or maybe the build peers will let me CPPFLAGS += -F/System/Library/Frameworks?

I don't understand the FFmpegRuntimeLinker failures yet.
I was able to reproduce the MacOS X opt build issue with browser/config/mozconfigs/macosx-universal/nightly and Xcode 4.2.1 with the MacOSX10.6.sdk installed on MacOS X 10.7.5.

The problem is that I was expecting to build against the CoreMedia.framework headers, but they're not available when we specify the 10.6 SDK.

Ted, are you ok with this solution? We dlopen() VideoToolbox and CoreMedia, so this doesn't break runtime support for old MacOS. For VideoToolbox we include our own stub header with the four functions we need, but there are a lot more for CoreMedia, so instead I generated headers to proxy the corresponding headers from the system copy of the framework. CoreMedia is available on 10.7 and later, so they should always be available at build time.
Attachment #8463061 - Flags: review?(ted)
Attachment #8463061 - Flags: review?(cpearce)
Some of the debug code generates unused-variable warnings in non-debug builds.

Move debug-only variables inside and #ifdef, and report lock failures to the caller instead of just asserting, which is also debug-only.
Attachment #8463063 - Flags: review?(cpearce)
Turns out we can't interchange uint32_t and UInt32 after all. It works in 64-bit builds, but on 32-bit the compiler wants a cast from one pointer type to the other.
Attachment #8463064 - Flags: review?(cpearce)
With these additional patches the feature is green on OSX try. FFmpeg problems still todo.

https://tbpl.mozilla.org/?tree=Try&rev=e14fbcc292d6
Comment on attachment 8463061 [details] [diff] [review]
Part 1.1: Add wrappers to include system CoreMedia headers

Review of attachment 8463061 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/media/fmp4/apple/AppleCMLinker.h
@@ -9,4 @@
>  
>  extern "C" {
>  #pragma GCC visibility push(default)
> -#include <CoreMedia/CoreMedia.h>

Why can't you just #include "/System/Library/Frameworks/CoreMedia.framework/Headers/CoreMedia.h" here? Why do you need the wrapper file?
Attachment #8463063 - Flags: review?(cpearce) → review+
Attachment #8463064 - Flags: review?(cpearce) → review+
(In reply to Chris Pearce (:cpearce) from comment #50)

> Why can't you just #include
> "/System/Library/Frameworks/CoreMedia.framework/Headers/CoreMedia.h" here?
> Why do you need the wrapper file?

Some of the CoreMedia sub-headers we need include each other, relying on the compiler to mangle <CoreMedia/Foo.h> -> "CoreMedia.framework/Headers/Foo.h". The wrappers were the best way I could come up with to re-create that mangling.

We could add /System/Library/Frameworks to the framework search path, but that will affect all includes, which could create version skew issues with the rest of the build, so I don't think that's a good idea.
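To make that concrete, each generated wrapper is just a one-line forward to the system copy (illustrative name and path, shown here for a hypothetical CMTime.h):

```cpp
// In-tree proxy header: lets <CoreMedia/CMTime.h> style cross-references
// resolve without adding /System/Library/Frameworks to the global
// framework search path.
#include "/System/Library/Frameworks/CoreMedia.framework/Headers/CMTime.h"
```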
Comment on attachment 8463061 [details] [diff] [review]
Part 1.1: Add wrappers to include system CoreMedia headers

Review of attachment 8463061 [details] [diff] [review]:
-----------------------------------------------------------------

Is there a reason we wouldn't just switch to using the 10.7 SDK? That seems like less hassle. We've done that in the past (use an SDK higher than the lowest OS version we're targeting), and we do the same thing on Windows.
Attachment #8463061 - Flags: review?(ted)
Depends on: 1045128
If we can bump the target SDK version for release builds to 10.7 we can just include the framework headers at build time as before.

Add a configure check and report the required SDK version if it's not available.
Attachment #8463061 - Attachment is obsolete: true
Attachment #8463061 - Flags: review?(cpearce)
Attachment #8463579 - Flags: review?(ted)
Attachment #8463579 - Attachment description: Part 5: Configure check for CoreMedia → Part 4: Configure check for CoreMedia
Fixes the FFmpeg/OMX build problem. That was a more obscure error message than I've seen in a while.
Attachment #8463653 - Flags: review?(cpearce)
Attachment #8463653 - Flags: review?(cpearce) → review+
All patches from this bug, plus the sdk bump from bug 1045128.

https://tbpl.mozilla.org/?tree=Try&rev=e0b90861b228
Attached patch Fix compilation on OS X 10.9 (obsolete) — Splinter Review
Bug 941296 - Allow compilation on 10.9
Attachment #8464470 - Flags: review?(giles)
Assignee: giles → jyavenard
Status: NEW → ASSIGNED
never wanted it!
Assignee: jyavenard → giles
Comment on attachment 8464470 [details] [diff] [review]
Fix compilation on OS X 10.9

Review of attachment 8464470 [details] [diff] [review]:
-----------------------------------------------------------------

Thanks for the patch. You forgot to include the VideoToolbox.h -> MozVideoToolbox.h change, and there's a spurious #endif removal.

This bug is too long already. Please file a new bug with a copy of the build error; we can talk about it there. If we really don't build on 10.9 that would be bad. Please also elaborate on what you mean by issue 1. We deliberately don't link CoreVideo because it's not available prior to 10.8 and we build releases on 10.7.
Attachment #8464470 - Flags: review?(giles) → review-
Attached patch Fix compilation on OS X 10.9 (obsolete) — Splinter Review
Bug 941296 - Fix compilation on OS X 10.9
Attachment #8464610 - Flags: review?(giles)
Assignee: giles → jyavenard
Comment on attachment 8464610 [details] [diff] [review]
Fix compilation on OS X 10.9

wrong bug #
Attachment #8464610 - Attachment is obsolete: true
Attachment #8464610 - Flags: review?(giles)
Assignee: jyavenard → giles
Depends on: 1046005
Attachment #8463579 - Flags: review?(ted) → review+
Fix a problem with non-unified builds.
Attachment #8464862 - Flags: review?(cpearce)
Need something for NS_IsMainThread() as well.
Attachment #8464862 - Attachment is obsolete: true
Attachment #8464862 - Flags: review?(cpearce)
Attachment #8464868 - Flags: review?(cpearce)
One more change.
Attachment #8464868 - Attachment is obsolete: true
Attachment #8464868 - Flags: review?(cpearce)
Attachment #8464882 - Flags: review?(cpearce)
Attachment #8464882 - Flags: review?(cpearce) → review+
Comment on attachment 8464470 [details] [diff] [review]
Fix compilation on OS X 10.9

This was updated and landed in bug 1046005.
Attachment #8464470 - Attachment is obsolete: true
So this bug's patch now requires the 10.7 SDK or higher.  I just found out about it when I could no longer build the tree :-(

It's *possible* that doing this will cause us no major trouble on 10.6, but we need to keep our eyes open.
About a third of our Mac users are still on SnowLeopard (10.6).
(In reply to Steven Michaud from comment #70)

> It's *possible* that doing this will cause us no major trouble on 10.6, but
> we need to keep our eyes open.

Indeed. Bug 1045128 is probably a better place to track the issue. Or new bugs, of course. :)

Thanks for posting to dev-platform.
Blocks: 1046301
Depends on: 1047486
Okay, Aurora 34, I shall wait for ya :-D

This will enable H.264 on Mavericks, right? 

If yes, I'll probably be one step closer to ditching the other things for watching Vimeo and TED!
Yes, this should enable h.264+aac in mp4 playback in the <video> tag on mavericks and some other versions.
(In reply to Ralph Giles (:rillian) from comment #76)
> Yes, this should enable h.264+aac in mp4 playback in the <video> tag on
> mavericks and some other versions.

Can you elaborate on which versions you are talking about? It seems to be working fine on Mavericks, but it doesn't seem to work on Yosemite beta 2 - well, the audio plays fine, but the video doesn't. Is this a known issue?
Thanks for the update Ralph.
Can someone please push it to the Nightly? The 34.0a1 dmg does not seem to have this patch.
(In reply to Christian Aarfing from comment #77)

> Can you elborate on which versions you are talking about. It seems to be
> working fine on Mavericks, but it doesn't seem to work on Yosemite beta 2 -
> well the audio plays fine, but the video doesn't. Is this a known issue?

It works for me on MacOS X 10.7 through 10.10. The video failure on Yosemite is bug 1054789.

> Can someone please push it to the Nightly? The 34.0a1 dmg does not seem to have this patch.

It's been in nightly for a couple of weeks behind a pref. I just flipped the pref tonight in bug 1043696, so it will be available in Nightly probably on Friday.
(In reply to Ralph Giles (:rillian) from comment #79)
> (In reply to Christian Aarfing from comment #77)
> 
> > Can you elborate on which versions you are talking about. It seems to be
> > working fine on Mavericks, but it doesn't seem to work on Yosemite beta 2 -
> > well the audio plays fine, but the video doesn't. Is this a known issue?
> 
> It works for me on MacOS X 10.7 through 10.10. The video failure on Yosemite
> is bug 1054789.
> 
Ah ok, thanks. Great job by the way :)
Ok, just tested the latest Nightly on Mac OS X Mavericks 10.9.4 and it works great!
Thanks to Giles and everyone else who worked on it :)


Seeing this in about:plugins

OpenH264 Video Codec provided by Cisco Systems, Inc.
    File: 1.0
    Path: /Users/MrBanks/Library/Application Support/Firefox/Profiles/teo7jfff.default/gmp-gmpopenh264/1.0
    Version: 1.0
    State: Enabled
    Play back web video and use video chats.
Tested the nightly on 10.10 DP6 and it happily plays videos on youtube and vimeo, great job! Unfortunately it fails to play on SVT Play, which works in Safari: http://www.svtplay.se/video/318873/testvideo-synlig-i-hela-varlden

(Not sure if this is the correct bug to report playback results in, feel free to redirect me)
(In reply to Per Olofsson from comment #82)
> Tested the nightly on 10.10 DP6 and it happily plays videos on youtube and
> vimeo, great job! Unfortunately it fails to play on SVT Play, which works in
> Safari: http://www.svtplay.se/video/318873/testvideo-synlig-i-hela-varlden
> 
> (Not sure if this is the correct bug to report playback results in, feel
> free to redirect me)

For that one, I also get "install flash" on Linux where I have had h264 playback for a while. They must be doing something like "if firefox then flash", like vimeo did for a while.
(In reply to Per Olofsson from comment #82)
> Tested the nightly on 10.10 DP6 and it happily plays videos on youtube and
> vimeo, great job! Unfortunately it fails to play on SVT Play, which works in
> Safari: http://www.svtplay.se/video/318873/testvideo-synlig-i-hela-varlden
> 
> (Not sure if this is the correct bug to report playback results in, feel
> free to redirect me)

Windows Nightly 2014-08-29 (on Windows 8.1) doesn't play it as well when Flash is disabled.
(In reply to Per Olofsson from comment #82)
> Tested the nightly on 10.10 DP6 and it happily plays videos on youtube and
> vimeo, great job! Unfortunately it fails to play on SVT Play, which works in
> Safari: http://www.svtplay.se/video/318873/testvideo-synlig-i-hela-varlden
> 
> (Not sure if this is the correct bug to report playback results in, feel
> free to redirect me)

I'm also on 10.10 DP6 but I only get a black screen (audio works) when trying to playback videos (h264) on youtube. Do you use a special setting?
(In reply to beingalink from comment #85)
> (In reply to Per Olofsson from comment #82)
> > Tested the nightly on 10.10 DP6 and it happily plays videos on youtube and
> > vimeo, great job! Unfortunately it fails to play on SVT Play, which works in
> > Safari: http://www.svtplay.se/video/318873/testvideo-synlig-i-hela-varlden
> > 
> > (Not sure if this is the correct bug to report playback results in, feel
> > free to redirect me)
> 
> I'm also on 10.10 DP6 but I only get a black screen (audio works) when
> trying to playback videos (h264) on youtube. Do you use a special setting?

I am on 10.9.4 and having the same issue - black screen but sound on YouTube.
> I'm also on 10.10 DP6 but I only get a black screen (audio works) when
> trying to playback videos (h264) on youtube. Do you use a special setting?

No special setting, freshly deployed OS and virgin user account.
Do the patches for this bug ever result in the OpenH264 plugin getting used?  Or are they only for using Apple's built-in H.264 support?
Only Apple's built-in H.263 and AAC support. This bug has nothing to do with OpenH264.
See Also: → 1062654
(In reply to Ralph Giles (:rillian) from comment #89)
> Only Apple's built-in H.263 and AAC support. This bug has nothing to do with
> OpenH264.

Did you mean h.264 instead of h.263?
I did, sorry. We don't support h.263 in <video>.
I'm not sure if this is the right place, but this link pointed me to this bug: https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats

I'm having some issues with MP4 files that we are using in a current project. Our videos have a solid white background, but in OS X Nightly (as of 2014-10-02) the background comes out as light gray. The same video works correctly in Chrome and Safari.

Screenshot of how it looks in Chrome: http://cl.ly/XqAI

Screenshot of how it looks in Nightly: http://cl.ly/XqDx

These screenshots are of the attached test-case.
Hi Johan. Thanks for the report. That sounds like an issue. I wonder if you're getting video-range instead of full-range output. What MacOS X version is this?

In general it's best to open a new bug for issues you find. Do you mind doing that? Click on the 'new' link at the top of the page and file it under the Core 'Video/Audio' component.
See Also: → 1077277