Closed Bug 422540 Opened 16 years ago Closed 12 years ago

GStreamer backend for HTML5 video element

Categories

(Core :: Audio/Video, enhancement, P1)


Tracking


RESOLVED FIXED
mozilla14
Tracking Status
fennec 1.0- ---

People

(Reporter: cajbir, Assigned: alessandro.d)

References

Details

(Keywords: mobile)

Attachments

(6 files, 57 obsolete files)

14.83 KB, text/plain
Details
83.21 KB, text/plain
Details
112.26 KB, text/plain
Details
11.11 KB, text/plain
Details
126.11 KB, image/png
Details
48.41 KB, patch
Details | Diff | Splinter Review
Bug 382267 is being restructured to not contain implementations of decoders for
video codecs and to allow different backend decoder implementations. 

Implement a backend that uses GStreamer to decode audio/video, allowing support of all the video formats that the user has GStreamer plugins for.
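For orientation, the amount of GStreamer machinery needed for basic playback is small. A minimal, self-contained sketch (standard GStreamer API; the file URI is just an example, and this is not the patch's code):

  #include <gst/gst.h>

  int main(int argc, char *argv[])
  {
    gst_init(&argc, &argv);

    // playbin resolves demuxers/decoders from whatever plugins are installed,
    // which is the behaviour a GStreamer backend would expose to <video>.
    GstElement *play = gst_element_factory_make("playbin", "play");
    g_object_set(play, "uri", "file:///tmp/example.ogv", NULL);
    gst_element_set_state(play, GST_STATE_PLAYING);

    // Block until end-of-stream or an error.
    GstBus *bus = gst_element_get_bus(play);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
    if (msg)
      gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(play, GST_STATE_NULL);
    gst_object_unref(play);
    return 0;
  }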
Attached patch First cut at gstreamer playback (obsolete) — Splinter Review
You'll need to apply the patch from bug 382267 first. This applies over the top of that. Once applied, <video> will work with any video format that GStreamer has plugins for. Tested with the standard Ogg GStreamer plugin and the Fluendo H.264 plugin.

Seeking works via changing the currentTime property of the video object, but there are no standard controls or progress events for it yet.
Updated to work with latest bug 382267 patch
Attachment #313256 - Attachment is obsolete: true
marking wanted1.9.1? to get this in the triage queue.  If this needs to be blocking1.9.1?, please mark it as such.
Flags: wanted1.9.1?
Priority: -- → P1
Flags: wanted1.9.1? → wanted1.9.1+
Updated to work with changes in bug 382267
Attachment #322882 - Attachment is obsolete: true
Hi, Double
  It is really interesting work! I applied this version 3 patch on Windows, built it in Cygwin, then installed the corresponding GStreamer components, and it works! Thanks for your hard work! But there is a big performance problem on the Win32 platform: I found the CPU load is very heavy even when I play a very low-density video, and I have no idea about the performance on Linux.
  I read and tested the code and found it is the frame->Invalidate(r, PR_FALSE) call that consumes most of the CPU. I wonder if there is some technique that can improve the performance? I am not really familiar with Mozilla's architecture and have just started to learn it, so can you give me some clues about the performance?

Thanks!
Component: DOM: HTML → DOM: Core & HTML
Any plans on committing/enabling for Linux nightlies now that bug 382267
patches are in? Audio for me right now is very messed up, and I suspect this patch will sort those issues out. Thanks!
Component: DOM: Core & HTML → Video/Audio
QA Contact: general → video.audio
This updates to trunk. Since trunk now has the Ogg decoder some thought needs to go into how to support multiple backends. I punt on this issue for this patch and always instantiate gstreamer for <video src="...">, but instantiate Ogg if there is a <source> with a type of application/ogg, etc.
Attachment #327910 - Attachment is obsolete: true
patch fails to build here with
nsHTMLMediaElement.cpp: In member function ‘nsresult nsHTMLMediaElement::PickMediaElement(nsAString_internal&)’:
nsHTMLMediaElement.cpp:599: error: cannot allocate an object of abstract type ‘nsGStreamerDecoder’
../../../../dist/include/content/nsGStreamerDecoder.h:65: note:   because the following virtual functions are pure within ‘nsGStreamerDecoder’:
../../../../dist/include/content/nsVideoDecoder.h:77: note: 	virtual nsIPrincipal* nsVideoDecoder::GetCurrentPrincipal()
nsHTMLMediaElement.cpp:639: error: cannot allocate an object of abstract type ‘nsGStreamerDecoder’
../../../../dist/include/content/nsGStreamerDecoder.h:65: note:   since type ‘nsGStreamerDecoder’ has pure virtual functions
make[6]: *** [nsHTMLMediaElement.o] Error 1
Attached patch Update to trunk (obsolete) — Splinter Review
Update to build and run on trunk. This patch needs some work, which it'll be getting post 3.1.
Attachment #336911 - Attachment is obsolete: true
tracking-fennec: --- → 1.0+
Attached file Work in progress on GStreamer (obsolete) —
Please contact me if you are going to look at this; there are a number of issues that you should be aware of - it is "work in progress".
Same comment as for the previous one: please contact me if you want to have a look at this; there are several issues :)
Attachment #376425 - Attachment is obsolete: true
Is there a working try server build?
(In reply to comment #13)
> Is there a working try server build?

Sorry, no try server build.  The status of the change is that it has value from a source code perspective, not as code that actually runs in a real device.
Assignee: chris.double → mkristoffersen
Attachment #376428 - Attachment is patch: true
If you actually try to watch a video with this code, your browser might hang
This is still mostly interesting from a source code point of view - The code doesn't do much good in the target.
Attachment #345597 - Attachment is obsolete: true
Attachment #376428 - Attachment is obsolete: true
Attachment #380178 - Attachment is obsolete: true
(In reply to comment #18)
> Created an attachment (id=381129) [details]
> Dump where audio and video is played back when running fennec on the pc
> (doesn't work in target)
> 
> This is still mostly interesting from a source code point of view - The code
> doesn't do much good in the target.

Oh yes, and still only expect a single video to be played between each launch of Fennec.

And... add "ac_add_options --enable-gstreamer" to your mozconfig in order to enable the code - thanks to blassey for providing the build system fix :)
This version has partial support for playback in target; the usual stuff (no audio in the target version, only one playback per browser launch, etc.) applies.
Attachment #381476 - Attachment is patch: true
Attachment #381476 - Attachment mime type: application/octet-stream → text/plain
The usual stuff applies - e.g. no playback in target, hard-coded to start playback - work in progress
Attachment #381129 - Attachment is obsolete: true
Attachment #381476 - Attachment is obsolete: true
Attached patch Latest GStreamer (obsolete) — Splinter Review
Updated patch so it's easier to apply.
Thanks for removing bitrot Mike.
Attachment #382679 - Attachment is obsolete: true
Fubar'd the last patch; this one should apply everything needed now, and works against trunk.
Attachment #382994 - Attachment is obsolete: true
Attachment #383183 - Attachment is patch: true
Attachment #383183 - Attachment mime type: application/octet-stream → text/plain
This patch is functionally like the others, except an extra buffer has been added to the audio stream.
The stream input and input buffer have been collected in a bin.
The audio and video sinks have moved to their own individual bins (it will now be easier to add a volume control element, etc.).
The code has been split up into several individual classes to improve sanity and structure - making it less scary to fiddle with the individual pieces.

Work in progress: I still need to eliminate about 7 functions and move about 16 before the refactoring is complete.

Unless you have a need to actually modify the GStreamer integration code, please continue to use patch 383183 - I'll post an updated version to replace that patch once the refactoring is complete and I have moved to TOT.
Attachment #383996 - Attachment is patch: true
Attachment #383996 - Attachment mime type: application/octet-stream → text/plain
Added buffer for video stream, fixed ref-count issue for GStreamer elements
Attachment #383183 - Attachment is obsolete: true
Attachment #383996 - Attachment is obsolete: true
Attached patch Now with all the files (obsolete) — Splinter Review
Forgot to add the new files
Attachment #384400 - Attachment is obsolete: true
applying the latest patch I get:

applying diff_nsMediaStream
unable to find 'content/media/video/public/Makefile.in' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/public/Makefile.in.rej
unable to find 'content/media/video/public/nsMediaDecoder.h' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/public/nsMediaDecoder.h.rej
unable to find 'content/media/video/src/Makefile.in' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/src/Makefile.in.rej
patch failed, unable to continue (try -v)
content/media/video/public/Makefile.in: No such file or directory
content/media/video/public/nsMediaDecoder.h: No such file or directory
content/media/video/src/Makefile.in: No such file or directory
content/media/video/src/Makefile.in not tracked!
content/media/video/public/nsMediaDecoder.h not tracked!
content/media/video/public/Makefile.in not tracked!
patch failed, rejects left in working dir
The content/media directory was restructured in bug 499880. This landed on trunk yesterday. The files have changed directory.
I'm testing a patch now to update to tip. I have created a gstreamer directory at content/media/gstreamer; it seemed to fit with the scheme you guys are adopting. Let me know if that's utter nonsense and I'll refactor. Otherwise I'll update with a patch when I know it's building/running right.
Attached patch Updated to tip (obsolete) — Splinter Review
I have updated the patch. Check my previous comment regarding directory structure. Not sure if I made the right choice here. Otherwise this builds.
Attachment #384974 - Attachment is obsolete: true
Directory structure looks good.
Fits the same changelist as the previous patch. It hasn't been tested on anything but a single MPEG, but is nevertheless an important step :) (and btw it now skips frames, even though the algorithm isn't quite right yet)

I have noticed a few times that the audio might start to play back for 0..n sec before the first frame of the video is shown... and btw give it time, it can look stuck at times, and don't try it with anything but MPEG version 1 or 2 (or you will be stuck forever and ever and....)
Attachment #385894 - Attachment is obsolete: true
Note that audio is down (again) on the N810 with this patch
Attachment #388236 - Attachment is obsolete: true
No longer blocks: 451674
Attachment #390515 - Attachment is patch: true
Attachment #390515 - Attachment mime type: application/octet-stream → text/plain
Attachment #390515 - Flags: review?(chris.double)
NOTE: The review requested should be seen as a preliminary one; Chris has received an e-mail with the details (known issues and specific questions).
This patch is not compliant with Mozilla coding standards - especially the GStreamer sink element is still close to its original non-Mozilla form. I would like to keep it this way for a little while to limit the merging task for me and the GStreamer guy who created the original element - when the element is stable I'll convert it to Mozilla style - might even consider removing the goto's in there...

I didn't mark this patch as obsoleting the previous one, as the previous one is significantly more stable than this one - this patch earns its right as it has the correct video sink element and a significantly higher framerate on the N810.
A lot of mode diffs in this diff - if anyone knows how to get rid of them, let me know :)
Attachment #390515 - Attachment is obsolete: true
Attachment #394590 - Attachment is obsolete: true
Attachment #390515 - Flags: review?(chris.double)
The patch is working as expected on the N810, but there is something fundamentally wrong on the N900 (as in "video doesn't work", at least on my device).
Attachment #400466 - Attachment is obsolete: true
Attachment #403317 - Flags: review?(chris.double)
Comment on attachment 403317 [details] [diff] [review]
Fits changeset: 33238:e915fafc9ba5 (Morning of Sep 28)

Not expected to pass review; as agreed, the review is requested to get feedback on what needs to be fixed before a review can be approved for an initial submit.
Attached patch Suspend/resume partly working (obsolete) — Splinter Review
Attachment #403317 - Attachment is obsolete: true
Attachment #404113 - Flags: review?
Attachment #403317 - Flags: review?(chris.double)
Attachment #404113 - Flags: review? → review?(chris.double)
(In reply to comment #45)
> Created an attachment (id=404113) [details]
> Suspend/resume partly working

btw - sorry for the mode changes in the patch, they shouldn't be there
Fits change: 33827:f1644b493ff5
Still has an issue resuming a specific video, where the audio playback fails during normal playback when I test on the PC.
Attachment #404113 - Attachment is obsolete: true
Attachment #404113 - Flags: review?(chris.double)
Attachment #406209 - Flags: review?(chris.double)
The patch in attachment 406209 [details] [diff] [review] fails to apply.
There seem to be a bunch of whitespace only changes in files like nsCanvasRenderingContext2D.cpp. Can you generate a patch without these whitespace diffs to make reviewing easier?
Comment on attachment 406209 [details] [diff] [review]
Now with suspend-resume working for supported videos

// Part 1
+
+  // Check if any decoders knows about the mime type

Fix grammar

+  // We have not found a decoder yet, so make sure we don't think so
   
This comment isn't very well worded. Can you change it?

diff --git a/content/media/gstreamer/Makefile.in b/content/media/gstreamer/Makefile.in
+++ b/content/media/gstreamer/Makefile.in
+# Contributor(s):
+#  Chris Double <chris.double@double.co.nz>

Add your name as a contributor.

+PRBool nsGStreamerAudioBin::Create()
+{
  ...
+
+  mAudioBin = GST_BIN_CAST(gst_bin_new("mAudioBin"));
+

This can fail and mAudioBin is later used a few lines down in gst_object_sink. Detect and handle the failure here. Same with the following gst_element_factory_make calls.
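Roughly this shape of early-out is what I mean (sketch only, not code from the patch):

  mAudioBin = GST_BIN_CAST(gst_bin_new("mAudioBin"));
  if (!mAudioBin) {
    // creation failed; bail out before anything dereferences the null pointer
    return PR_FALSE;
  }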

+  // Create a buffer for the audio => Running in separate thread

What does the 'running in separate thread' mean? How does the call to gst_element_factory_make cause it to run in a different thread?

+  // Find the best possible sink
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("gconfaudiosink", "mAudioSink");
+  }
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("pulsesink", "mAudioSink");
+  }
...etc...

Can we make these sinks, and the order they're tried, a pref? Maybe a comma separated list of sink names in the order they should be tried.
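Something like this, say (sketch only; the pref name and how prefValue reaches this code are left out/assumed here):

  // prefValue is assumed to hold e.g. "gconfaudiosink,pulsesink,alsasink,autoaudiosink"
  gchar **names = g_strsplit(prefValue, ",", 0);
  for (gchar **name = names; *name && !mAudioSink; ++name) {
    // Try each sink in the order listed, stop at the first one that exists.
    mAudioSink = gst_element_factory_make(*name, "mAudioSink");
  }
  g_strfreev(names);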

+  // Now stuff gets a little complicated - cause on the N810 our only choice for
+  // audio sink is the "autoaudiosink" which we at this time can't add to
+  // our bin, as we need to set it's filter property before we enter the pause
+  // state.

"its" not "it's"

+  //
+  // So that's the reason we in the below code treat mAudioSink a litle special

little

+  // Take ownership of the elements
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBuffer)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioConvert)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioSink)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioVolume)));

Can we take ownership of these in the same place that we create them? 

+          guint64 max_buffer_time_in_ns   = 10;     // 10 seconds
+                  max_buffer_time_in_ns  *= 1000;
+                  max_buffer_time_in_ns  *= 1000;
+                  max_buffer_time_in_ns  *= 1000;
+          guint64 min_threshold_in_ns   = 2;        // 2 seconds
+//                  min_threshold_in_ns  *= 1000;
+//                  min_threshold_in_ns  *= 1000;
+//                  min_threshold_in_ns  *= 1000;

Why are these 3 lines for  min_threshold_in_ns commented out, but the max_buffer_time_in_ns is not?

+    // TODO: Do error handling / cleanup

Needs to be done for review...

+void nsGStreamerAudioBin::Destroy()
+{
+  if (mAudioBin) {
+    gst_object_unref(GST_OBJECT(mAudioBin));
+  }
+  if (mAudioSink) {
+    gst_object_unref(GST_OBJECT(mAudioSink));
+  }
+  if (mAudioConvert) {
+    gst_object_unref(GST_OBJECT(mAudioConvert));
+  }
+  if (mAudioBuffer) {
+    gst_object_unref(GST_OBJECT(mAudioBuffer));
+  }
+  if (mAudioVolume) {
+    gst_object_unref(GST_OBJECT(mAudioVolume));
+  }

gst_object_unref is safe to be called with a null object; you don't need the tests.

+    // Try to move the audio sink into the ready state
+    gst_element_set_state(mAudioSink, GST_STATE_PAUSED);

Handle or log the failure if the state fails to be set.

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the audio bin
+

Add comments describing what components the audio bin is composed of and how they are linked together.
+#ifndef nsGStreamerAudioPropertiesInterface_h__
+#define nsGStreamerAudioPropertiesInterface_h__
+
+////////////////////////////////////////////////////////////////////////////////
+// Interface class to modify audio properties
+

Describe in more detail what this class is used for. Who should inherit from it? Who calls it? Does anyone maintain a reference to it - if so, who controls ownership of it. Will an instance of this class ever have delete called on it from an nsGStreamerAudioPropertiesInterface pointer? If so, does it need a virtual destructor?

+#ifndef nsGStreamerConfig_h__
...
+// Define the max size of video elements in pixels
+#define MAX_VIDEO_WIDTH 1024
+#define MAX_VIDEO_HEIGHT 1024

Where did these numbers come from? Does this mean 1080P HD video won't play?

+// Define how much data we should try to read at a time
+#define GST_BUFFER_READ_CHUNK_SIZE 1024*16

What's the origin of this size? Maybe comment about why you chose that value.

From nsGStreamerDecodeBin.cpp:

+nsGStreamerDecodeBin::nsGStreamerDecodeBin()
+  : mDecodeBin(nsnull)
+{
+}

Why is the other member variable, mPadDest, not set to an initial value?

+PRBool nsGStreamerDecodeBin::Create(nsGStreamerDecodeBinPadInterface *aPadDest)
+{
...
+  // If we got the decode bin, then configure it
+  if (mDecodeBin) {
+    // The decodebin won't make any output pads, until it knows something about
+    // it's input data since it only at that time can determine the kind of
+    // output, if any, it can generate

"its" not "it's"

+    // Constructor
+    // Destructor

Comments like this aren't really needed. It's obvious when it's a constructor or a destructor.
// Part 3
+++ b/content/media/gstreamer/nsGStreamerDecoder.h
...
+ * Contributor(s):
+ *  Chris Double <chris.double@double.co.nz>

Add your name as a contributor.

+class nsGStreamerDecoder : public nsMediaDecoder
...
+  // Function to create "new" an instance

This comment needs rephrasing to be more readable.

+  // It's done in this way to allow GStreamer data types in the "real" class
+  static nsGStreamerDecoder *makeNewInstance();

I'm not sure what you mean by "to allow GStreamer data types in the 'real' class". Can you expand on this?

+
+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of communication with the media element

Go into more detail about why this class exists. I presume it's something to do with thread safety?

+    // Constructor
+    // Virtual destructor

Comments like this aren't needed.

Add comments to the methods to explain what threads they can safely be called on. Add assertions to the methods that are main thread only to ensure they are called on the main thread only. eg:

NS_ASSERTION(NS_IsMainThread(), "Should be called on main thread");


+    nsHTMLMediaElement* mElement;

Add a comment about what threads can use mElement (ie. main thread only?). This is not a counted pointer - how is ownership of mElement maintained? For an example of the type of comment needed, see the 'mElement' comment in nsMediaDecoder.h which describes the thread usage and why it's not reference counted.
+++ b/content/media/gstreamer/nsGStreamerInputBin.cpp
@@ -0,0 +1,268 @@
+nsGStreamerInputBin::nsGStreamerInputBin()
+  : mInputBin(nsnull),
+    mStreamSrc(nsnull),
+    mInputBuffer(nsnull),
+    mElementWrapper(nsnull)
+{
+}

mBufStatusInterface is missing from the list of things set to nsnull.

+PRBool nsGStreamerInputBin::Create(nsMediaStream *aMediaStream,
+    nsGStreamerInputBufferStatusInterface* aBufStatusInterface)
+{

Everything I've said about previous ::Create implementations should be addressed here too.

+      && mStreamSrc
+      && mStreamSrc

mStreamSrc listed twice.

+    const guint max_buffer_buffers = 0;
+    const guint max_buffer_size_in_bytes = 16*1024;   // 512*1024
+    const guint64 max_buffer_time_in_ns  = 0;
+    const guint min_threshold_in_bytes = 0;           // 512*1024
+    const guint64 min_threshold_in_ns = 0;
+    const guint min_threshold_buffers = 0;

How did you decide on what numbers to use here? 

+    gpointer gvStatusInterface = (nsGStreamerStreamStatusInterface *)this;

Use C++ casts.

diff --git a/content/media/gstreamer/nsGStreamerInputBin.h b/content/media/gstreamer/nsGStreamerInputBin.h
+
+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the input buffer
+

Expand on this. Where is it used, what for. How does it fit into the overall architecture of the GStreamer decoder.

+    // Function to create an inputbuffer
+    // aChannel [in]: channel for incoming data
+    // aStreamListener [out]: pointer to IStreamListener
+    PRBool Create(nsMediaStream *aMediaStream,
+                  nsGStreamerInputBufferStatusInterface* aBufStatusInterface);

The comment makes no sense. Where is 'aChannel' and 'aStreamListener'?

+
+    nsGStreamerInputBufferStatusInterface* mBufStatusInterface;

Needs a comment to say what this is for, who owns it, what threads it can be used on, etc.

+    //Takes care of element com.
+    nsRefPtr<nsGStreamerElementWrapper> mElementWrapper;

What does this mean? What is 'com'?

+    // Called to inform that EOF has been reached
+    virtual void EOFReached();

Called by whom?

+++ b/content/media/gstreamer/nsGStreamerInputBufferStatusInterface.h
+
+    // Called to inform state machine of buffer status

What state machine?

+++ b/content/media/gstreamer/nsGStreamerLib.cpp
@@ -0,0 +1,243 @@
+
+void nsGStreamerLib::BringGStreamerDown()
+{
+  switch (mGStreamerLibStatus)

Reading mGStreamerLibStatus without taking the mGStreamerMutex lock. The switch also reads mNumberOfInstances.

+  // The gstreamer lib is now down
+  mGStreamerLibStatus = GSTREAM_GDOWN_LIB_DOWN;

This should be protected by the lock?

+////////////////////////////////////////////////////////////////////////////////
+// Class that handels the GStreamer library,
+// registration of plugins etc. MT safe

This class needs more comments to say which methods must have the lock obtained before calling them. For example, the implementation obtains the lock then calls BringGStreamerDown(). The comment for BringGStreamerDown should say that it must be called with the lock obtained. Ideally the implementation would assert this fact.

+
+    // The last thing we should do, befor termination of the program
+    // is to terminate the GStreamer library
+    class Terminator
+    {
+      public:
+        ~Terminator();
+    };
+
+    static Terminator terminator;

Why not do this in the destructor of nsGStreamerLib instead of having the helper object?

+++ b/content/media/gstreamer/nsGStreamerLink.cpp

+    // Only shut it down ones

"once" not "ones"

+////////////////////////////////////////////////////////////////////////////////
+// Class that links everything GStreamer related together
+
+class nsGStreamerLink : public nsGStreamerDecoder

Why is this a derived class of nsGStreamerDecoder? Why not put all this in nsGStreamerDecoder itself? Nothing else derives from nsGStreamerDecoder?

Please describe what members and methods are main thread only, and what can be called on any thread. Describe thread safety requirements.
+++ b/content/media/gstreamer/nsGStreamerPipeline.cpp
+
+nsGStreamerPipeline::nsGStreamerPipeline()
+  : mStreamPipeline(nsnull),
+    mAudioPadSource(nsnull),
+    mAudioPadSink(nsnull),
+    mVideoPadSource(nsnull),
+    mVideoPadSink(nsnull)
+{
+}

Some member variables are not being set (eg. mStreamerDecoder)

+gboolean nsGStreamerPipeline::BusCallBack(GstBus *aBus,
+                                          GstMessage *aMessage,
+                                          nsGStreamerPipeline *aMe)
+{

Is this called on a non-main thread? If so how is thread safety assured when using member variables like:

+      aMe->mGStreamerState.PositionAtEOF();

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the GStreamer pipeline
+

Do any of the static member functions in this class get called on different threads? If so, how is access to methods/members done safely on the different threads? Please add comments about which members are main thread only or safe to read and write on other threads.

+++ b/content/media/gstreamer/nsGStreamerPipelineState.h
+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the GStreamer pipeline state

Expand on where this class is used, what it is for, what threads it can safely be used on, etc. Comment all methods/members that can or cannot be called on other threads.

+    // Destructor
+    ~nsGStreamerPipelineState();

Does this need to be virtual? Is it ever deleted using a pointer to one of its base classes?

+    // Tell state machine if it should play or stop(pause)
+    // bPlay = PR_TRUE => playback starts when conditions are right
+    // bPlay = PR_FALSE => playback is paused and will not start
+    virtual void PlayWhenPossible(PRBool aShouldPlay);

bPlay should be aShouldPlay?

+++ b/content/media/gstreamer/nsGStreamerVideoBin.cpp
+
+PRBool nsGStreamerVideoBin::Create(
+    nsGStreamerVideoPrerollCompleteInterface *aVideoPrerollComplete)
+{

Everything I've said in other ::Create methods should be done here too.

+    aMe->mAspect = (pixel_aspect[0] == 0) ?
+      ((gdouble)aMe->mWidth/(gdouble)aMe->mHeight) :
+      ((gdouble)pixel_aspect[0] /
+       (gdouble)pixel_aspect[1] *
+       (gdouble)aMe->mWidth/(gdouble)aMe->mHeight);
...
+    aMe->mFramerate = (gdouble)pixel_aspect[0]/(gdouble)pixel_aspect[1];

Can pixel_aspect[1] ever be zero? If so, guard against division by zero.

+++ b/content/media/gstreamer/nsGStreamerVideoBin.h
+
+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the video sink bin
+

Same documentation requirements as mentioned previously.

+    // Mutex to control the access to mIsScreenPreviouslyInvalidated
+    GMutex *mScreenInvalidMutex;
+
+    // Condition that is set when screen has been invalidated
+    GCond *mScreenInvalidatedCond;

Why not use NSPR locking types?
Somewhere, perhaps nsGStreamerDecoder.h, I'd like to see some comments describing the overall architecture of the GStreamer decoder: how all the classes work together, what the lifetimes of the various objects are, and what threads are used. As an example see nsOggDecoder.h. Something like the ASCII diagram in that file, mapping the various states of the GStreamer decoder, would be nice as well.

I'll wait till the patch is cleaned up before reviewing further since it's hard to tell in the 15,000 line patch where the actual code changes are vs. whitespace changes.
Attachment #406209 - Flags: review?(chris.double) → review-
There needs to be a way of testing this backend. If the mochitests are run will this only use the Ogg backend for the ogg files rather than the GStreamer backend? If so you'll need to add test files that GStreamer will use. Do the tinderboxes have GStreamer installed?
It would be best if it was possible to run with GStreamer enabled for Ogg and Wave, so we can check that it passes all the existing tests.
As you know, the patch isn't complete yet, and hence won't pass any good testing - but I'm fine with changing it so it will be possible to use the existing tests to test it (at least partially) - to use GStreamer to run the Ogg tests I guess we could just disable the "native" Ogg decoder?

I believe this can currently be done at compile time - I assume that won't be good enough - would it be an idea to be able to turn the native Ogg decoder and GStreamer on and off through the runtime system as well as at compile time?
I think compile time is good enough. If --disable-ogg --disable-wave --enable-gstreamer are given, then the GStreamer backend should be able to play Ogg and Wave files.
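For concreteness, a sketch of the kind of compile-time selection meant (fragment only; the MOZ_* macro names and the surrounding function are assumptions for illustration, not necessarily what the patch does):

  // In something like nsHTMLMediaElement::PickMediaElement(aType):
  nsMediaDecoder* decoder = nsnull;
  #if defined(MOZ_OGG)
    if (aType.EqualsLiteral("application/ogg"))
      decoder = new nsOggDecoder();
  #endif
  #if defined(MOZ_GSTREAMER)
    // With --disable-ogg --disable-wave --enable-gstreamer nothing above claims
    // the type, so every supported type falls through to the GStreamer decoder.
    if (!decoder)
      decoder = nsGStreamerDecoder::makeNewInstance();
  #endif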
(In reply to comment #59)
> I think compile time is good enough. If --disable-ogg --disable-wave
> --enable-gstreamer are given, then the GStreamer backend should be able to play
> Ogg and Wave files.

I'm pretty sure this will work the way things are #if'd in the patch.
It is definitely made with the intention that disabling Ogg at compile time will enable GStreamer :)  So if it doesn't work that way it should be fixed to work that way.

I'll not do anything to make it runtime-selectable then.

Come to think of it, is it fine that GStreamer is enabled by a define, and Ogg and Wave are disabled by a define - or should it be similar for all?

Another question is:

Assuming the first version of this patch is going to be submitted without being functionally complete or perfect - which things are a "must do" before it can be submitted?
I tried the last patch on an N900 device,
<video autoplay="true" src="star_trek_720_400_2100kbps.avi" controls="true"/>

And it does not work very well.
It ends with
dsp_thread: failed waiting for events
Only the first frame is shown on the page, and nothing more happens.
(In reply to comment #61)
> Came to think of, is it fine that GStreamer is enabled by a define, and ogg and
> wave are disabled by a define - or should it be similar for all?

I think that's fine.

> Assuming the first version of this patch is going to be submitted without being
> functional complete or perfect - which things are a "must do" before it can be
> submitted?

That's a good question. Perhaps the best approach is to say that code that has been reviewed can land, even if it doesn't work.
(In reply to comment #62)
> Created an attachment (id=408675) [details]
> Output log for ms-video gstreamer play
> 
> I tried last patch on N900 device, 
> <video autoplay="true" src="star_trek_720_400_2100kbps.avi" controls="true"/>
> 
> And it does not work very well.
> it is end with 
> dsp_thread: failed waiting for events
> Only first frame shown on the page, and nothing more happening

Interesting... I had a look at the log; nothing springs out as an explanation -
are you running with the source file locally or from the web?

Did you try to play the .avi file back with the built-in player on the device?

Is the file that generates this behaviour publicly available?
(In reply to comment #63)
> That's a good question. Perhaps the best approach is to say that code that has
> been reviewed can land, even if it doesn't work.

I'm not sure I understand your meaning with the above - I see two extreme interpretations:

1) The code can be submitted "as is" as long as we have a list of the things that must be fixed in it later (could be ok, as it won't be enabled unless the build options are changed to include --enable-gstreamer) but there could still be stuff that _must_ be fixed before a submit - like if I mess up "public" files....

2) Things caught in the review needs to be fixed, but the review is not concerned with the patch working or not.

As I said, I didn't quite understand the comment, so the right answer could be something completely different...
(In reply to comment #65)
> (In reply to comment #63)
> 2) Things caught in the review needs to be fixed, but the review is not
> concerned with the patch working or not.

The patch needs to get r+, which means all review comments need to be addressed. Given that the GStreamer support will be disabled on landing, it doesn't need to be complete. I'd hope that videos can at least play, though (seeking, etc. not so important).
(In reply to comment #65)
> 2) Things caught in the review needs to be fixed, but the review is not
> concerned with the patch working or not.

This is what I meant.
(In reply to comment #67)
> (In reply to comment #65)
> > 2) Things caught in the review needs to be fixed, but the review is not
> > concerned with the patch working or not.
> 
> This is what I meant.

Thanks - I don't understand it, but I can comply with it :)

I would be interested in discussing this to improve my understanding of the development process in Mozilla next time we meet in person - maybe in December?
There's no policy here, just one-off decision making.

I think generally we want code that's in our tree to have been reviewed and not require more review before it gets enabled for real. Otherwise it's too difficult to track what reviews need to be done, and to actually get people to do the reviews and then address the review comments --- normally, getting your code on trunk is the reward that encourages you to go through the review process :-).

However, apart from that, we want your code on trunk as soon as possible. So it's OK if it's incomplete, as long as we understand what's there and what's missing.
> Interesting... Had a look at the log, nothing springs out as an explanation -
> are you running with the source file locally or from the web?  

The source file and the video.html file are from the web.

> 
> Did you try to play the .avi file back with the build in player on the device?

Yes, it is one of the pre-installed videos, and it works fine with the built-in player.

> 
> Is the file generating this behaviour publicly available?
Not sure; I think you can try the same with some official pre-installed video from an N900 device.
(In reply to comment #69)
> There's no policy here, just one-off decision making.

Which must also reflect some kind of policy, but never mind that now...

> I think generally we want code that's in our tree to have been reviewed and not
> require more review before it gets enabled for real. Otherwise it's too
> difficult to track what reviews need to be done, and to actually get people to
> do the reviews and then address the review comments --- normally, getting your
> code on trunk is the reward that encourages you to go through the review
> process :-).

Yes, I agree with that; if we let anything sub-standard into the trunk, then there is a very real risk that it will stay that way and not be fixed - but what I think is not always the "Mozilla way", that's why I have to ask :)

> However, apart from that, we want your code on trunk as soon as possible. So
> it's OK if it's incomplete, as long as we understand what's there and what's
> missing.

I think what I don't understand is where that hair-fine line is - between what is "incomplete" and what is an "error" - but luckily in this particular case that is not my problem, as that will be up to the reviewer :)
(In reply to comment #70)
Thanks, I'll try to reproduce it and find an explanation; it might be a while though...
Also with the default Nokia_N900.mp4 video, the behavior is not very good.
For the whole video (1:15) it shows ~3-5 frames, ~0.06 FPS.
(In reply to comment #73)
> Also with default Nokia_N900.mp4 video, behavior is not very good.
> For whole video (1:15) it show ~3-5 frames, ~0.06 FPS.

Ahh performance - it is... should we say... an issue :)  

The GStreamer integration in the media framework is mainly a question of feeding it data in one end and removing decoded video frames at the other end, so there is not much room for optimization there. There might be some performance boost from better synchronisation between the decoding and the actual rendering, but my feeling is that the best path for optimisation is to optimise the surroundings.

I have tried creating a small stand-alone program that only has the GStreamer integration, pushing the data directly to an X window and using a file source for the data - this has significantly better performance in target than what is seen when we run it with the overhead of Fennec :(  - The plan is to be feature complete before I spend time optimizing the surroundings of the integration.
I don't see how the "browser overhead" can account for a 400X slowdown. Something must be fundamentally broken.
What is the CPU usage of fennec when it's playing a video on the N900?
(In reply to comment #75)
> I don't see how the "browser overhead" can account for a 400X slowdown.
> Something must be fundamentally broken.

No, one shouldn't think browser overhead could account for that...  and yes,
something seems to be broken :)
(In reply to comment #76)
> What is the CPU usage of fennec when it's playing a video on the N900?

I'm not spending time on optimization at this point, so I won't be able to tell :)
Depends on: 525895
Depends on: 525898
Depends on: 525923
Depends on: 525946
Depends on: 525949
Depends on: 525951
Depends on: 525956
Depends on: 525958
Depends on: 525961
Depends on: 525964
Depends on: 525966
Depends on: 525968
Depends on: 525969
Depends on: 525970
Depends on: 525975
I think we should also request a list of supported MIME types from GStreamer, update Gecko-Content-Viewers, and make it possible to open video supported by GStreamer as a MediaDocument.
If getting the list of supported mime-types from gstreamer is possible then can it be used to properly implement canPlayType? In the patch that's currently set to return 'maybe' for GStreamer iirc.
You can get a list of GStreamer's "caps types" from gstreamer, if you want. It's of dubious utility to do so.

These look like, and in some cases are the same as, mime types. They are NOT in general mime types, and should NOT be used as mime types.

There's no way to get a list of supported mime types from GStreamer. You could hard-code a mapping table from caps types to real mime types, and on canPlayType() respond appropriately if the type is in your table.

Of course, this too is of dubious utility, since mime types on media files typically describe the container format, and "what you can play" is more typically tied to the codecs used inside that container.
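For illustration, the hard-coded table could be as simple as the following sketch (the entries and the helper name are made up; only the shape matters):

  #include <string.h>
  #include <stddef.h>

  struct CapsToMime {
    const char *capsName;   // what GStreamer calls the format
    const char *mimeType;   // what the web / canPlayType() calls it
  };

  static const CapsToMime kCapsToMime[] = {
    { "application/ogg", "application/ogg" },
    { "video/quicktime", "video/mp4" },
    { "video/x-msvideo", "video/avi" },
  };

  // Return true ("maybe") if the requested MIME type maps to caps we know about.
  static bool MaybePlayable(const char *aMimeType)
  {
    for (size_t i = 0; i < sizeof(kCapsToMime) / sizeof(kCapsToMime[0]); ++i) {
      if (strcmp(kCapsToMime[i].mimeType, aMimeType) == 0)
        return true;
    }
    return false;
  }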
RFC 4281 describes how to include codec information in MIME types. We support this for our built-in codecs.
Ah, right, I'd forgotten about that.

You could certainly do something where you query gstreamer for demuxers, then decoders, then have some big messy table for mapping a mixture of that info into mime-types-with-codec information.

There's not currently any way to find out complete detail about what a gst decoder supports - e.g. for an mpeg4 part 2 decoder, it won't tell you what profiles/levels are available. That could probably be added to gstreamer in the future, though. Realistically, nobody's going to provide that level of detail when using canPlayType anyway, so this is not a big loss.
(In reply to comment #50)

This post is a reply to Comment #50 and #51

These are comments on the review comments; the things I'm not commenting on have either been fixed as requested, or have been changed in a way that renders the comment irrelevant.

----

It would be helpful if a filename/line number was included with every review comment.

----

nsHTMLMediaElement.cpp (1243)

// Check if any decoders knows about the mime type

Fix grammar

^^^^
Do you mean know instead of knows?

----
nsGStreamerAudioBin.cpp (70)

+  mAudioBin = GST_BIN_CAST(gst_bin_new("mAudioBin"));
+

This can fail and mAudioBin is later used a few lines down in gst_object_sink.
Detect and handle the failure here. Same with the following
gst_element_factory_make calls.

^^^^

It is true that the lines like:

gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));

are using the members before they are checked - it should really not happen in the normal case, and the above sequence can handle a 0 as input (it will complain to the console though).

So the postponing of the error handling is intentional...

Adding checks before these lines would complicate the error handling without any benefit, and decrease the readability of the function significantly, so as there is no real benefit to doing the check early, I'll leave it as is - unless I'm given a good reason not to.

-----

nsGStreamerAudioBin.cpp (73)

+  // Create a buffer for the audio => Running in separate thread

What does the 'running in separate thread' mean? How does the call to
gst_element_factory_make cause it to run in a different thread?

^^^^

It means that it will run the elements in the bin in its own thread.

If you want two things to run in separate threads in GStreamer, you add a queue
(buffer) between them.
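For reference, the thread boundary comes purely from the queue element - a sketch with generic element names (not the patch's):

  // Everything downstream of the "queue" element gets its own streaming thread;
  // without the queue, the converter and the sink would share one thread.
  GstElement *bin   = gst_bin_new("audio-bin");
  GstElement *conv  = gst_element_factory_make("audioconvert", "conv");
  GstElement *queue = gst_element_factory_make("queue", "audio-buffer");
  GstElement *sink  = gst_element_factory_make("autoaudiosink", "sink");

  gst_bin_add_many(GST_BIN(bin), conv, queue, sink, NULL);
  gst_element_link_many(conv, queue, sink, NULL);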

-----

nsGStreamerAudioBin.cpp (78)

+  // Find the best possible sink
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("gconfaudiosink", "mAudioSink");
+  }
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("pulsesink", "mAudioSink");
+  }
...etc...

Can we make these sinks, and the order they're tried, a pref? Maybe a comma
separated list of sink names in the order they should be tried.

^^^^

Sure we can, do you wish this to be done before the initial submit of the code?

-----

nsGStreamerAudioBin.cpp (115)

+  // Take ownership of the elements
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBuffer)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioConvert)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioSink)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioVolume)));

Can we take ownership of these in the same place that we create them? 

^^^^
Yes we can - but do you think it will improve readability (aka how easy the code is to understand for a 3rd party)?

Consider:

// Create elements
A a = new A;
B b = new B;
C c = new C;

// Do something to elements
foo(a);
foo(b);
foo(c);

// Return something using all elements
return a+b+c;

Against:

// Create and do something to A
A a = new A;
foo(a);

// Create and do something to B
B b = new B;
foo(b);

// Create and do something to C
C c = new C;
foo(c);

// Return something using all elements
return a+b+c;

-----
nsGStreamerAudioBin.cpp (202)

+void nsGStreamerAudioBin::Destroy()
+{
+  if (mAudioBin) {
+    gst_object_unref(GST_OBJECT(mAudioBin));
+  }
+  if (mAudioSink) {
+    gst_object_unref(GST_OBJECT(mAudioSink));
+  }
+  if (mAudioConvert) {
+    gst_object_unref(GST_OBJECT(mAudioConvert));
+  }
+  if (mAudioBuffer) {
+    gst_object_unref(GST_OBJECT(mAudioBuffer));
+  }
+  if (mAudioVolume) {
+    gst_object_unref(GST_OBJECT(mAudioVolume));
+  }

gst_object_unref is safe to be called with a null object you don't need the
tests.

^^^^

Yes, it's true that gst_object_unref can handle a 0 pointer, like the gst_object_ref in the create function can (with similar logs btw) - but the difference is that the destroy function will be called with these being 0 under normal conditions - and that will clutter the log - whereas in the create function they won't be 0 under normal conditions - that's the reason they are protected by "if"'s in the destroy function and not in the create function.

----

nsGStreamerAudioBin.cpp (268)

+    gst_element_set_state(mAudioSink, GST_STATE_PAUSED);

Handle or log failure of the state fails to be set.

^^^^

Is there a special reason that you would like a test on the return value for this particular function? I don't see it as being more fatal for the decoder than most of the other function calls that it makes.

-----

nsGStreamerAudioPropertiesInterface.h (40)

+#ifndef nsGStreamerAudioPropertiesInterface_h__
+#define nsGStreamerAudioPropertiesInterface_h__
+
+////////////////////////////////////////////////////////////////////////////////
+// Interface class to modify audio properties
+

Describe in more detail what this class is used for. Who should inherit from
it? Who calls it? Does anyone maintain a reference to it - if so, who controls
ownership of it. Will an instance of this class ever have delete called on it
from an nsGStreamerAudioPropertiesInterface pointer? If so, does it need a
virtual destructor?

^^^^

I have added a description about lifetime etc. and the purpose of it - but with regards to who should inherit from it... is what you actually want a design document, where you can see all the classes and their relationships? Such a document could either be a stand-alone file with pretty graphics or be embedded in one of the header files with square ASCII graphics.

I mean, the only one that should inherit from it is the class that does - this is not an interface that has any relevance outside of the GStreamer media decoder integration - none of the code has... I don't think it will add any value, except create a maintenance nightmare, if we start to duplicate, out of context, what is obvious from the code in the header files.

-----

nsGStreamerConfig.h (47)

+// Define the max size of video elements in pixels
+#define MAX_VIDEO_WIDTH 1024
+#define MAX_VIDEO_HEIGHT 1024

Where did these numbers come from? Does this mean 1080P HD video won't play?

^^^^
These are magic numbers; their sole purpose is an attempt to keep it from being possible to kill the browser by embedding an insanely sized video on a page (1001 ways to kill Fennec). I have no preference for these numbers over others - I would love to have a dynamic solution that estimated the max size based on available memory, but that is probably for a more advanced implementation.

I can see your point with the HD video you mentioned - I have increased the values to 2048 in the patch, which is pretty much for a handheld device anyway - maybe this is a candidate for about:config? Or at least a compile-time switch?

------

nsGStreamerConfig.h (50)

+// Define how much data we should try to read at a time
+#define GST_BUFFER_READ_CHUNK_SIZE 1024*16

What's the origin of this size? Maybe comment about why you chose that value.

^^^^

No origins, just need a size - you might remember that I asked you about a good
value for this one during our review session?

But anyway - it's one of these values that can be manipulated during an
optimizing session, to see what gives the best performance.

I have added a comment to the value that changing it could change performance
in both directions.

-----

nsGStreamerDecodeBin.cpp (47)

+nsGStreamerDecodeBin::nsGStreamerDecodeBin()
+  : mDecodeBin(nsnull)
+{
+}

Why is the other member variable, mPadDest, not set to an initial value?

^^^^
The reason mDecodeBin is initialised to null is that an uninitialised pointer could create problems when the destroy call is made, which would be normal if anything in the creation of the pipeline fails - mPadDest doesn't have this sensitivity, so there is no need to initialise it.

However if you insist I can initialise it to 0.

-----

unknown (??)

+    // Constructor
+    // Destructor

Comments like this aren't really needed. It's obvious when it's a constructor
or a destructor.

^^^^
Yes, it is obvious.  But I'm not judging for obviousness when I write comments
on my prototypes, and I prefer to have all my functions documented.

-----
(In reply to comment #84)
> Do you mean know instead of knows?

Yes. Or you can reword it however you want so it's readable.

> So the postponing of the error handling is intentional...

If it's safe, that's fine.

> 
> Sure we can, do you wish this to be done before the initial submit of the code?

I'd prefer that, but if not, raise a bug for this to be done and that'll be fine.

> Yes we can - but do you think it will improve readability (aka how easy the
> code is to understand for a 3'rd party)?

I personally find it more readable but it's your call if you're going to be the one maintaining it. If you strongly prefer your way, go for it.

> Yes, it's true that gst_object_unref can handle a 0 pointer, like the
> gst_object_ref in the create function can (with simelar logs btw) - but the
> difference is that the destroy function will be called with these being 0 under
> normal conditions -and that will clutter the log - whereas they in the create
> function won't be 0 under normal conditions - thats the reason they are
> protected by "if"'s in the destroy function and not in the create function.

I have no problem with this reasoning. That's fine.

> Is there a special reason that you would like a test on the return value for
> this particular function?   I don't see it as being more fatal for the decoder
> than most of the other function calls that it make.

A couple of the third-party libraries in the Ogg code don't check return values of supposedly safe calls in other third-party libraries that they use, and this has been a constant source of problems and vulnerabilities. I'd like to err on the side of caution and check return values of third-party libraries wherever they document that they can fail.

> I have added a description about lifetime etc.  and the purpose of it - but
> with regards to who should inherit from it... is what you actually want a
> design document? - where you can see all the classes and their relationship? 
> Such a document could either be in a stand-alone file with pretty graphics or
> embedded in one of the header files with square ascii graphics.

I just want better comments. Feel free to include what you want and I'll let you know if it's enough.

> These are magic numbers, there sole purpose is an attempt not to be able to
> kill the browser by embedding an insane size of video on a page (1001 ways to
> kill Fennec) - I have no preference for these numbers over others - would love
> to have a dynamic solution that did estimates of max size based on available
> memory, but that is probably for an advanced implementation.

The Ogg decoder has similar magic numbers for detecting 'too large' videos. Maybe move these into nsMediaDecoder.h and use them so the different backends are at least consistent.

> - maybe this is a candiate for about:config ? or at least a compile time
> switch?

Or this. Up to you.

> I have added a comment to the value that changing it could change performance
> in both directions.

Great, that's fine.

> 
> However if you insist I can initialise it to 0.

Thanks, please do so.
(In reply to comment #85)
> (In reply to comment #84)
> > Is there a special reason that you would like a test on the return value for
> > this particular function?   I don't see it as being more fatal for the decoder
> > than most of the other function calls that it make.
> 
> A couple of the third party libraries in the Ogg code don't check return values
> of supposedly safe calls in other third party libraries that they use and this
> has been a constant source of problems and vulnerabilities. I'd like to err on
> the side of caution of checking return values of third party libraries where
> they document that they can fail.

So, what you are asking is not specific to this function: for all functions that can return an error value you want at least a log to happen if the call fails (alternatively, the error needs to be handled)?

Note that the current implementation is not ignoring failure; rather, it assumes that a failure in one place will lead to a failure in another - where it's either handled, or ignored if there is no sensible way to react to the error.

> > I have added a description about lifetime etc.  and the purpose of it - but
> > with regards to who should inherit from it... is what you actually want a
> > design document? - where you can see all the classes and their relationship? 
> > Such a document could either be in a stand-alone file with pretty graphics or
> > embedded in one of the header files with square ascii graphics.
> 
> I just want better comments. Feel free to include what you want and I'll let
> you know if it's enough.
> 

OK, so you want an unspecified level of comments. I'll add some more generic comments to help get a bird's-eye view of the code; it might take some iterations before we agree on a level, but let's see how it goes.

> > These are magic numbers, there sole purpose is an attempt not to be able to
> > kill the browser by embedding an insane size of video on a page (1001 ways to
> > kill Fennec) - I have no preference for these numbers over others - would love
> > to have a dynamic solution that did estimates of max size based on available
> > memory, but that is probably for an advanced implementation.
> 
> The Ogg decode has similar magic numbers for detecting 'too large' videos.
> Maybe move these into nsMediaDecoder.h and use them so the different backends
> are at least consistent.

That solution is fine with me - and yes, there is no reason the different decoders should use different values.
Reply to comment #52

-----

+  // It's done in this way to allow GStreamer data types in the "real" class
+  static nsGStreamerDecoder *makeNewInstance();

I'm not sure what you mean by "to allow GStreamer data types in the 'real'
class". Can you expand on this?

^^^^

The new comment now reads:

  // Function to create a new instance of the GStreamer decoder
  //
  // NOTE: If we did it by letting the user instantiate an element of 
  // this class we would have to drag a lot of GStreamer specific
  // header files in together with this header file.

-----

nsGStreamerElementWrapper.h (multiple)

Add comments to the methods to explain what threads they can safely be called
on. Add assertions to the methods that are main thread only to ensure they are
called on the main thread only. eg:

NS_ASSERTION(NS_IsMainThread(), "Should be called on main thread");


+    nsHTMLMediaElement* mElement;

Add a comment about what threads can use mElement (ie. main thread only?). This
is not a counted pointer - how is ownership of mElement maintained? For an
example of the type of comment needed, see the 'mElement' comment in
nsMediaDecoder.h which describes the thread usage and why it's not reference
counted.

^^^^

I have added some new comments that hopefully make your concerns disappear.

-----

Reply to comment #53

There is some repetition of patterns here that have been explained previously; these won't be repeated.

-----

nsGStreamerLib.cpp (209)

+void nsGStreamerLib::BringGStreamerDown()
+{
+  switch (mGStreamerLibStatus)

Reading mGStreamerLibStatus without taking the mGStreamerMutex lock. The switch
also reads mNumberOfInstances.

+  // The gstreamer lib is now down
+  mGStreamerLibStatus = GSTREAM_GDOWN_LIB_DOWN;

This should be protected by the lock?


^^^^
No, as it is the caller's responsibility to take the lock before calling the function, as you noted yourself later - but that wasn't clear from the comments - a comment has been added to the header file to clarify this, as you suggest.

-----

nsGStreamerLib.h (87)

+
+    // The last thing we should do, befor termination of the program
+    // is to terminate the GStreamer library
+    class Terminator
+    {
+      public:
+        ~Terminator();
+    };
+
+    static Terminator terminator;

Why not do this in the destructor of nsGStreamerLib instead of having the
helper object?

^^^^

Hehe ;)  I'm very happy you asked that question.  The reason I need a static object to take the GStreamer library down is that it can't be restarted after being shut down.  It is actually documented in the GStreamer documentation - though not as clearly as it could be, as one of the kind souls at #gstreamer pointed out to me.

For reference, the current GStreamer docs say: "After this call GStreamer (including this method) should not be used anymore." - that also includes the init function (and yes, I learned that the hard way)!

So taking the GStreamer lib down needs to be the last thing that is done in the browser - at least it needs to be done at a time where we are sure we won't need the GStreamer lib any more. I can think of two cases of this: when global objects are destroyed, and after something so fatal has happened that we won't even try to run the GStreamer lib any more - which are the two cases where it is called.

(I have now stated that the GStreamer lib can't be restarted again in the comment to nsGStreamerLib::Terminator.)
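In other words, the shape is roughly the following - a sketch only, calling gst_deinit() directly here just to illustrate the point (the real class routes the shutdown through the lib wrapper):

  // A static helper object does the job because its destructor runs exactly
  // once, at static-destruction time - the last point where it is safe to
  // tear GStreamer down for good.
  class Terminator
  {
    public:
      ~Terminator()
      {
        gst_deinit();  // after this, GStreamer (including gst_init) must never be used again
      }
  };

  static Terminator terminator;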

-----

Reply to comment #54

In this comment we have the issue of uninitialised members again - it's not a functional problem as they will always be set before being used - but I fully understand the reasoning as to why there is a wish to initialise everything (maintenance over performance - I won't go into that discussion).

But if we are serious about initialising everything - what is the reason then that we don't do it by default through the memory allocation system? (This would even improve security.)

I'll write some kind of documentation on all the classes, their relationships, thread safety, etc., so I'll not comment further on this below.

-----

+++ b/content/media/gstreamer/nsGStreamerPipeline.cpp
+
+nsGStreamerPipeline::nsGStreamerPipeline()
+  : mStreamPipeline(nsnull),
+    mAudioPadSource(nsnull),
+    mAudioPadSink(nsnull),
+    mVideoPadSource(nsnull),
+    mVideoPadSink(nsnull)
+{
+}

Some member variables are not being set (eg. mStreamerDecoder)

^^^^

Are there others? - I can only spot the one

-----

nsGStreamerPipelineState.h (63)

+    // Destructor
+    ~nsGStreamerPipelineState();

Does this need to be virtual? Is it ever deleted using a pointer to one of its
base classes?

^^^^

No and no.

-----

nsGStreamerPipelineState.h (77)

+    // Tell state machine if it should play or stop(pause)
+    // bPlay = PR_TRUE => playback starts when conditions are right
+    // bPlay = PR_FALSE => playback is paused and will not start
+    virtual void PlayWhenPossible(PRBool aShouldPlay);

bPlay should be aShouldPlay?

^^^^

Indeed it should - that should teach me not to have the "exclude comments" option on by default when I do search-and-replace.

-----

nsGStreamerVideoBin.cpp (342)

+    aMe->mAspect = (pixel_aspect[0] == 0) ?
+      ((gdouble)aMe->mWidth/(gdouble)aMe->mHeight) :
+      ((gdouble)pixel_aspect[0] /
+       (gdouble)pixel_aspect[1] *
+       (gdouble)aMe->mWidth/(gdouble)aMe->mHeight);
...
+    aMe->mFramerate = (gdouble)pixel_aspect[0]/(gdouble)pixel_aspect[1];

Can pixel_aspect[1] ever be zero? If so, guard against division by zero.

^^^^
That's plain evil! - nice spot btw.  
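
For reference, a minimal sketch of such a guard, reusing the variable names quoted above; falling back to square pixels when no usable pixel aspect ratio is reported is an assumption, not necessarily what the final patch will do:

// Guard against a zero numerator or denominator in the reported pixel
// aspect ratio; in that case fall back to square pixels.
if (pixel_aspect[0] == 0 || pixel_aspect[1] == 0) {
  aMe->mAspect = (gdouble)aMe->mWidth / (gdouble)aMe->mHeight;
} else {
  aMe->mAspect = (gdouble)pixel_aspect[0] / (gdouble)pixel_aspect[1] *
                 (gdouble)aMe->mWidth / (gdouble)aMe->mHeight;
}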

-----

nsGStreamerVideoBin.hpp (124)

+    // Mutex to control the access to mIsScreenPreviouslyInvalidated
+    GMutex *mScreenInvalidMutex;
+
+    // Condition that is set when screen has been invalidated
+    GCond *mScreenInvalidatedCond;

Why not use NSPR locking types?

^^^^

No good reason here - except that GStreamer depends on GLib and all the examples in the GStreamer docs use GLib - I thought about changing it but never got around to it - will do :)
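
For reference, a minimal sketch of the NSPR equivalents (the member and flag names mirror the patch; the wait loop itself is illustrative):

#include "prlock.h"
#include "prcvar.h"

// Setup - unlike the separate GMutex/GCond pair, a PRCondVar is bound to its
// lock when it is created.
PRLock    *mScreenInvalidMutex    = PR_NewLock();
PRCondVar *mScreenInvalidatedCond = PR_NewCondVar(mScreenInvalidMutex);

// Waiting for the screen to be invalidated:
PR_Lock(mScreenInvalidMutex);
while (!mIsScreenPreviouslyInvalidated) {
  PR_WaitCondVar(mScreenInvalidatedCond, PR_INTERVAL_NO_TIMEOUT);
}
PR_Unlock(mScreenInvalidMutex);

// The invalidating side sets the flag and calls PR_NotifyCondVar() with the
// same lock held.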
tracking-fennec: 1.0+ → 1.0-
No longer depends on: 525895
No longer depends on: 525956
No longer depends on: 525958
No longer depends on: 525961
No longer depends on: 525964
No longer depends on: 525966
No longer depends on: 525968
No longer depends on: 525969
No longer depends on: 525975
In reply to comment #84.

It is not safe to call gst_object_unref(NULL); it would trigger a GLib CRITICAL warning, and the safeguard can even be compiled out.

Mike K., would it be possible for you to stick a 
 GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(mStreamPipeline, 0, "firefox")
somewhere where the whole pipeline is set up (e.g. gst_fennecvideosink_change_state, or ideally something similar for the actual pipeline object) and then run 
 GST_DEBUG_DUMP_DOT_DIR=$PWD ./firefox 

You will end up with Graphviz dot files for the actual GStreamer pipeline. Convert them to png/svg:
 dot -Tpng xxx.dot -oxxx.png
 dot -Tsvg xxx.dot -oxxx.svg
and attach a good one here. That would help to get the big picture of how you actually do things.
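
For reference, a minimal sketch of such a hook, assuming GStreamer 0.10 and the mStreamPipeline member quoted earlier; the helper name is hypothetical:

#include <gst/gst.h>

// Hypothetical helper: dump the negotiated pipeline graph. With
// GST_DEBUG_DUMP_DOT_DIR set in the environment this writes a timestamped
// firefox-*.dot file; without it the macro is a no-op.
void nsGStreamerPipeline::DumpGraph()
{
  if (mStreamPipeline) {
    GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(GST_BIN(mStreamPipeline),
                                      GST_DEBUG_GRAPH_SHOW_ALL,
                                      "firefox");
  }
}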
What exactly is the status on this bug at this time? Reading through these comments doesn't exactly give a great picture of what the progress is on this feature.
It's being worked on, addressing the issues pointed out in the review and from discussions with roc.  The major part of the work going on currently is related to coding style rather than functionality.  

The most important remaining issue as I see it is the performance on the N900 - this seems to have degraded over time - some months ago I ported the patch to a standalone program, which had significantly better performance (as in a useful frame rate).  It remains to be investigated exactly why it is so slow when running within Fennec.
(In reply to comment #84)
> (In reply to comment #50)

nsGStreamerConfig.h (47)
+// Define the max size of video elements in pixels
+#define MAX_VIDEO_WIDTH 1024
+#define MAX_VIDEO_HEIGHT 1024

Is that code only for handheld devices?
I see monitors bigger than 2048 x 2048.
Please see 

http://www.nextag.com/Samsung-Syncmaster-305T-30-536319142/prices-html?nxtg=a9760a1c0514-A0AF65F162A79CAF
  2560x1600 resolution,

http://www.wired.com/gadgetlab/2007/11/toshiba-shows-c/
  3,840 x 2,400 pixel

You may want to make this a preference.
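
A minimal sketch of what a preference-driven limit could look like; the pref names ("media.gstreamer.max-width" / "media.gstreamer.max-height") and the helper are purely hypothetical, not something the patch defines:

#include "nsCOMPtr.h"
#include "nsIPrefBranch.h"
#include "nsServiceManagerUtils.h"

// Hypothetical helper: read the limit from a pref, falling back to the
// compile-time default when the pref is missing.
static PRInt32 GetMaxVideoDimension(const char *aPrefName, PRInt32 aDefault)
{
  nsCOMPtr<nsIPrefBranch> prefs =
    do_GetService("@mozilla.org/preferences-service;1");
  PRInt32 value = aDefault;
  if (prefs) {
    prefs->GetIntPref(aPrefName, &value);
  }
  return value;
}

// e.g. GetMaxVideoDimension("media.gstreamer.max-width", MAX_VIDEO_WIDTH)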
It will work in the same way as the OGG decoder
(In reply to comment #90)
> The most important remaining issue as I see it is the performance on the N900 -
> this seem to have degraded over time - some months ago I ported the patch to a
> stand alone program, this had significantly better performance (as in a useful
> frame rate).  It remains to be investigated exactly why it is so slow when
> running within Fennec.

Does this problem also exist when GStreamer is the engine used for HTML5 video/audio in regular desktop Firefox? Or is this issue specific to Fennec?
(In reply to comment #93)
> (In reply to comment #90)
> > The most important remaining issue as I see it is the performance on the N900 -
> > this seem to have degraded over time - some months ago I ported the patch to a
> > stand alone program, this had significantly better performance (as in a useful
> > frame rate).  It remains to be investigated exactly why it is so slow when
> > running within Fennec.
> 
> Does this problem also exist when GStreamer is the engine used for HTML5
> video/audio in regular desktop Firefox? Or is this issue specific to Fennec?

The framerate is fine when running a Fennec build on my laptop (Intel Core 2 Duo 2.13 GHz) - So on my PC, I don't see any reason why a regular Firefox build should have any problems.  But I'm sure if you try on a slow enough computer you will eventually get an unacceptable frame rate...  

However I don't think that is an answer to your question - but I don't have any data to give a better one.
(In reply to comment #94)
> (In reply to comment #93)
> > (In reply to comment #90)
> > > The most important remaining issue as I see it is the performance on the N900 -
> > > this seem to have degraded over time - some months ago I ported the patch to a
> > > stand alone program, this had significantly better performance (as in a useful
> > > frame rate).  It remains to be investigated exactly why it is so slow when
> > > running within Fennec.
> > 
> > Does this problem also exist when GStreamer is the engine used for HTML5
> > video/audio in regular desktop Firefox? Or is this issue specific to Fennec?
> 
> The framerate is fine when running a Fennec build on my laptop (Intel Core 2
> Duo 2.13 GHz) - So on my PC, I don't see any reason why a regular Firefox build
> should have any problems.  But I'm sure if you try on a slow enough computer
> you will eventually get an unacceptable frame rate...  
> 
> However I don't think that is an answer to your question - but I don't have any
> data to give a better one.

Then it sounds like the fault lies in Fennec. How much resources does Fennec consume on the N900?
(In reply to comment #95)
> (In reply to comment #94)
> > (In reply to comment #93)
> > > (In reply to comment #90)
> > > > The most important remaining issue as I see it is the performance on the N900 -
> > > > this seem to have degraded over time - some months ago I ported the patch to a
> > > > stand alone program, this had significantly better performance (as in a useful
> > > > frame rate).  It remains to be investigated exactly why it is so slow when
> > > > running within Fennec.
> > > 
> > > Does this problem also exist when GStreamer is the engine used for HTML5
> > > video/audio in regular desktop Firefox? Or is this issue specific to Fennec?
> > 
> > The framerate is fine when running a Fennec build on my laptop (Intel Core 2
> > Duo 2.13 GHz) - So on my PC, I don't see any reason why a regular Firefox build
> > should have any problems.  But I'm sure if you try on a slow enough computer
> > you will eventually get an unacceptable frame rate...  
> > 
> > However I don't think that is an answer to your question - but I don't have any
> > data to give a better one.
> 
> Then it sounds like the fault lies in Fennec. How much resources does Fennec
> consume on the N900?

hmm... I haven't been too concerned with performance so far - but just did a test run with "top" running while playing a small video on the N900, the outcome was (top 3 on CPU usage):

fennec/fennec: ~16% Mem ~50% CPU
/usr/bin/Xorg -logfile: /tmp/X ~13% Mem ~24% CPU
/usr/bin/pulseaudio --system: 2% Mem ~17% CPU

May I ask how you come to the conclusion that the fault is in Fennec?
(In reply to comment #96)
> (In reply to comment #95)
> > (In reply to comment #94)
> > > (In reply to comment #93)
> > > > (In reply to comment #90)
> > > > > The most important remaining issue as I see it is the performance on the N900 -
> > > > > this seem to have degraded over time - some months ago I ported the patch to a
> > > > > stand alone program, this had significantly better performance (as in a useful
> > > > > frame rate).  It remains to be investigated exactly why it is so slow when
> > > > > running within Fennec.
> > > > 
> > > > Does this problem also exist when GStreamer is the engine used for HTML5
> > > > video/audio in regular desktop Firefox? Or is this issue specific to Fennec?
> > > 
> > > The framerate is fine when running a Fennec build on my laptop (Intel Core 2
> > > Duo 2.13 GHz) - So on my PC, I don't see any reason why a regular Firefox build
> > > should have any problems.  But I'm sure if you try on a slow enough computer
> > > you will eventually get an unacceptable frame rate...  
> > > 
> > > However I don't think that is an answer to your question - but I don't have any
> > > data to give a better one.
> > 
> > Then it sounds like the fault lies in Fennec. How much resources does Fennec
> > consume on the N900?
> 
> hmm... I haven't been too concerned with performance so far - but just did a
> test run with "top" running while playing a small video on the N900, the
> outcome was (top 3 on CPU usage):
> 
> fennec/fennec: ~16% Mem ~50% CPU
> /usr/bin/Xorg -logfile: /tmp/X ~13% Mem ~24% CPU
> /usr/bin/pulseaudio --system: 2% Mem ~17% CPU
> 
> May I ask how you come to the conclusion that the fault is in Fennec?

Well, you said that Fennec ran fine on a faster machine, and the N900 is a lot slower. And also you mentioned that the performance was a lot better when you used a standalone program. GStreamer wouldn't change between using Fennec and the standalone application, so therefore, the bottleneck has to be in Fennec. 

Of course, that is rather flimsy if GStreamer itself had really bad performance. How much resources does GStreamer consume on the N900?
(In reply to comment #97)
> (In reply to comment #96)
> > (In reply to comment #95)
<snip>
> > 
> > hmm... I haven't been too concerned with performance so far - but just did a
> > test run with "top" running while playing a small video on the N900, the
> > outcome was (top 3 on CPU usage):
> > 
> > fennec/fennec: ~16% Mem ~50% CPU
> > /usr/bin/Xorg -logfile: /tmp/X ~13% Mem ~24% CPU
> > /usr/bin/pulseaudio --system: 2% Mem ~17% CPU
> > 
> > May I ask how you come to the conclusion that the fault is in Fennec?
> 
> Well, you said that Fennec ran fine on a faster machine, and the N900 is a lot
> slower. And also you mentioned that the performance was a lot better when you
> used a standalone program. GStreamer wouldn't change between using Fennec and
> the standalone application, so therefore, the bottleneck has to be in Fennec. 
> 
> Of course, that is rather flimsy if GStreamer itself had really bad
> performance. How much resources does GStreamer consume on the N900?

The corresponding numbers for the stand alone program (gstreamer_info) are:

./gstreamer_info: ~7% Mem ~26% CPU
/usr/bin/pulseaudio --system ~2% Mem ~19% CPU
/usr/bin/Xorg -logfile /tmp/X: 6% Mem ~7% CPU

So significantly less CPU usage for the standalone program, even though it can produce a higher framerate (note: it's an older version of the GStreamer integration that is used, but I wouldn't expect the changes made since to influence the numbers in any significant way)

---

When you said Fennec was at fault, I was not sure if you meant vs. Firefox; you didn't, so I agree with you.
Mike, could it be that you use xvimagesink in the standalone and ffmpegcolorspace + ximagesink in the fennec version?

As I said earlier having a pipeline layout (dot->png/svg) would be helpful.
I was not able to generate a .dot file on the N900, so the attached is generated with a PC build of Fennec.  Note that when playing back on the N900 the GStreamer integration performs better than the "native" Ogg decoder in Fennec (in the sense that with my test file the native decoder has gaps in the audio, whereas when using GStreamer the audio is continuous without gaps - both decoders play the video part more like a series of still images than a video)
(In reply to comment #99)
> Mike, could it be that you use xvimagesink in the standalone and
> ffmpegcolorspace + ximagesink in the fennec version?
> 
> As I said earlier having a pipeline layout (dot->png/svg) would be helpful.

The standalone program uses a GStreamer "filesrc" to provide data, and a "ffmpegcolorspace", "xvimagesink" pair for playback - some time ago I tried this same setup within Fennec, it gave better performance (more FPS) but still significantly worse than what I see in the standalone app.  

The pipeline layout is attached in the previous post.
(In reply to comment #101)
> (In reply to comment #99)
> > Mike, could it be that you use xvimagesink in the standalone and
> > ffmpegcolorspace + ximagesink in the fennec version?
> > 
> > As I said earlier having a pipeline layout (dot->png/svg) would be helpful.
> 
> The standalone program uses a GStreamer "filesrc" to provide data, and a
> "ffmpegcolorspace", "xvimagesink" pair for playback - some time ago I tried
> this same setup within Fennec, it gave better performance (more FPS) but still
> significantly worse than what I see in the standalone app.

If you use xvimagesink, you won't need the ffmpegcolorspace. At least on the N900 I would strongly advise to not use it unless needed. What formats can FennecVideoSink consume? Does it do software composition?

On the audioside:
Remove volume and use volume property on pulseaudio (that is on N900 I would suggest to configure pulsesink as the audiosink and not use gconfaudiosink - maybe a configure option)

remove audioconvert, pulse can take all raw formats

you might not even need the queue in mAudioBin.

Otherwise this is fine. One small question though: do you need the app-src - can't you use gst_element_make_from_uri() and let GStreamer fetch the data directly?


> 
> The pipeline layout is attached in the previous post.
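
Putting the audio-side suggestions above together, a minimal sketch of the simplified branch, assuming GStreamer 0.10 and that pulsesink is available on the target; element and variable names are illustrative:

// Audio branch without audioconvert and without a separate volume element -
// pulsesink accepts raw formats directly and exposes its own "volume" property.
GstElement *queue = gst_element_factory_make("queue", "audio-queue");
GstElement *sink  = gst_element_factory_make("pulsesink", "audio-sink");

gst_bin_add_many(GST_BIN(mAudioBin), queue, sink, NULL);
gst_element_link(queue, sink);

// Volume control then goes through the sink instead of a volume element.
g_object_set(sink, "volume", 1.0, NULL);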
I am somewhat confused as to whether this patch is only meant for Fennec, or if it is also supposed to land in mainstream (desktop) Firefox. If so, is it targeted for a specific release of Firefox? I would greatly appreciate the ability to not have to resort to Ogg video in combination with HTML5 video (MPEG-4/H.264 comes to mind).
I agree with Sjoerd Tieleman. If the N900 mobile device is supported and Nokia
gives money or support in return, that's nice.
Nevertheless I see the main purpose of this patch as being for mainstream (desktop)
Firefox.
While many people want to make you believe Ogg video quality is good, the fact is
it's nowhere near a good H.264 encoding (x264).

I am looking forward to GStreamer support and hope it's going to reduce Flash
dependency. =)
My exclusive focus for this GStreamer integration has been to make it work with Fennec.  Not enabling it in Firefox on platforms to which GStreamer has been ported is, as far as I know, a political decision.  

My personal view is that we should enable it in Firefox too, to give users more choice (I don't want to see Firefox support fewer formats than any other browser; I'd rather see that if a user has some content, he knows he can rely on Firefox/Fennec to play it back), but I have heard good arguments for not doing so.

Maybe someone can repeat the arguments against enabling GStreamer in the mainstream Firefox?
On the contrary, I think using GStreamer on Linux, even with some limitations like only allowing it to read Ogg files, would be beneficial because it would solve a lot of headaches with sound configuration (ALSA vs OSS vs PulseAudio vs..., correct selection of the proper sound device when there are several, etc.).
(In reply to comment #105)
> My exclusive focus for this GStreamer integration has been to make it work with
> Fennec.  

Oh really? Why doesn't this surprise me? When will someone from Mozilla confess that nothing has really happened there for 18 months that isn't focused on some mobile browser that can only be run on very few phones? I live in Australia and largely engage with Mozilla merely via Planet Mozilla; however, even with that limited exposure it has been drop-dead obvious how much focus has gone onto Fennec. 

It's bad enough that Mozilla doesn't innovate, rather it lets extension developers innovate for it, but now it's hardly even working on Firefox. 

I realise that supposedly anything coded for Fennec can be compiled into Firefox but ...

>Not enabling it in Firefox on platforms where GStreamer has been
> ported to, is as far as I know a political decision.  

Typical! I argued the case for DirectShow support for H.264 on Windows given that Microsoft delivers that codec for free via Win 7. This was before the current tacky video support was released. Nobody had enough fortitude to think beyond the same old dogma for not making such a move. Then Google went and delivered H.264 video codecs as well! Mozilla, I love Firefox, but you're making a lot of decisions that are compromising its future.
 
> My personal view is that we should enable it in Firefox too to give the users
> more choices (I don't want to see Firefox support less formats than any other
> browser, I rather see that if a user has some content, then he knows he can
> rely on Firefox/Fennec to play it back), but I have heard good arguments for
> not doing so.

Arguments against giving Firefox multi-codec support are one thing. However, when the Firefox marketing spin then cranks up the hyperbole about 3.5 "enabling video for the web" I feel a rush of semi-digested food forcing its way up my esophagus.

Oops, wait, it is usually about now when people reading my comments start to think that it's a good idea to tell me not to include non-technical commentary in bugzilla. I better shut up. 
 
> Maybe someone can repeat the arguments against enabling GStreamer in the
> mainstream Firefox?

Careful Mike, that sounds dangerously like you are encouraging discussion in a bug fixing posting. Arrgh, watch out for the bugzilla thought police email you'll get any second :)
(In reply to comment #107)
> (In reply to comment #105)
<SNIP>
> > Maybe someone can repeat the arguments against enabling GStreamer in the
> > mainstream Firefox?
> 
> Careful Mike, that sounds dangerously like you are encouraging discussion in a
> bug fixing posting. Arrgh, watch out for the bugzilla thought police email
> you'll get any second :)

*LOL* Yes, I was considering whether the discussion should be moved to the newsgroup; we really need a unified system... Arggghhh... wrong place to state such things - "all is well with the system we have", sorry.  Anyone want to start a discussion in the newsgroups?

Meanwhile I will try to figure out why my performance measurements on the integration fluctuate by 40-50%, which makes it very hard to establish a baseline to see how much the suggestions by Stefan Kost are improving the FPS....
(In reply to comment #107)
> (In reply to comment #105)
> > My exclusive focus for this GStreamer integration has been to make it work with
> > Fennec.  
> 
> Oh really? Why doesn't this surprise? When will someone from Mozilla confess
> that nothing has really happened there for 18 months that isn't focused on some
> mobiles browser that can only be run on very few phones? I live in Australia
> and largely engage with Mozilla merely via Planet Mozilla however even with
> that limited exposure it has been drop-dead obvious how much focus has gone
> onto Fennec. 
> 
> It's bad enough that Mozilla doesn't innovate, rather it lets extension
> developers innovate for it, but now it's hardly even working on Firefox. 
> 
> I realise that supposedly anything coded for Fennec can be compiled into
> Firefox but ...
> 

I think the main reason for the hyperfocus on Fennec is that Gecko is a huge memory hog, and it will take a lot of work to whip Gecko into shape so it is decent to use on mobile devices. The side effect of this is that desktop Firefox memory and CPU usage is likely to go way down.

> >Not enabling it in Firefox on platforms where GStreamer has been
> > ported to, is as far as I know a political decision.  
> 
> Typical! I argued the case for DirectShow support for H.264 on Windows given
> that Microsoft delivers that codec for free via Win 7. This was before the
> current tacky video support was released. Nobody had enough fortitude to think
> beyond the same old dogma for not making such a move. Then Google went and
> delivered H.264 video codecs as well! Mozilla I love Firefox but you're making
> a lot of decisions that are compromising it's future.
> 

I personally don't think supporting DirectShow in Windows Firefox directly is a good idea, because many codecs for Windows DirectShow aren't designed very well to handle streaming, or aren't designed well overall. In any case, GStreamer can bridge Windows DirectShow into itself, so if GStreamer were enabled for Windows Firefox, you could still use the Windows H.264 codec.

> > My personal view is that we should enable it in Firefox too to give the users
> > more choices (I don't want to see Firefox support less formats than any other
> > browser, I rather see that if a user has some content, then he knows he can
> > rely on Firefox/Fennec to play it back), but I have heard good arguments for
> > not doing so.
> 
> Arguments against giving Firefox multi-codec support are one thing. However
> when the Firefox marketing spin then cranks up the hyperbole about 3.5
> "enabling video for the web" I feel a rush of semi-digested food forcing it's
> way up my esophagus.
> 
> Oops, wait, it is usually about now when people reading my comments start to
> think that it's a good idea to tell me not to include non-technical commentary
> in bugzilla. I better shut up. 
> 

I really don't agree with not enabling GStreamer for Windows, Mac, and Linux. On Linux, GStreamer is always there, and the work required to get HTML5 video and audio to work is greatly reduced when GStreamer is used (Firefox doesn't always seem to play nice with PulseAudio, but that's only expected, since it really only works with raw ALSA). On Windows, GStreamer would allow users to plug in their own codecs, even if Mozilla themselves only shipped the Ogg codecs. And on Mac, GStreamer would allow QuickTime codecs to plug into Firefox.

Additionally, it would probably be easier to support subtitles and the myriad of containers that movies come in. For example, I have a collection of anime video in the Matroska container that use the Ogg Theora/Vorbis codecs along with providing subtitles. It would be very nice if Firefox could support Matroska, subtitles, and multi-audio stream videos. English isn't the only language in the world, as Mozilla knows very well! ;)

> > Maybe someone can repeat the arguments against enabling GStreamer in the
> > mainstream Firefox?
> 
> Careful Mike, that sounds dangerously like you are encouraging discussion in a
> bug fixing posting. Arrgh, watch out for the bugzilla thought police email
> you'll get any second :)

I think the main reasons against enabling GStreamer in mainstream Firefox were political, rather than technical, because the original patch for GStreamer backend in HTML5 <video> and <audio> was designed for desktop Firefox.

(In reply to comment #108)
> (In reply to comment #107)
> > (In reply to comment #105)
> <SNIP>
> > > Maybe someone can repeat the arguments against enabling GStreamer in the
> > > mainstream Firefox?
> > 
> > Careful Mike, that sounds dangerously like you are encouraging discussion in a
> > bug fixing posting. Arrgh, watch out for the bugzilla thought police email
> > you'll get any second :)
> 
> *LOL* Yes, I was considering the discussion to be moved to the Newsgroup, we
> really need a unified system... ARggghhh... wrong place to state such things -
> "All is well with the system we have" sorry.  Anyone to start a discussion in
> the newsgroups? -
> 
> Meanwhile I will try to figure out why my performance measurements on the
> integration fluctuates 40-50% which makes it very hard to establish a baseline
> to see how much the suggestions by Stefan Kost are improving the FPS....

Newsgroups... :(
Don't we have an IRC channel or something we could discuss this in? Realtime discussion is so much better than newsgroups, and mailing lists are EVIL!!!!
(In reply to comment #109)
> Additionally, it would probably be easier to support subtitles and the myriad
> of containers that movies come in. For example, I have a collection of anime
> video in the Matroska container that use the Ogg Theora/Vorbis codecs along
> with providing subtitles. It would be very nice if Firefox could support
> Matroska, subtitles, and multi-audio stream videos. English isn't the only
> language in the world, as Mozilla knows very well! ;)
> 
> I think the main reasons against enabling GStreamer in mainstream Firefox were
> political, rather than technical, because the original patch for GStreamer
> backend in HTML5 <video> and <audio> was designed for desktop Firefox.

Absolutely. There is no reason for Mozilla to reinvent the wheel. Just use the existing GStreamer backend.
Furthermore, Mozilla can live up to its political agenda of only supporting the "free" codecs natively, while H.264, AAC and so forth (in short, all the good stuff with awesome compression ratios, popularity, and hardware support) are decoded with GStreamer.

> Newsgroups... :(
> Don't we have an IRC channel or something we could discuss this in? Realtime
> discussion is so much better than newsgroups, and mailing lists are EVIL!!!!

See http://irc.mozilla.org/
How about irc://irc.mozilla.org/mozdev ?
Perhaps we can kick off a discussion :)

Regards, kwinz
(In reply to comment #110)
> (In reply to comment #109)
> > Additionally, it would probably be easier to support subtitles and the myriad
> > of containers that movies come in. For example, I have a collection of anime
> > video in the Matroska container that use the Ogg Theora/Vorbis codecs along
> > with providing subtitles. It would be very nice if Firefox could support
> > Matroska, subtitles, and multi-audio stream videos. English isn't the only
> > language in the world, as Mozilla knows very well! ;)
> > 
> > I think the main reasons against enabling GStreamer in mainstream Firefox were
> > political, rather than technical, because the original patch for GStreamer
> > backend in HTML5 <video> and <audio> was designed for desktop Firefox.
> 
> Absolutely. There is no reason for Mozilla to reinvent the wheel. Just use the
> existing GStreamer backend.
> Furthermore Mozilla can live up to it's political agenda to only support the
> "free" codes natively. While h.264, AAC and so forth (in short all the good
> stuff with awesome compression ratio, popularity, and hardware support) are
> decoded with GStreamer.
> 

Exactly. Though, Mozilla still wouldn't have to distribute the "other" codecs with their GStreamer package. The "free" codecs could be included in the GStreamer distribution shipped with Firefox. Opera is taking that route too: http://my.opera.com/core/blog/2009/12/31/re-introducing-video

However, they have no plans to include the DirectShow and QuickTime bridges for GStreamer-win32 and GStreamer-OSX.

> > Newsgroups... :(
> > Don't we have an IRC channel or something we could discuss this in? Realtime
> > discussion is so much better than newsgroups, and mailing lists are EVIL!!!!
> 
> See http://irc.mozilla.org/
> How about irc://irc.mozilla.org/mozdev ?
> Perhaps we can kick off a discussion :)
> 

I think irc://irc.mozilla.org/firefox would be a better choice for discussion...
I think the first step towards evaluating the suitability of GStreamer across all platforms should be to get it working, reviewed and landed. Then we can see how well it works on other platforms and codecs. 

What would be the plan for other platforms like OS/2, AIX and Solaris? Do they have GStreamer ports?
(In reply to comment #112)
> I think the first step towards evaluating the suitability of gstreamer across
> all platforms should be to get it working, reviewed and landed. Then we CSM see
> how well it works on other platforms and codecs. 
> 
> What would be the plan got other platforms lilt os/2, aix and solaris? Do they
> have gstreamer ports?

GStreamer has support for Windows, UNIX OSes (such as Mac OS X, Solaris, AIX, *BSD, etc.), and Linux. 

For Windows, the binaries haven't been updated in some time, but it still does build for Windows. Opera is using it and they are providing source code of their modifications: http://sourcecode.opera.com/gstreamer/

For Mac, I haven't really seen any builds of GStreamer for Mac other than in Songbird, but it is because of Songbird that I know GStreamer does work on Mac OS X.

It should be quite obvious that GStreamer works quite well on Linux, given the numerous GNOME applications that use it.

I have observed support for GStreamer on Solaris/OpenSolaris and *BSD, but I haven't used other UNIX OSes. However, an IBM document about GStreamer indicates that it is supported on AIX: http://www.ibm.com/developerworks/aix/library/au-gstreamer.html

As far as I know, GStreamer isn't supported on OS/2 currently.

I have already mentioned this before, but I'm reiterating for the sake of completeness: GStreamer has a DirectShow bridge for Windows, and a QuickTime bridge for Mac OS X, which allows it to use A/V input and output devices as well as codecs from the aforementioned frameworks within GStreamer and all GStreamer enabled applications.
Spamming this bug with advocacy comments is actually going to make it harder to track the technical issues, which will make it harder to get GStreamer support landed. I don't think that's what you want. I suggest the mozilla.dev.tech.platform newsgroup.
(In reply to comment #102)
<SNIP>
> 
> If you use xvimagesink, you won't need the ffmpegcolorspace. At least on the
> N900 I would strongly advise to not use it unless needed. What formats can
> FennecVideoSink consume? Does it do software composition?

The FennecVideoSink is a "pass-through" element, which has the purpose of moving data out of the GStreamer pipeline and into the MediaDecoder - it doesn't have any intelligent handling of the data (I'm actually wondering if AppSink would be more appropriate).  Its caps are set to GST_VIDEO_CAPS_BGRx, as this is the format the MediaDecoder expects the data in (this might be changing, btw).

> 
> On the audioside:
> Remove volume and use volume property on pulseaudio (that is on N900 I would
> suggest to configure pulsesink as the audiosink and not use gconfaudiosink -
> maybe a configure option)

Actually gconfaudiosink doesn't work by default on the N900 - so it falls back to using AutoAudioSink.  Do you think this gives any performance penalties?  The reasoning behind it is:

1) If the system has a GStreamer configuration, then use that one (gconfaudiosink)

2) If there is no configuration or the audio stream can't link with it, then use the best sink (AutoAudioSink) 
 
> remove audioconvert, pulse can take all raw formats

I don't assume all platforms will have a PulseAudio sink (at least the N810 didn't by default).  We could make it dependent on the formats supported by the actual sink used...

> 
> you might not even need the queue in mAudioBin.

I tried removing the queue; I saw a tendency for the fps to go down, but more importantly the audio started to get choppy:

queue+convert+volume+autoaudiosink - 1.61 fps
                     autoaudiosink - 1.58 fps (choppy audio)
queue               +autoaudiosink - 1.69 fps (only two measurements)
queue        +volume+autoaudiosink - 1.63 fps
queue+convert       +autoaudiosink - 1.67 fps

(The above measurements are an average of 3 runs on a ~44s video - the biggest difference I saw between measurements in a given setup was < 0.05 fps)

> 
> Otherwise this is fine. One small question though. Do you need the app-src, you
> can't you use gst_element_make_from_uri() and let gst fetch the data directly?

As far as I understand using gst_element_make_from_uri() would mean bypassing the caching, security etc. that Firefox takes care of?
(In reply to comment #115)
> (In reply to comment #102)
> > Otherwise this is fine. One small question though. Do you need the app-src,
> > can't you use gst_element_make_from_uri() and let gst fetch the data directly?
> 
> As far as I understand using gst_element_make_from_uri() would mean bypassing
> the caching, security etc. that Firefox takes care of?

Right. HTTP caching, cookies, HTTP auth, HTTPS, HTML5 offline appcache, stream throttling, media cache ... they all depend on Gecko handling the load.
(In reply to comment #116)
> > As far as I understand using gst_element_make_from_uri() would mean bypassing
> > the caching, security etc. that Firefox takes care of?
> 
> Right. HTTP caching, cookies, HTTP auth, HTTPS, HTML5 offline appcache, stream
> throttling, media cache ... they all depend on Gecko handling the load.

On the other hand, there could be some unexpected overhead there. At least a benchmark using gst_element_make_from_uri() could be worth doing, to get an idea.
That might provide useful data, but in the long run we have to use Gecko to do the load.
I was not suggesting to actually use it. Only to get an idea if Gecko adds a significant overhead on this part. If it appears it doesn't, at least we'll know there's nothing to be done on that part in Gecko.
(In reply to comment #115)
> (In reply to comment #102)
> <SNIP>
> > 
> > If you use xvimagesink, you won't need the ffmpegcolorspace. At least on the
> > N900 I would strongly advise to not use it unless needed. What formats can
> > FennecVideoSink consume? Does it do software composition?
> 
> The FennecVideoSink is a "pass-through" element, which has the purpose of
> moving data out of the GStreamer pipeline and into the MediaDecoder - it
> doesn't have any intelligent handling of the data (actually wondering if
> AppSink would be more appropriate).  Its caps is set to: GST_VIDEO_CAPS_BGRx as
> this is what the MediaDecoder expects the data in (this might be changing btw).
> 

This is not good. If you force RGB caps you request colorspace conversion from ffmpegcolorspace. First, ffmpegcolorspace adds setup overhead (that's why the N900 media player avoids it). Second, the colorspace conversion is not even NEON optimized. Third, the colorspace conversion is doing a memory-to-memory operation. Also, FennecVideoSink is doing another copy in gst_fennecvideosink_put(). If there is an alternative to SetRGBData() to get the target location, it would be better to implement pad_alloc() so that the colorspace converter can write the video into the target buffer directly.
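
A minimal sketch (GStreamer 0.10 API) of that pad_alloc/buffer_alloc idea: hand upstream a buffer that wraps the decoder's target memory so the colorspace converter writes straight into it and the extra copy in _put() disappears. GetTargetRGBBuffer() is a hypothetical stand-in for however the media framework would expose its destination buffer:

static GstFlowReturn
gst_fennecvideosink_buffer_alloc(GstBaseSink *aSink, guint64 aOffset,
                                 guint aSize, GstCaps *aCaps, GstBuffer **aBuf)
{
  // Memory owned by the media framework (hypothetical accessor).
  guint8 *target = GetTargetRGBBuffer(aSize);
  if (!target)
    return GST_FLOW_ERROR;

  GstBuffer *buffer = gst_buffer_new();
  GST_BUFFER_DATA(buffer) = target;
  GST_BUFFER_SIZE(buffer) = aSize;
  // MALLOCDATA is deliberately left NULL so GStreamer never frees memory it
  // does not own.
  gst_buffer_set_caps(buffer, aCaps);
  *aBuf = buffer;
  return GST_FLOW_OK;
}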

Please remove these from gst_fennecvideosink_put() 
+      gst_buffer_ref(aFennecvideo);
+      gst_buffer_unref(aFennecvideo);

> > 
> > On the audioside:
> > Remove volume and use volume property on pulseaudio (that is on N900 I would
> > suggest to configure pulsesink as the audiosink and not use gconfaudiosink -
> > maybe a configure option)
> 
> Actually gconfaudiosink doesn't work pr. default on the N900 - so it falls back
> to using AutoAudioSink.  Do you think this gives any performance penalties?

Only marginal ones from the deep bin nesting.

>  -
> The reasoning behind it is:
> 
> 1) If the system has a GStreamer configuration, then use that one
> (gconfaudiosink)
> 
> 2) If there is no configuration or the audio stream can't link with it, then
> use the best sink (AutoAudioSink) 
> 
> > remove audioconvert, pulse can take all raw formats
> 
> I don't assume all platforms will have a pulse audio sink? (at least the N810
> didn't pr default)  We could make it dependent on the formats supported by the
> actually sink used...
> 
> > 
> > you might not even need the queue in mAudioBin.
> 
> I tried removing the queue, it meant I saw a tendency for the fps to go down,
> but more importantly the audio started to get choppy:
> 
> queue+convert+volume+autoaudiosink - 1.61 fps
>                      autoaudiosink - 1.58 fps (choppy audio)
> queue               +autoaudiosink - 1.69 fps (only two measurements)
> queue        +volume+autoaudiosink - 1.63 fps
> queue+convert       +autoaudiosink - 1.67 fps
> 
> (The above measurements are an average of 3 runs on a ~44s video - the biggest
> difference I saw between measurements in a given setup was < 0.05 fps)
> 

Okay, this tells us that audio is not the big CPU consumer anyway, as even dropping convert and volume does not help much.

> > 
> > Otherwise this is fine. One small question though. Do you need the app-src, you
> > can't you use gst_element_make_from_uri() and let gst fetch the data directly?
> 
> As far as I understand using gst_element_make_from_uri() would mean bypassing
> the caching, security etc. that Firefox takes care of?

As others commented, it might be worth trying it. Anyway, I believe the problem is the video rendering part. Whatever happens behind fennecvideosink, if you can make it accept YUV data and avoid the memcpy you should see a big performance gain. On the N900 memory bandwidth is limited, and doing two copies of raw video buffers in a row is definitely impacting the fps badly.

Thanks for testing all the suggestions!
One more comment: gst_fennecvideosink_buffer_alloc() is nonsense as it stands. Please disable it. Providing a buffer_alloc only makes sense if you *don't* need to do a new[] in there - for example if you have a pool of, let's say, 4 buffers and recycle them. It makes even more sense if those buffers come from whatever composes the final view.
It would probably be best to benchmark in a standalone XUL app instead of Fennec, for now.

We are setting up infrastructure so that the YUV conversion can be done in hardware and the resulting browser window be composited together with OpenGL. See bug 538323.
(In reply to comment #119)
> I was not suggesting to actually use it. Only to get an idea if Gecko adds a
> significant overhead on this part. If it appears it doesn't, at least we'll
> know there's nothing to be done on that part in Gecko.

The measured results are (playing back from local file on the device):

app_src                                : 1.62 fps
        gst_element_make_from_uri      : 1.31 fps
        gst_element_make_from_uri+queue: 1.65 fps

NOTE: I have changed the fps calculating algorithm slightly (more correct now) - this is the reason why the "baseline" has moved from 1.61 fps to 1.62 fps.

I'm continuing trying your other suggestions.
(In reply to comment #120)
<SNIP>
> As others commented, it might be worth trying it. Anyway, I belive the problem
> is the video rendering part. Whatever happens behind fennecvideosink, if you
> can make that accept YUV data and if you can avoid the memcpy you should see a
> big performance gain. On n900 memory bandwidth could be better and doing 2
> copies of raw video buffers in a row is definitely impacting the fps badly.

For the sole purpose of testing the penalty of having the YUV->RGB conversion done by ffmpegcolorspace, I modified the fennecvideosink to accept YUV and pushed this unmodified to the screen (the output on screen is more or less nonsense). As the YUV is 16 bits per pixel and SetRGBData expects 32 bits, I ran the test with half the normal width (so the measurements can't be compared to the others that I have posted):

with ffmpegcolorspace: 2.44 fps
without ffmpegcolorspace: 2.77 fps

So a significant difference, > 10%, but not earth-shaking.

for reference the original video is 25 fps
(In reply to comment #124)
> (In reply to comment #120)
> <SNIP>
> > As others commented, it might be worth trying it. Anyway, I belive the problem
> > is the video rendering part. Whatever happens behind fennecvideosink, if you
> > can make that accept YUV data and if you can avoid the memcpy you should see a
> > big performance gain. On n900 memory bandwidth could be better and doing 2
> > copies of raw video buffers in a row is definitely impacting the fps badly.
> 
> For the sole purpose of testing the penalty of having the YUV->RGB conversion
> done by ffmpegcolorspace - I modified the fennecvideosink to accept YUV and
> pushing this unmodified to the screen (the output on screen is more or less
> nonsense) as the YUV is 16-bit pr pixel and the SetRGBData expects 32-bit I ran
> the test with half the normal width (so the measurements can't be compared to
> the others that I have posted):
> 
> with ffmpegcolorspace: 2.44 fps
> without ffmpegcolorspace: 2.77 fps
> 
> So a significant difference > 10% but not like earth shaking 
> 
> for reference the original video is 25 fps

What's the resolution of the video? Could you also modify FennecVideoSink to just drop the received buffer (gst_buffer_unref()) right away (so that nothing is seen)? If that does not change the numbers significantly, then it is not the rendering.
We know rendering to the screen is slow, and the layers work will make it massively better.  I wouldn't spend too much time measuring it at this point.
(In reply to comment #125)
<SNIP>
> Whats the resolution of the video? Could you also modify FennecVideosink to
> just drop the received buffer (gst_buffer_unref()) right away (so that nothing
> is seen). If that is not changing the numbers significantly then it is not the
> rendering.

The resolution is: 360 x 288

It's an excellent idea to test the framerate without actually redrawing the screen - now why didn't I think of that? - anyway the results are:

Without drawing                             : 22.84 fps
Without drawing and without ffmpegcolorspace: 23.54 fps 

(In reply to comment #126)
> We know rendering to the screen is slow, and the layers work will make it
> massively better.  I wouldn't spend too much time measuring it at this point.

I understand your worry - but without measuring it we wouldn't know if the rendering was the main problem for the GStreamer integration.  If something else took the resources there would be fewer frames to render, hence slow rendering would have less of an impact. 

Now, having the above measurement, we can't say that the rendering is the only thing slowing it down, but we can say that the rendering is the main thing we have found so far, and that the color conversion is not that big a problem, < 5% *).  What could be interesting to see now is how many frames per second the current rendering engine can handle - or, more ideally, how big a percentage of the time is spent decoding and how much is spent moving the decoded frames to the display, something that I don't know how to measure on the N900.

*) I have noted the difference between the < 5% here and the > 10% impact measured at the low frame rate, but can't come up with a definitive explanation of why.  I can theorize that it has to do with the system at times during playback being limited by the 25 fps frame rate of the video rather than by the available resources, but I can't prove it at this point (btw I'm not looking to prove it either).
Could somebody update the patch to work with the Firefox 3.6 source tree? The current patch fails to apply to it.
Patch updated to tot:

Applies on top of (mozilla-central):
changeset:   37404:97745a2b2de9
tag:         tip
user:        Dão Gottwald <dao@mozilla.com>
date:        Fri Jan 22 09:44:16 2010 +0100
summary:     Bug 482065 - Allow rearranging of thumbnails in the all tabs panel. r=enn
(mobile):
changeset:   1318:0f1676f43e39
tag:         tip
user:        Brad Lassey <blassey@mozilla.com>
date:        Thu Jan 21 12:48:08 2010 -0500
summary:     bug 537181 - Add required prefs and set proper config options to support app updating r=mfinkle a=stuart
Attachment #406209 - Attachment is obsolete: true
(In reply to comment #128)
> Could somebody update the patch to work with the Firefox 3.6 source tree? The
> current patch fails to be applied to it.

Sorry, I can't help you there, but I have just updated it to something more recent.
I have a couple of questions about this implementation:

- Will this support using GST plugins for output (i.e. using alsa/oss/pulse/etc. for audio and gl/xvideo/XvMC/VDPAU/etc. for video)? Or will it only be used for demuxing and decoding?

- Will it supersede the already built-in OGG/Theora/Vorbis/etc. support if available (there are GST plugins)? This is really only meaningful if GST output plugins are used.
(In reply to comment #131)
> I have a couple questions about this implantation:
> 
> - Will this support using GST plugins for output (i.e. using alsa/oss/puls/etc.
> for audio and gl/xvideo/XvMC/VDPAU/etc. for video)? Or will it only be used for
> demuxing and decoding?

The current integration first tries "gconfaudiosink" (I believe that means whatever the user's system is configured to use), and in case that hasn't been set up or can't link with the specific audio stream it uses "autoaudiosink", which lets the GStreamer library choose the audio sink.

> - Will it supersede the already built-in OGG/Theora/Vorbis/etc. support if
> available (there are GST plugins)? This is really only meaningful if GST output
> plugins are used.

No, the system will first check whether any of the built-in codecs can handle the decoding; if they don't support the format, then GStreamer will try to decode it (using "decodebin2"). The integration does not limit or control which decoder decodebin2 chooses, so if there is a GStreamer decoder installed on the system it should be a candidate.

Note that, as you are probably aware, this integration is initially only meant for Maemo devices running Fennec - and a discussion of suggestions for other platforms should take place in the newsgroups and not here.
(In reply to comment #132)
> The current integration, first tries "gconfaudiosink" (I believe that means,
> what ever the users system is configured to use) and in case that hasn't been
> setup or can't link with the specific audio stream it uses "autoaudiosink"
> which will let the GStreamer library choose the audio sink.

Sorry, I should have looked closer at the implementation. I noticed that the video is being handled with a Fennec video sink, which Roc mentioned would be composited and output as OpenGL.

While this is probably much better than the current software rendering, I think the use of GST's own video sinks (i.e. a more direct path to hardware accelerated video) would be preferable. I think that would be a definite boost to the video watching experience as well as to XULRunner's performance and resource usage.

This should definitely be possible, as many GTK/Qt media applications can do this.

Though, at this point, I'm not sure how useful it would be to Fennec.

> No, the system will first try if any of the build in codecs can handle the
> decoding, if they don't support the format, then GStreamer will try to decode
> it (using "decodebin2"), the integration does not limit or control which
> decoder the decodebin2 chooses, so if there is a GStreamer decoder installed in
> the system it should be a candidate.

Again, not sure how pertinent this is to Fennec, but on other systems this could lead to undesirable outcomes. IIRC XULRunner only supports ALSA (on Linux). If a user configured GST to use PulseAudio, different audio back-ends would be used for different media.

The same would be true if native GST video sinks were ever used.

IMHO if GST is supported (and initialized) on a platform, it should take precedence.

> Note that as you are probably aware: Initially this integration is only meant
> for Maemo devices running Fennec - and a discussion of suggestion for other
> platforms should take place in the newsgroups and not here.

I was not aware of that. That seems like an odd decision. It seems like this would get a much larger testing audience if it were on a first class platform (i.e. Linux).

Another thing I forgot to ask. Does this cover GST support for the Audio element as well?

Also, forgive my ignorance, but where is the newsgroup where this should be discussed?
(In reply to comment #133)
> (In reply to comment #132)
> > The current integration, first tries "gconfaudiosink" (I believe that means,
> > what ever the users system is configured to use) and in case that hasn't been
> > setup or can't link with the specific audio stream it uses "autoaudiosink"
> > which will let the GStreamer library choose the audio sink.
> 
> Sorry, I should have looked closer at the implementation. I noticed that the
> video is being handled with a Fennec video sink, which Roc mentioned would be
> composited and output as OpenGL.
> 
> While this is probably much better than the current software rendering, I think
> that the use of GST's video sinks (i.e. a more direct path to hardware
> accelerated video). I think that would be a definite boost to the video
> watching experience as well as XULRunner's performance and resource usage.

If we look at the raw performance, then it will probably be hard to beat a solution where everything is kept inside GStreamer, with a GStreamer video sink optimized for the given platform, as you suggest.  

However, it will probably simplify things a bit if all the video output from the different decoders is done in the same way (I guess the solution Roc has told you about).  

It is relatively simple to change the output sink in the integration I have done, so when we have the generic solution it should be easy to measure and compare the performance of a pure GStreamer solution and one where we have a more generic way of displaying the individual video frames to see if it's worth the extra work.


<SNIP>
> Again, not sure how pertinent this is to Fennic, but on other systems this
> could lead to undesirable outcomes. IIRC XULRunner only supports ALSA (on
> Linux). If a user configured GST to use pulseaudio, different audio back-ends
> would be used for different media.

If it is at some point decided that we should use GStreamer on other platforms, the requirements for the integration will change, and hence the solution might change...

<SNIP>
> Also, forgive my ignorance, but where is the newsgroup where this should be
> discussed?

I think mozilla.dev.platform (server: news.mozilla.org) was suggested previously?  (Where, btw, I would be happy to comment on your valid points about where this should be used, which I have left out above.)
(In reply to comment #121)
> One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> recycle them. It makes even more sense if those buffers come from whatever
> composes the final view.

Sorry, I missed giving an answer to this one previously - the reason for using the buffer_alloc function is that ownership of the buffer needs to be passed on to the media framework, which will use delete [] to release the memory, hence it needs to be allocated with new [] - GStreamer uses alloc/free, which are not compatible with new/delete.
(In reply to comment #135)
> (In reply to comment #121)
> > One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> > disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> > need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> > recycle them. It makes even more sense if those buffers come from whatever
> > composes the final view.
> 
> Sorry missed giving an answer to this one previously - the reason for using the
> buffer_alloc function is that ownership of the buffer needs to be passed on to
> the media framework, which will use delete [] to release the memory, hence it
> needs to be allocated with new [] - GStreamer uses alloc/free which are not
> compatible with new/delete.

Btw, I know of the free_func member of the GstBuffer type - but it was not introduced until 0.10.22, which is more recent than what our current build chain for Fennec supports (a chain set up with support of the N810 in mind, which has now been dropped as far as GStreamer video is concerned) - hence the alternative "risky" way of handling it that currently exists in the code.  I'm considering alternatives...
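
For reference, a minimal sketch of the free_func approach (requires GStreamer >= 0.10.22, i.e. newer than the build chain discussed above); the wrapper is illustrative only:

static void DeleteRGBData(gpointer aData)
{
  delete [] static_cast<guint8*>(aData);
}

// Wrap a new[]-allocated frame so GStreamer releases it with delete[]
// instead of g_free().
static GstBuffer* WrapNewAllocatedData(guint aSize)
{
  guint8 *data = new guint8[aSize];
  GstBuffer *buffer = gst_buffer_new();
  GST_BUFFER_DATA(buffer) = data;
  GST_BUFFER_MALLOCDATA(buffer) = data;
  GST_BUFFER_FREE_FUNC(buffer) = DeleteRGBData;
  GST_BUFFER_SIZE(buffer) = aSize;
  return buffer;
}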
(In reply to comment #133)
> Sorry, I should have looked closer at the implementation. I noticed that the
> video is being handled with a Fennec video sink, which Roc mentioned would be
> composited and output as OpenGL.

... soon, hopefully, with the layer system, but not yet ...

> While this is probably much better than the current software rendering, I
> think that the use of GST's video sinks (i.e. a more direct path to hardware
> accelerated video). I think that would be a definite boost to the video
> watching experience as well as XULRunner's performance and resource usage.
> 
> This should definitely be possible, as many GTK/Qt media applications can do
> this.

It's essential for browser <video> rendering to participate fully in the browser's rendering pipeline. This means that we need to be able to draw HTML content over the <video> (including a Web page's video controls --- even the browser's video controls --- and arbitrary transparent/translucent content), we need to be able to clip <video> to arbitrary rectangles or even paths (using SVG's clip-path), we need to be able to transform <video> using CSS and SVG transforms, we need to be able to apply arbitrary pixel operations using SVG filters, we need to be able to draw video to an HTML5 <canvas>, and we need to be able to take snapshots of <video> frames for printing and print preview. Can you do all that with GStreamer video sinks? I guess we could if they all support readback.

> Another thing I forgot to ask. Does this cover GST support for the Audio
> element as well?

Yes, it should.

> Also, forgive my ignorance, but where is the newsgroup where this should be
> discussed?

mozilla.dev.platform on news.mozilla.org would be a good place.
The plan in bug 538323 for the built-in Theora codec boils down to, for a GL layers backend: decode Theora to a YUV layer and then, if the layer is not nested in a tricky container (clip-path, filters, transforms, etc), composite it together with the other layers (including separate layer(s) for content over the video) using GL, with YUV conversion happening somewhere along the way, possibly only at draw time using a GL shader, or possibly earlier using bc-cat texture streaming on the N900 for example. If the video is in a container the layer system can't handle yet (clip-path, filters, transforms, etc), we'll have to read back RGB data from the video layer and draw that using cairo. That will be slow, but we'll incrementally fix the slowness by teaching layers how to handle the various container effects.

It seems to me that the easiest way to integrate GStreamer here is to get YUV buffers from GStreamer so that we can reuse that rendering infrastructure. The next easiest way would be to get back a GL texture that the GL compositor can use. In the general case, where we have to composite video together with other layers, I think we'll need the video to be available as something equivalent to a GL texture.
--- Comment #132 from Mike Kristoffersen <mkristoffersen@mozilla.com> 2010-01-25 00:27:08 
> Note that as you are probably aware: Initially this integration is only meant
> for Maemo devices running Fennec - and a discussion of suggestion for other
> platforms should take place in the newsgroups and not here.

The bug summary/title doesn't say anything about this being specific to Maemo or Fennec. I'd certainly argue that the reasons it would be a good idea in Maemo on Fennec would apply outside that platform, and that the bug/feature/task tracking system here is the perfect place for that discussion. Even if it's really not, the bug should be updated to indicate the narrow scope of the discussion here.
(In reply to comment #139)
As directed, this conversation belongs in the newsgroups, not in the bug.
(In reply to comment #136)
> (In reply to comment #135)
> > (In reply to comment #121)
> > > One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> > > disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> > > need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> > > recycle them. It makes even more sense if those buffers come from whatever
> > > composes the final view.
> > 
> > Sorry missed giving an answer to this one previously - the reason for using the
> > buffer_alloc function is that ownership of the buffer needs to be passed on to
> > the media framework, which will use delete [] to release the memory, hence it
> > needs to be allocated with new [] - GStreamer uses alloc/free which are not
> > compatible with new/delete.
> 
> Btw, I know of the free_func member of the gst_buffer type - but it was not
> introduced until 0.10.22 which is more recent than our current build chain for
> Fennec supports and was made with the support of the N810 is mind (which has
> now been dropped as far as GStreamer video is concerned), hence the alternative
> "risky" way of handling it that currently exists in the code.  I'm considering
> alternatives...

Sorry from my side too. I did not see that gst_fennecvideosink_put() is taking ownership of the passed buffers. Not sure how easy this is to change, but if you can change it, we could enable buffer pools on the dspdecoder. This would reduce the mmap overhead as the dsp would reuse the very same buffers. The effect would not be dramatic though ...
Flags: wanted-fennec1.0?
Does it apply to firefox 3.6?
Note when building for Maemo you need at least the FREMANTLE SDK (see: http://developer.mikek.dk/#post50 for how to set it up); this is due to use of GStreamer features that aren't found in our normal Maemo build chain.

I'll be off-line for the next couple of weeks, if anyone wants to do work on this, searching for TODO's will give an initial starting point - the main areas that need work are:

1) Ensuring thread safety, it's currently being re-done, half-done, faulty, etc.

2) Unifying the assertions, replace with NS_ABORT_IF_FALSE as appropriate.

3) In case the audio-pipeline doesn't support volume control, do on-off volume control.

4) Call start/stop progress at the correct places

5) Send time-updated events at the correct places

6) Fix seeking

7) Make the patch work for audio only streams

8) Implement missing functions in nsGStreamerLink

9) Fix the assert in: nsPresContext::NotifyInvalidation
Attachment #422961 - Attachment is obsolete: true
(In reply to comment #129)
> Created an attachment (id=422961) [details]
> Patch updated to TOT as described in the comment section

Hi guys,

A quick introduction: I'm a developer for the fremantle browser. In our microb configuration we have that patch (id=422961) applied, and with it we run into a nasty deadlock with a test page (http://jumi.lut.fi/~vanhoja/www2/misc/audio_dom_play_pause2.html). The situation basically was that an nsGStreamerInputBin was blocked in mNsMediaStream->Read while a shutdown was being done. Calling mGStreamerState.Shutdown() in nsGStreamerPipeline::Shutdown resulted in a deadlock because of that unfinished read. I'll attach a patch showing how I ensured this kind of situation would not happen.
Seems like the makefile slipped out of my previous patch, it will take some time to generate an updated patch, so in the meantime this makefile goes into: .../content/media/gstreamer
FYI: latest Opera 10.51 Linux builds now have GStreamer backend for HTML5 video.
Perhaps a parity-opera flag for this bug now?
The identity of the backend seems irrelevant from an end user point of view (new features supported, or features lost, are what's relevant), so I don't think this should be parity-anything.
Considering that it seems Mozilla will not support any format other than Ogg Vorbis and Theora in the **** Ogg container unless it can disclaim responsibility for other formats through something like GStreamer, it is definitely important to make it as likely as possible that GStreamer support gets integrated.

If Mozilla doesn't want to support anything other than Theora and Vorbis in the official builds, then you could just include those codecs with GStreamer bundled with the Firefox builds.

Also, WHY shouldn't we take advantage of the work of the GStreamer people to offer high quality codec and container support? GStreamer is a lot better than the current system for Firefox...
(In reply to comment #149)
> If Mozilla doesn't want to support anything other than Theora and Vorbis in the
> official builds, then you could just include those codecs with GStreamer
> bundled with the Firefox builds.
GStreamer is GPL, right?  The GPL is not more permissive than the MPL/GPL/LGPL tri-license Mozilla currently uses, so we couldn't ship it.
(In reply to comment #150)
> GStreamer is GPL right?  

It's LGPL.
Yes, GStreamer is LGPL, but many of the plugins are GPL. And regardless, I doubt that Mozilla would include any of the interesting plugins (i.e. gst-plugins-ffmpeg). So users would still need to install plugin packages.

Most Linux distros already have it in their repositories, so I don't see any need to ship it on those platforms.

Packaging is only really an issue on Windows systems, if Mozilla even wanted to include GST support on Windows. Builds are available from the GStreamer WinBuilds project ( http://www.gstreamer-winbuild.ylatuya.es ), so I don't think it's unreasonable to ask users to download and install it themselves.

I'm not really sure of the state of GST packages on Mac OS X, though it is a supported platform.
(In reply to comment #152)
> Yes, GStreamer is LGPL, but many of the plugins are GPL. And regardless, I
> doubt that Mozilla would include any of the interesting plugins (i.e.
> gst-plugins-ffmpeg). So users would still need to install plugin packages.
> 
> include GST support on Windows. Builds are available from the GStreamer
> WinBuilds project ( http://www.gstreamer-winbuild.ylatuya.es ), so I don't
> think it's unreasonable to ask users to download and install it themselves.

From GStreamer WinBuilds we also provide an LGPL build, which includes an LGPL version of ffmpeg (built without the -gpl option). This GStreamer build can decode many of the most common formats, since ffmpeg decoders are part of ffmpeg itself and don't rely on the external GPL libraries that are the ones triggering the GPL license. 
Therefore Mozilla could ship GStreamer binaries on Windows with (almost) full decoding support. The only problem is the H264 decoder, due to patent issues, but ffmpeg can also be built without this decoder if needed.
I would request that everyone involved not discuss and plan ahead of time. Let the technical problems be solved first; then the decision of including gstreamer support in the default builds can be discussed. As it stands this is only adding noise to this bug (which is in any case not for non-technical discussion).

Thank you.
I want to point out here that Mozilla doesn't want to ship libraries that are LGPL-only. We only ship libraries that are at least LGPL and MPL compatible.

I don't really like this policy, but that's what it is. If you want to debate it, go to the mozilla.governance newsgroup, don't bother debating it here.
(In reply to comment #156)
> I want to point out here that Mozilla doesn't want to ship libraries that are
> LGPL-only. We only ship libraries that are at least LGPL and MPL compatible.
> 
> I don't really like this policy, but that's what it is. If you want to debate
> it, go to the mozilla.governance newsgroup, don't bother debating it here.

Why ship the libraries ? Do you ship GTK+ libraries with official mozilla builds for example ?
(In reply to comment #157)
> Why ship the libraries ? Do you ship GTK+ libraries with official mozilla
> builds for example ?

No.
IMHO, I would rather use an unstable gstreamer backend that can be fixed with time, than stick to the **** flash implementation right now on linux for video playback.

For Windows, I think it is a bad idea to use gstreamer, since Windows has DirectShow, which is native and, if I am not wrong, has an architecture like gstreamer. Plus it would be easier to integrate with the Direct2D hardware acceleration which is now in trunk.
(In reply to comment #159)
> For Windows, I think it is a bad idea to use gstreamer, since Windows have
> DirectShow which is native and has an architecture like gstreamer if I am not
> wrong. Plus it would be easier to integrate with Direct2D hardware acceleration
> which is now in trunk.

yes, but then you are in wrong bug report ... go to bug 435339
Hi, 

I'm Kishore.
I'm going to work on this bug.
I tried to find the above patch and searched on http://mxr.mozilla.org/,
but I couldn't find anything related to it.
So I need your help to get started.

Can you provide me the details below?
1. Where can I find the source code?
2. How do I compile & test?

(In reply to comment #143)
> Created an attachment (id=427098) [details]
> Merged to the tree: mozilla-central: changeset:   38145:38929916c6d1 mobile:
> changeset:   1341:fd2a78968403
> 
> Note when building for Meamo you need at least the FREMANTLE SDK (see:
> http://developer.mikek.dk/#post50 for how to set it up), this is due to use of
> GStreamer features that aren't found in our normal Maemo build chain.
> 
> I'll be off-line for the next couple of weeks, if anyone wants to do work on
> this, searching for TODO's will give an initial starting point - the main areas
> that need work are:
> 
> 1) Ensuring thread safety, it's currently being re-done, half-done, faulty,
> etc.
> 
> 2) Unifying the assertions, replace with NS_ABORT_IF_FALSE as appropriate.
> 
> 3) In case the audio-pipeline doesn't support volume control, do on-off volume
> control.
> 
> 4) Call start/stop progress at the correct places
> 
> 5) Send time-updated events at the correct places
> 
> 6) Fix seeking
> 
> 7) Make the patch work for audio only streams
> 
> 8) Implement missing functions in nsGStreamerLink
> 
> 9) Fix the assert in: nsPresContext::NotifyInvalidation
(In reply to comment #161)
> Hi, 
> 
> I'm Kishore.
> I'm going to work on this bug.
> I tried to get the above patch details  & I searched in the
> http://mxr.mozilla.org/
> But I couldn't find anything related to above patch.

of course, because this patch has not yet landed!

you should try to apply this patch
https://bugzilla.mozilla.org/attachment.cgi?id=427098 first, and then check how it works
Kishore, <https://developer.mozilla.org/En/Developer_Guide>
For generic questions like these, please search Google or see <http://www.mozilla.org/community/forums/>, not here.
tried on youtube html5 and openvideo.dailymotion.com, but no video, in the console i see :

No of frames displayed : 0
maybe we need more debug info in the patch
ah, i forgot, firefox-3.6.3 is the version i used with the patch
attaching the patch.
Below are the updates
1. RGB to YUV conversion related stuff added.
2. colorspace conversion element removed from the pipeline, because the image layer converts from I420 to RGB while setting the image (SetVideoData)
3. Missing Makefile.in added to the patch (content/media/gstreamer/Makefile.in)
4. videosink output format set to YUV I420
5. configure macros check added for gstreamer flags (like ifdef MOZ_GSTREAMER)
6. NS_NEW_RUNNABLE_METHOD macro added
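For reference, the I420 layout assumed when handing the single decoded buffer to the image layer is the usual planar one (sketch; variable names are illustrative):

  // For a width x height frame packed into a single I420 buffer:
  //   Y plane:  width * height bytes,           stride = width
  //   Cb plane: (width/2) * (height/2) bytes,   stride = width/2
  //   Cr plane: (width/2) * (height/2) bytes,   stride = width/2
  guint8* y  = data;
  guint8* cb = y  + width * height;
  guint8* cr = cb + (width / 2) * (height / 2);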
Attachment #427098 - Attachment is obsolete: true
Attachment #434615 - Attachment is obsolete: true
Attachment #451253 - Attachment is obsolete: true
Assignee: mkristoffersen → nobody
the patch failed to apply; fixed the rejects
Attachment #461162 - Flags: review?(chris.double)
changes made on top of gstreamer mozilla original patch
Attachment #461163 - Flags: review?(chris.double)
Attached patch animation waiting logo fix (obsolete) — Splinter Review
The problem was that the waiting-animation logo kept appearing even after playback started.
The attached patch has the fix.
Attachment #461165 - Flags: review?(chris.double)
Attached patch seek issue fix (obsolete) — Splinter Review
The attachment has the fix for the seek issue.
Attachment #461166 - Flags: review?(chris.double)
Attached patch OpenGL rendering issue (obsolete) — Splinter Review
OpenGL rendering was not working with the gstreamer backend.
The attachment has the fix
Attachment #461168 - Flags: review?(chris.double)
(In reply to comment #173)
> Created attachment 461168 [details] [diff] [review]
> OpenGL rendering issue
> 
> OpenGL rendering was not working with the gstreamer backend.
> The attachment has the fix

Does it play videos or not yet ?
Yes It works fine now.
> 
> Does it play videos or not yet ?

Yep it plays with non-fennec (direct rendering) very well
Is it in a state where it can be merged into mozilla-central now? If not, why? If so, when?
(In reply to comment #177)
> Is it in a state where it can be merged into mozilla-central now?
yes
> If not, why?
standards
> If so, when?
as soon as possible
Thanks for the patches - It will be easier for me to review as one rolled up patch. Can you provide a rolled up patch and request review on that?
I get a compile error with these patches applied:

content/media/gstreamer/nsGStreamerPipeline.cpp:385: error: ‘GST_MESSAGE_QOS’ was not declared in this scope

Any thoughts what might be causing this?
From the gst docs:

 * @GST_MESSAGE_QOS: A buffer was dropped or an element changed its processing
 * strategy for Quality of Service reasons. Since: 0.10.29

0.10.29 is pretty new - you're probably building against a somewhat older version.
We had one problem with outdated gstreamer between maemo5/maemo6/desktop....  probably this is another one... 
what gstreamer version do you have?
Thanks, I'm building against the GStreamer that ships with Ubuntu 10.04. It's gst 0.10.28. Are we expecting to be able to build on current distributions?
If I run the content/media mochitests with these patches applied it crashes after the first couple of tests. Is that a known problem?
if you require a minimum version of gstreamer, configure should be checking for it.
You'll need to get someone else to review the build system changes (configure.in, etc). Maybe Ted Mielczarek (:ted)?

For the changes to gfx/ycbcr to add ARM support please split that into a bug specific for that and add me as the reviewer - I'll review/advise on the changes for that there. We can then get that sorted/landed pretty quickly I suspect.
The changes to layout/base/nsPresContext (mDOMPaintEventPending stuff) should be reviewed by a layout peer, possibly :roc. Might be worth splitting that into a separate patch for them.
The changes to gfx/layers should also be split into a separate patch and the gfx/layers peer should review it. Probably :roc again. Note that this code will require changes if bug 577843 and/or bug 583138 lands before this code.
I'll start reviewing the main content/media/gstreamer code on Monday.
build changes review for gstreamer backend support
Attachment #461486 - Flags: review?(ted.mielczarek)
Attached patch gfx/layers changes for arm (obsolete) — Splinter Review
ARM NEON optimization added for YCbCr (YV12 / I420) to RGB16_565 conversion
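For reference, the scalar equivalent of the conversion looks roughly like this (assuming full-range BT.601 coefficients; the NEON code may use a slightly different fixed-point variant, so treat this as a sketch only):

  // Scalar reference for one pixel.
  static inline PRUint16
  YCbCrToRGB565(int y, int cb, int cr)
  {
    int r = y + ((91881 * (cr - 128)) >> 16);
    int g = y - ((22554 * (cb - 128) + 46802 * (cr - 128)) >> 16);
    int b = y + ((116130 * (cb - 128)) >> 16);
    r = r < 0 ? 0 : (r > 255 ? 255 : r);
    g = g < 0 ? 0 : (g > 255 ? 255 : g);
    b = b < 0 ? 0 : (b > 255 ? 255 : b);
    // Pack as RGB565 with red in the high bits.
    return static_cast<PRUint16>(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));
  }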
Attachment #461487 - Flags: review?(roc)
The attached patch fixes the compilation error.
Attachment #461488 - Flags: review?(roc)
as per your suggestion I've split-up the attachment 461163 [details] [diff] [review]

and the remaining patches (attachment 461165 [details] [diff] [review], 461166, 461168) are issue specific.
Attachment #461489 - Flags: review?(chris.double)
Thanks for splitting the stuff out so quickly but I think some things are missing.

Attachment 461486 [details] [diff] seems to be missing stuff. It only includes 'toolkit' but should (I think) also include configure.in changes and everything in 'config' at least. As timeless points out in comment 190 you might want to try khuey or mfield as reviewers.

Attachment 461488 [details] [diff] also seems to be missing stuff. Where are the changes related to mDOMPaintEventPending that are in attachment 461162 [details] [diff] [review]?

For attachment 461487 [details] [diff] [review] it shouldn't include the changes to gfx/ycbcr. As mentioned in comment 186 I think the gfx/ycbcr changes need to be spun into a different bug, which this one depends on, and it reviewed separately. This will facilitate upstreaming code, making required changes to the README, getting it landed quickly so it's not delayed by the changes that may need to occur in the gstreamer code.

Once those things are factored out, the remaining changes in one single patch would make for a smooth review if possible (although I will continue to review on Monday the content/media/gstreamer code without this).
(In reply to comment #195)
> Thanks for splitting the stuff out so quickly but I think some things are
> missing.
> 
> Attachment 461486 [details] [diff] seem to be missing stuff. It only includes 'toolkit' but
> should (I think) also include configure.in changes and everything in 'config'
> at least. As timeless points out in comment 190 you might want to try khuey or
> mfield as reviewers.
> 
> Attachment 461488 [details] [diff] also seems to be missing stuff. Where are the changes related
> to mDOMPaintEventPending that are in attachment 461162 [details] [diff] [review]?
attachment 461162 [details] [diff] [review] is the original mozilla patch, and I guess it has already been reviewed, so I kept this patch as is.

I separated only changes made by me, i.e attachment 461163 [details] [diff] [review] & after

If you suggest to split-up attachment 461162 [details] [diff] [review], I'll do it.
> 
> For attachment 461487 [details] [diff] [review] it shouldn't include the changes to gfx/ycbcr. As
> mentioned in comment 186 I think the gfx/ycbcr changes need to be spun into a
> different bug, which this one depends on, and it reviewed separately. This will
> facilitate upstreaming code, making required changes to the README, getting it
> landed quickly so it's not delayed by the changes that may need to occur in the
> gstreamer code.
> 
> Once those things are factored out, the remaining changes in one single patch
> would make for a smooth review if possible (although I will continue to review
> on Monday the content/media/gstreamer code without this).
(In reply to comment #196)
> I separated only changes made by me, i.e attachment 461163 [details] [diff] [review] & after
> If you suggest to split-up attachment 461162 [details] [diff] [review] [details], I'll do it.

Yes please split up the original as suggested. I'm making the assumption that you're taking over the development of the patch and will be the one addressing the review comments - is that the case? If so you should make yourself the assignee as well. 

If that's not the case, I can't really just review only the changes you've made and not review the original. It hasn't been reviewed as far as I know since I last reviewed it a year ago and changes have been made since then.
(In reply to comment #196)
> (In reply to comment #195)
> > Thanks for splitting the stuff out so quickly but I think some things are
> > missing.
> > 
> > Attachment 461486 [details] [diff] [details] seem to be missing stuff. It only includes 'toolkit' but
> > should (I think) also include configure.in changes and everything in 'config'
> > at least. As timeless points out in comment 190 you might want to try khuey or
> > mfield as reviewers.
> > 
> > Attachment 461488 [details] [diff] [details] also seems to be missing stuff. Where are the changes related
> > to mDOMPaintEventPending that are in attachment 461162 [details] [diff] [review] [details]?
> attachment 461162 [details] [diff] [review] [details] is the original mozilla patch, & I guess, it has
> been already reviewed, so I kept this patch as is.

oh... then you have some work to do...  iirc the patch is mid-way in a rework,
see comment 143
(In reply to comment #197)
> (In reply to comment #196)
> > I separated only changes made by me, i.e attachment 461163 [details] [diff] [review] & after
> > If you suggest to split-up attachment 461162 [details] [diff] [review] [details] [details], I'll do it.
> 
> Yes please split up the original as suggested. I'm making the assumption that
> you're taking over the development of the patch and will be the one addressing
> the review comments - is that the case? If so you should make yourself the
> assignee as well. 
> 
> If that's not the case, I can't really just review only the changes you've made
> and not review the original. It hasn't been reviewed as far as I know since I
> last reviewed it a year ago and changes have been made since then.
Okay, I'll split up the original patch & I'll raise a new review request
Comment on attachment 461488 [details] [diff] [review]
Press context changes for gstreamer backend support

Wrap at column 80
Attachment #461488 - Flags: review?(roc) → review+
+  if (16 == DefaultXDisplayDepth())
+    mImageFormat = gfxASurface::ImageFormatRGB16_565;

{}

+#endif
+
+  size_t size = 0;

Don't initialize it; better to get compiler warnings when we fail to set it.

+  if (mImageFormat == gfxASurface::ImageFormatRGB16_565) {
+    size = aData.mPicSize.width*aData.mPicSize.height*2;
+  }
+  else if (mImageFormat == gfxASurface::ImageFormatRGB24) {

} else if {

+    size = aData.mPicSize.width*aData.mPicSize.height*4;
+  }

Add an "else { NS_ERROR(); return; }

+  if (mImageFormat == gfxASurface::ImageFormatRGB24)
+    gfx::ConvertYCbCrToRGB32(aData.mYChannel,

{} here too

+#ifdef __arm__
+  if (mImageFormat == gfxASurface::ImageFormatRGB16_565)
+    gfx::ConvertYCbCrToRGB565(aData.mYChannel,

I think we should instead have a #define in yuv_convert.h that simply signals whether ConvertYCbCrToRGB565 is supported. Then here we can test for that flag instead of __arm__ and convert on any platform where ConvertYCbCrToRGB565 might be present.

+// Convert a frame of YUV to 16 bit RGB565.

Probably need to document exactly what flavour of RGB565 is output here.
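Something along these lines, as a sketch (the macro name is made up):

  // yuv_convert.h
  #if defined(__arm__)
  #define HAVE_CONVERT_YCBCR_TO_RGB565 1
  #endif

  // caller
  #ifdef HAVE_CONVERT_YCBCR_TO_RGB565
    if (mImageFormat == gfxASurface::ImageFormatRGB16_565) {
      gfx::ConvertYCbCrToRGB565(aData.mYChannel, /* ... */);
    }
  #endif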
Attachment #461162 - Attachment is obsolete: true
Attachment #461162 - Flags: review?(chris.double)
Attachment #461162 - Attachment is obsolete: false
Attachment #461163 - Attachment is obsolete: true
Attachment #461163 - Flags: review?(chris.double)
Attachment #461486 - Attachment is obsolete: true
Attachment #462044 - Flags: review?(ted.mielczarek)
Attachment #461486 - Flags: review?(ted.mielczarek)
Attachment #461488 - Attachment is obsolete: true
Attachment #462047 - Flags: review?(roc)
Attachment #461162 - Attachment is obsolete: true
Attachment #461165 - Attachment is obsolete: true
Attachment #461166 - Attachment is obsolete: true
Attachment #461168 - Attachment is obsolete: true
Attachment #461489 - Attachment is obsolete: true
Attachment #462048 - Flags: review?(chris.double)
Attachment #461165 - Flags: review?(chris.double)
Attachment #461166 - Flags: review?(chris.double)
Attachment #461168 - Flags: review?(chris.double)
Attachment #461489 - Flags: review?(chris.double)
Now there are 4 patches
1. build patch
2. press context patch
3. gfx/layers patch
4. gstreamer alone patch

I've split up the original patch and merged it with the other patches [seek/OpenGL/animation issues] for review purposes
It is great to see so much work on this. Will this GStreamer back end be built into the version of Firefox compiled for the Windows platform?
Probably not. But, it probably will be permissible for Linux distros to enable this build option for the versions of Firefox that will come preloaded in their distros.
(In reply to comment #207)
> Probably not. But, it probably will be permissible for Linux distros to enable
> this build option for the versions of Firefox that will come preloaded in their
> distros.

So this will be included in the mainline release sources and just not enabled by default or will it remain as something that needs to be patched in?
(In reply to comment #207)
> Probably not. But, it probably will be permissible for Linux distros to enable
> this build option for the versions of Firefox that will come preloaded in their
> distros.

That would be up to the agreement set forth between Mozilla and the various distros that ship "Firefox".
Why are these prescontext changes needed for the GStreamer backend? This doesn't seem to have anything to do with GStreamer or video.
Yes, I've tested without these changes. It works fine
I think we can ignore these changes.
Note: I've taken these changes from the attachment 427098 [details] [diff] [review]. I fixed only compilation error.
(In reply to comment #210)
> Why are these prescontext changes needed for the GStreamer backend? This
> doesn't seem to have anything to do with GStreamer or video.
Bug 451674 also contains configure changes for gstreamer in a patch requesting review.
Blocks: 586598
Attachment #465180 - Flags: review?(chris.double)
Blocks: 583135
To keep you up to date with review progress - I'm currently busy with blocking Firefox 4 bugs, and am away most of next week. I'll hopefully be able to get to the review the following week depending on FF 4 workload. If it's more urgent you might need to consider an alternative reviewer.
Good to know. Who could do the review instead?
Assignee: nobody → kishore.arepalli
Comment on attachment 462044 [details] [diff] [review]
build changes for gstreamer backend [html5] support

>diff -r a4d86c3a3494 configure.in
>--- a/configure.in
>+++ b/configure.in
>+dnl ========================================================
>+dnl = Enable GStreamer
>+dnl ========================================================
>+MOZ_ARG_ENABLE_BOOL(gstreamer,
>+[  --enable-gstreamer           Enable GStreamer support],
>+MOZ_GSTREAMER=1,
>+MOZ_GSTREAMER=)
>+
>+PKG_CHECK_MODULES(GSTREAMER,
>+                  gstreamer-base-0.10 gstreamer-0.10 gstreamer-plugins-base-0.10)
>+if test -n "$GSTREAMER_LIBS"; then
>+   _SAVE_LDFLAGS=$LDFLAGS
>+   LDFLAGS="$LDFLAGS -lgstvideo-0.10"

Why is this library name hardcoded in here? Can't you pick it up from pkg-config somehow?

>+   AC_TRY_LINK(,[return 0;],_HAVE_LIBGSTVIDEO=1,_HAVE_LIBGSTVIDEO=)

Can you format this a little more nicely? Some extra newlines wouldn't hurt.

>+   if test -n "$_HAVE_LIBGSTVIDEO" ; then
>+      GSTREAMER_LIBS="$GSTREAMER_LIBS -lgstvideo-0.10"
>+   else
>+      AC_MSG_ERROR([gstreamer video backend requires libgstvideo])

We like our configure error messages now to be more useful. You should give:
* What configure options can be changed to work around this (building without --enable-gstreamer in this case)
* A link to a wiki page on developer.mozilla.org describing how to install the needed libraries if they're missing. You should give the package names for Ubuntu and Fedora, at a minimum.

>+else
>+   AC_MSG_ERROR([gstreamer backend requires the gstreamer packages])

Same here.

>diff -r a4d86c3a3494 layout/build/Makefile.in
>--- a/layout/build/Makefile.in
>+++ b/layout/build/Makefile.in
>@@ -182,16 +182,22 @@ SHARED_LIBRARY_LIBS 	+= \
> endif
> 
> ifdef MOZ_WAVE
> SHARED_LIBRARY_LIBS 	+= \
> 	$(DEPTH)/content/media/wave/$(LIB_PREFIX)gkconwave_s.$(LIB_SUFFIX) \
> 	$(NULL)
> endif
> 
>+ifdef MOZ_GSTREAMER
>+SHARED_LIBRARY_LIBS 	+= \
>+	$(DEPTH)/content/media/gstreamer/$(LIB_PREFIX)gkcongstreamer_s.$(LIB_SUFFIX) \
>+	$(NULL)
>+endif

I know you're just copying from within this file, but our current Makefile style rules forbid using tabs where they're not necessary. Please use a two-space indent on the line continuations here.
Attachment #462044 - Flags: review?(ted.mielczarek) → review-
(In reply to comment #216)
> Comment on attachment 462044 [details] [diff] [review]
> build changes for gstreamer backend [html5] support
> 
> >diff -r a4d86c3a3494 configure.in
> >--- a/configure.in
> >+++ b/configure.in
> >+dnl ========================================================
> >+dnl = Enable GStreamer
> >+dnl ========================================================
> >+MOZ_ARG_ENABLE_BOOL(gstreamer,
> >+[  --enable-gstreamer           Enable GStreamer support],
> >+MOZ_GSTREAMER=1,
> >+MOZ_GSTREAMER=)
> >+
> >+PKG_CHECK_MODULES(GSTREAMER,
> >+                  gstreamer-base-0.10 gstreamer-0.10 gstreamer-plugins-base-0.10)
> >+if test -n "$GSTREAMER_LIBS"; then
> >+   _SAVE_LDFLAGS=$LDFLAGS
> >+   LDFLAGS="$LDFLAGS -lgstvideo-0.10"
> 
> Why is this library name hardcoded in here? Can't you pick it up from
> pkg-config somehow?
> 
Nope, this is normal and also recommended in the gstreamer documentation. It avoids linking to all helper libraries in the package.
Attached patch gst build changes (obsolete) — Splinter Review
Attached patch gstreamer media decoder module (obsolete) — Splinter Review
Comment on attachment 461487 [details] [diff] [review]
gfx/layers changes for arm

See comment #201
Attachment #461487 - Flags: review?(roc) → review-
Comment on attachment 465180 [details] [diff] [review]
finding supported gstreamer codec/demuxer plugins

>+    if (nsnull == supportedCodecs) {
>+#ifdef MOZ_GSTREAMER
>+      NS_ConvertUTF16toUTF8 CodecTypeUTF8(token);
>+      if (!IsGstSupportedType(CodecTypeUTF8.get()))
>+        return CANPLAY_NO;
>+#endif
>+    } else {
>+      if (!CodecListContains(supportedCodecs, token)) {
>+        // Totally unsupported codec
>+        return CANPLAY_NO;
>+      }

If MOZ_GSTREAMER is not defined then we have an empty if condition and only the else branch ever does anything. Can you restructure this to avoid that?
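One possible restructuring (sketch; I'm assuming the behaviour in non-MOZ_GSTREAMER builds should stay exactly as it is now):

  #ifdef MOZ_GSTREAMER
    if (nsnull == supportedCodecs) {
      NS_ConvertUTF16toUTF8 CodecTypeUTF8(token);
      if (!IsGstSupportedType(CodecTypeUTF8.get()))
        return CANPLAY_NO;
    }
  #endif
    if (supportedCodecs && !CodecListContains(supportedCodecs, token)) {
      // Totally unsupported codec
      return CANPLAY_NO;
    }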

>+ * The Initial Developer of the Original Code is the Mozilla Corporation.

Apparently this should be 'Mozilla Foundation' not 'Mozilla Corporation'. I realize lots of existing media code probably still has Corporation sadly. You should take the header from the official boilerplate rather than copying it from other files. See here:

http://weblogs.mozillazine.org/gerv/archives/2010/02/mpl_initial_developer_for_mozilla_employ.html

>+ * Portions created by the Initial Developer are Copyright (C) 2007

Change year from 2007 to 2010.

>+struct _GstCodecDetails {
>+ char klass[255]; // video/audio/demux
>+ char feature[25]; //mimetype
>+};

Use 2 space indents.

>+  if (g_strrstr (gst_element_factory_get_klass (f), ((GstCodecDetails *)data)->klass)) {

Use C++ style casts. static_cast, reinterpret_cast, etc. This applies everywhere you are doing casts.

>+    GstStaticPadTemplate *pad_template;
>+    GList *template_list = f->staticpadtemplates;
>+    for (pad_template = (GstStaticPadTemplate *) template_list->data;
>+         pad_template;
>+         pad_template = (template_list = template_list->next) ? (GstStaticPadTemplate *) template_list->data : NULL) {
>+      if (g_strrstr(pad_template->name_template, "sink") && g_strrstr(pad_template->static_caps.string, ((GstCodecDetails *)data)->feature)) {
>+        NS_GSTREAMER_LOG(("caps:%s, feature:%s\n", pad_template->static_caps.string,  GST_PLUGIN_FEATURE_NAME(feature)));
>+        return TRUE;
>+      }
>+    }
>+  }
>+  return FALSE;
>+}

Restructure code so it fits within 80 characters.

>+  /* Initialisation */
>+  gst_init_check(0, 0, NULL);

The docs for gst_init_check say:

  "This function should be called before calling any other GLib functions. "

How are you ensuring this is the case? Also, check return value.
Attachment #465180 - Flags: review?(chris.double) → review-
Comment on attachment 462048 [details] [diff] [review]
gstreamer changes alone in single patch

>+# The Initial Developer of the Original Code is the Mozilla Corporation.
>+# Portions created by the Initial Developer are Copyright (C) 2007

Initial developer should be Mozilla Foundation and update copyright to 2010. This change should be done to all the license headers included in the patch.

>+// Define the max size of video elements in pixels
>+// Copied from the OGG decoder (Jan 29 - 2010)
>+#define MAX_VIDEO_WIDTH  4000
>+#define MAX_VIDEO_HEIGHT 3000

This should probably be declared somewhere in content/media so it can be shared between the decoders.

>+// Define how much data we should try to read at a time,
>+// changing the value can potentially decrease or increase performance.
>+// Feel free to experiment
>+#define GST_BUFFER_READ_CHUNK_SIZE 1024*16

Should have brackets: (1024*16)

>+// Check that we are compatible with the GStreamer version (we wish to support
>+// the build system for the N800, but can't actually run on something older
>+// than 0.10.25 (as in the Nokia N900)
>+#if 10 > GST_VERSION_MINOR || (22 > GST_VERSION_MICRO && 10 == GST_VERSION_MINOR)
>+#error We do not support versions of the GStreamer library previous to 0.10.22 \
>+  while building, and previous to 0.10.25 when running.
>+#endif

Is this checked in 'configure' as well?

>+  mUnknownTypeSignal = g_signal_connect( mDecodeBin,
>+                                         "unknown-type",
>+                                         G_CALLBACK(DecodeBinUnknownType),
>+                                         this);

Remove spacing after the bracket and first function parameter. This happens in another place too.

>+void nsGStreamerDecodeBin::DecodeBinNewPadAdded(void *,
>+                                                GstPad * aPad,
>+                                                gboolean aIsLast,
>+                                                nsGStreamerDecodeBin * aMe)

Align the '*' immediately after the type name. ie. no spacing:

GstPad* aPad,
nsGStreamerDecodeBin* aMe)

Make this change everywhere this appears.

>+{
>+  // Check pre-conditions
>+  PR_ASSERT(aMe);
>+  PR_ASSERT(aMe->mPadDest);

Why use PR_ASSERT instead of NS_ equivalents?

>+NS_IMETHODIMP nsGStreamerDecoder::Observe(nsISupports *aSubjet,
>+                                      const char *aTopic,
>+                                      const PRUnichar *someData)

Align arguments.

>+  // Forwarder function to enable access to a protected member of the base class
>+  void SetVideoData(PRInt32 aWidth,
>+                    PRInt32 aHeight,
>+                    float aFramerate,
>+                    float aAspectRatio,
>+                    Image *aImage)
>+                    {
>+                      gfxIntSize aSize;
>+                      aSize.width=aWidth;
>+                      aSize.height=aHeight;
>+
>+                      nsMediaDecoder::SetVideoData(aSize,
>+                                                   aAspectRatio,
>+                                                   aImage);
>+                    };

Drop { .. } to align with 'void'.

>+  // Define cleanup macro
>+#define NSGSTREAMER_INPUT_BIN_INIT_RETURN_FAILURE(arg) \
>+  NS_GSTREAMER_LOG(("Failure due to : " arg )); \
>+  Shutdown(); \
>+  return PR_FALSE;

I'm not a fan of declaring macros inside functions since their scope is global. Put this at the top of the file in global scope. It can have a shorter name too.

>+  // Retrieve the src pad of the app source element
>+  GstPad *pad = gst_element_get_static_pad(mAppSrc, "src");
>+
>+  if(!pad) {
>+    NSGSTREAMER_INPUT_BIN_INIT_RETURN_FAILURE("Couldn't get static pad from "
>+                                              "mAppSrc element\n");
>+  }
>+
>+  // Create a ghost pad
>+  GstPad *ghostPad = gst_ghost_pad_new("src", pad);
>+
>+  if(!ghostPad) {
>+    NSGSTREAMER_INPUT_BIN_INIT_RETURN_FAILURE("Couldn't create ghost pad\n");
>+  }

The 'pad' needs to be unrefed here since the macro results in a return.

>+  // Add the ghost pad to the input bin
>+  if(!gst_element_add_pad(GST_ELEMENT_CAST(mInputBin), ghostPad)) {
>+    NSGSTREAMER_INPUT_BIN_INIT_RETURN_FAILURE("Couldn't attach pad to bin");
>+  }

Does the ghostPad need to be unrefed here?

>+  // Obtain lock
>+  PR_Lock(mElementWrapperLock);
>+  // Store the element wrapper
>+  mElementWrapper = aElementWrapper;
>+  // Release lock
>+  PR_Unlock(mElementWrapperLock);

For this and all other uses of PR_Lock/PR_Unlock, use nsAutoLock.
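For example (sketch):

  {
    nsAutoLock lock(mElementWrapperLock);
    // Store the element wrapper
    mElementWrapper = aElementWrapper;
  } // lock released automatically here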

>+    // The data source data can be read from
>+    nsMediaStream *mNsMediaStream;

No need for the 'Ns' in the name. Just call it mMediaStream or something like that.

>+    // TODO: Add some thread protection to the below two memebers

Spelling of 'members'

>+    // See if we can initialize the lib
>+    if (gst_init_check(0, 0, &gstreamerErrorText)) {

gst_init_check has this requirement:

 "This function should be called before calling any other GLib functions"

How are you managing this?

>+      if (0 == major && 10 == minor && 25 <= micro) {

Throughout the GStreamer code checks are done this way, with the value on the left hand side of the boolean operator. None of the other media code does it this way and it's jarring to move between the two different styles. I'd prefer it if you changed all the checks to match existing usage.

>+  if (!mPipeline) {
>+    NS_WARNING("Function called without a sucessfull init first");
>+    return NS_ERROR_UNEXPECTED;
>+  }

Can this (and everywhere else it is repeated) just be an NS_WARN_IF or similar macro?

>+// Dump pipeline to .dot file
>+//  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(GST_BIN(static_cast<GstElement*>(*mPipeline)), GST_DEBUG_GRAPH_SHOW_STATES, 

"PipelineOutput");
>+// Dump pipeline to stdout
>+  NS_GSTREAMER_LOG(("The pipeline:\n"));
>+  if (getenv("GST_PIPE"))
>+    DumpGStreamerElement(static_cast<GstElement*>(*mPipeline));

Make this included in a debug only build

>+// TODO:  Remove the below global variables, they are for debug only
>+int sSetRGBCalled = 0;
>+int sDraws = 0;

Define for debug mode only?

>+class nsGStreamerPipeline :
>+  private nsGStreamerDecodeBinPadInterface,
>+  public nsISupports
>+{
> ...
>+    // Get the lenght of the stream (if available)
>+    float GetDuration();

length in comment spelt wrong.

>+
>+  // Pass the data on
>+  aMe->mVideoSinkInterface->SetVideoData(
>+      GST_VIDEO_SINK_WIDTH(aMe),
>+      GST_VIDEO_SINK_HEIGHT(aMe),
>+      aMe->mFpsN / aMe->mFpsD,
>+      1.0,
>+      GST_BUFFER_DATA(aBuffer));

Can aMe->mFpsD be zero here? If so, need to guard against divide by zero.
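A guard along these lines would do (sketch; whatever fallback framerate makes sense is up to you):

  float framerate = aMe->mFpsD ? float(aMe->mFpsN) / aMe->mFpsD : 0.0f;
  aMe->mVideoSinkInterface->SetVideoData(GST_VIDEO_SINK_WIDTH(aMe),
                                         GST_VIDEO_SINK_HEIGHT(aMe),
                                         framerate,
                                         1.0,
                                         GST_BUFFER_DATA(aBuffer));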

>+  if (!gst_structure_get_int(structure, "width", &width) ||
>+      !gst_structure_get_int(structure, "height", &height)) {
>+    GST_WARNING_OBJECT(fennecvideosink,
>+                       "invalid caps for buffer allocation %"
>+                       GST_PTR_FORMAT,
>+                       aCaps);
>+    return GST_FLOW_NOT_NEGOTIATED;
>+  }
>+
>+
>+  // Allocate a buffer with new
>+  PRUint8 *newBuffer = new PRUint8 [aSize];

Is there a need for a safety check here to ensure that very large videos don't cause OOM? Or is that already handled 
earlier by a caller?
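If it isn't handled elsewhere, a cheap sanity check before the allocation would do (sketch, reusing the limits mentioned above; where it really belongs is up to you):

  if (width <= 0 || height <= 0 ||
      width > MAX_VIDEO_WIDTH || height > MAX_VIDEO_HEIGHT) {
    return GST_FLOW_ERROR;
  }
  // new (std::nothrow) needs <new>
  PRUint8* newBuffer = new (std::nothrow) PRUint8[aSize];
  if (!newBuffer) {
    return GST_FLOW_ERROR;
  }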

Have you run the video mochitests? I get crashes when building with only the gstreamer backend and running the tests.

Since this patch was originally written the media code was refactored to make writing backends easier. In particular generic code exists to handle raising the correct events at the right times. It would be a lot of work but you might like to consider refactoring on top of that code. See the Raw and WebM backends for examples. This doesn't need to be done to get review or landed I just thought I'd mention it.
Attachment #462048 - Flags: review?(chris.double) → review-
>+NS_IMETHODIMP nsGStreamerDecoder::Observe(nsISupports *aSubjet,

please spell Subject correctly

>+                                      const PRUnichar *someData)

this should be aData - and this convention should be followed for all arguments to functions...
Attached patch Latest Gstreamer patch (obsolete) — Splinter Review
Latest gstreamer patch from maemo6 patch queue.
This probably still has some unaddressed review comments, but I'll post it here anyway: compatible with http://hg.mozilla.org/mozilla-central/rev/7fcae0c7f36a

I see a couple of useful points here:
1) the possibility to integrate gstreamer codecs (DSP-optimized webm/ogg)
2) useful for some XULRunner media applications, in order to play all available gstreamer codecs.
Attachment #455627 - Attachment is obsolete: true
Attachment #461487 - Attachment is obsolete: true
Attachment #462044 - Attachment is obsolete: true
Attachment #462047 - Attachment is obsolete: true
Attachment #462048 - Attachment is obsolete: true
Attachment #465180 - Attachment is obsolete: true
Attachment #479361 - Attachment is obsolete: true
Attachment #479363 - Attachment is obsolete: true
Attachment #479364 - Attachment is obsolete: true
Attachment #479365 - Attachment is obsolete: true
Comment on attachment 503489 [details] [diff] [review]
Latest Gstreamer patch

+static GstFlowReturn gst_fennecvideosink_buffer_alloc(GstBaseSink* aBsink,
+{
+  // Allocate a buffer with new
+  void *newBuffer;

+  // Allocating 128 byte aligned memory.
+  if (posix_memalign(&newBuffer, 128, ROUND_UP(aSize, 128)) != 0)
+    newBuffer = NULL;

+  if (!*aBuf) {
+    // Release the mem we got
+    delete [] newBuffer;

coverity notes that delete[] is wrong for void*. your allocator here is posix_memalign, so the release should be free(), not c++ delete
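i.e. something like (sketch):

  void* newBuffer = nsnull;
  if (posix_memalign(&newBuffer, 128, ROUND_UP(aSize, 128)) != 0)
    newBuffer = nsnull;
  ...
  if (!*aBuf) {
    // memory from posix_memalign is released with free()
    free(newBuffer);
  }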
Attachment #503489 - Flags: review-
(In reply to comment #227)
> Comment on attachment 503489 [details] [diff] [review]
> Latest Gstreamer patch
> 
> +static GstFlowReturn gst_fennecvideosink_buffer_alloc(GstBaseSink* aBsink,
> +{
> +  // Allocate a buffer with new
> +  void *newBuffer;
> 
> +  // Allocating 128 byte aligned memory.
> +  if (posix_memalign(&newBuffer, 128, ROUND_UP(aSize, 128)) != 0)
> +    newBuffer = NULL;
> 
> +  if (!*aBuf) {
> +    // Release the mem we got
> +    delete [] newBuffer;
> 
> coverity notes that delete (array) is wrong for void*. your allocator here is
> posix memalign, so the release should be posix-something not c++ delete

For performance reasons it would be really good if the sink could manage a list of buffers and recycle them. Ideally the memory would point to a location that fennec renders from directly, so that no extra copy or other data transfer would be needed.
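A very rough sketch of what the recycling could look like (class and method names are made up, locking omitted; needs <vector> and <stdlib.h>):

  class RecycledBufferPool {
  public:
    explicit RecycledBufferPool(size_t aBufferSize) : mBufferSize(aBufferSize) {}
    ~RecycledBufferPool() {
      while (!mFree.empty()) {
        free(mFree.back());
        mFree.pop_back();
      }
    }
    void* Acquire() {
      if (!mFree.empty()) {
        void* buf = mFree.back();
        mFree.pop_back();
        return buf;
      }
      void* buf = NULL;
      if (posix_memalign(&buf, 128, mBufferSize) != 0)
        return NULL;
      return buf;
    }
    void Recycle(void* aBuf) { mFree.push_back(aBuf); }
  private:
    size_t mBufferSize;
    std::vector<void*> mFree;
  };

Ideally Acquire() would hand back buffers the compositor can scan out of directly, so the sink never has to copy the decoded frame.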
Would it be reasonable to add GStreamer support as a non-default compile time option for desktop FF releases?
(In reply to comment #229)
> Would it be reasonable to add GStreamer support as a non-default compile time
> option for desktop FF releases?
I would like to have it too; it may bring the possibility of creating DSP-optimized codec support and make the xulrunner codebase usable for non-FF applications like songbird or similar
(In reply to comment #229)
> Would it be reasonable to add GStreamer support as a non-default compile time
> option for desktop FF releases?

This is what I hope will be the result of this bug. GStreamer support available as a compile time option, turned off by default.
If people don't mind, I'd like to request that this bug be limited to technical comments on the current work and not political comments or requests for features.

If people have questions beyond the actual technical code, I respectfully request that they contact people via email, irc, or some other means.

Note: Stefan's comments are on topic since they're technical review points. The next four comments (including this one) are off-topic.
Attached patch Latest tip 49884897bb5c patch (obsolete) — Splinter Review
Attachment #503489 - Attachment is obsolete: true
Just checking on the status of this bug.  The last patch review seems to have occurred about 11 months ago, and the latest patch appeared 4 months ago.  What issues remain that prevent merging the current patch?
I'd like to have this, I think GStreamer would bring two good things to Firefox: FFmpeg and PulseAudio, which is nice for HTML5 video, etc.

e.g.

Firefox -> GStreamer -> FFmpeg
Firefox -> GStreamer -> PulseAudio

Can't wait.

Is there a browser that already have GStreamer support?
(In reply to Diego Viola from comment #235)
> Is there a browser that already have GStreamer support?

Browsers that use webkit-gtk (epiphany, midori, etc) use GStreamer for <audio> and <video>.

PS: this subject is off-topic for this bug, so let us not submit any more comments about this here.
When is this going to be landed anywhere? A very large piece of work has been done, and Firefox would gain a lot of cool multimedia features (choosing the audio back-end without recompilation, hardware video acceleration through the experimental va-api gstreamer modules, ffmpeg support, etc.), at least on Linux with gstreamer installed, if you landed it. Do you still plan to add a compile-time option?
I completely agree. I'm aware of all the comments telling us to "stay on topic" but at this point, that amounts to telling us to "be content with the fact that a perfectly good patch is collecting dust." If the patch is still not ready, please give Oleg a list of ATTAINABLE GOALS explaining how it needs to be improved. Since months have gone by without this being posted, I can only assume that the developers with commit access are either ignoring this patch or discussing it behind closed doors.

The longer they wait to add this option and make it enabled by default, the more Firefox will lag behind the newer browsers.
Bug 714408 adds support for a media plugin framework for HTML video. Whilst it is b2g specific at the moment it'd be interesting to refactor the gstreamer support into that framework so that it'd provide dynamic loading of gstreamer support if it was available.

This would also involve refactoring to fit in with the 'builtin decoder' changes that were made when WebM was supported and cut down on the amount of code in the patch itself since many things like event handling would be provided by the builtin code.
(In reply to Connor Behan from comment #238)
> If the patch is still
> not ready, please give Oleg a list of ATTAINABLE GOALS explaining how it
> needs to be improved. Since months have gone by without this being posted, I
> can only assume that the developers with commit access are either ignoring
> this patch or discussing it behind closed doors.

No one has asked for review on the patch recently which is probably why it hasn't had any feedback since the last review cycle. Running the media mochitests with this patch applied will also give an indication of how close it is to being complete (I have not done this recently).
My opinion is that we shouldn't land these patches at this time.

This is essentially for policy reasons:

Running codecs beyond the minimal set we support presents a larger surface to security review.

Running codecs which other browsers don't support doesn't help web authors. Our goal is not to inspire web pages 'Best viewed in Firefox'. Quite the opposite. So adding the wide range of formats gstreamer offers doesn't advance the web as a platform.

These are the main reasons we haven't added support for codecs like speex, flac, opus, jpeg2000, jpeg-xr, webp, etc. We have a wav reader in mozilla-central for debugging, but it's disabled by default.

We have for some years been holding the line for royalty-free codecs in HTML against very strong commercial and market pressure, because the patent terms available for otherwise popular formats like mp3 and mp4 are incompatible with our principles of user freedom. Requiring an alternative while flash adoption is falling is an important check against those commercial interests. Hooking into platform-level support for those codecs would greatly weaken that stance.

I appreciate the work that's been done on these patches, but unlike doublec I don't think the code should be in the tree, even disabled by default. I think the bug should stay open in case we are forced to reverse our position on mp4, in which case we'll probably want this code in a hurry.
This patch was hacked on by 3-4 different people, and it is a bit outdated.
The right way to add gstreamer support is an open interface which would allow the use of external codecs (HW accelerated, gstreamer, etc.) so the community could maintain media plugins separately, and IIUC bug 714408 implements exactly that interface.
> think the bug should stay open in case we are forced to reverse our position on mp4

Support for random proprietary codecs and unwanted formats like mp4 is only one side of the problem.
The most important reason why we should have a media plugin interface is that different mobile platforms might have HW-accelerated decoding for WebM and Ogg, and the only way to use those accelerated codec implementations is through a media interface.
Kinetik pointed out the wav decoder is actually built by default, there's just no media document for it. It works as a source for an <audio> element. It's the support for 'raw' video data which is disabled by default.
(In reply to Ralph Giles (:rillian) from comment #241)
> Running codecs which other browsers don't support doesn't help web authors.
> Our goal is not to inspire web pages 'Best viewed in Firefox'. Quite the
> opposite. So adding the wide range of formats gstreamer offers doesn't
> advance the web as a platform.

But these are formats that other browsers DO support. It's the main reason that Firefox has been losing so much ground to Chrome. Users aren't happily taking up your mission of resisting H.264 and most of them don't even know why H.264 videos are failing to play for them. I've heard many people shrug it off as "firefox is buggy with video, use chrome."

> We have for some years been holding the line against for royalty-free codecs
> in HTML against very strong commercial and market pressure, because the
> patent terms available for otherwise popular formats like mp3 and mp4 are
> incompatible with our principles of user freedom.

And linking to Flash in a plugin suggestion box isn't? The whole point of this patch is that it will absolve Firefox of the dilemma to support non-free codecs. If you already had your mind made up that you would not support extra codecs even in this indirect way, I think that is unfair to the people that have worked **** this patch, probably because they thought it had a chance of being accepted before the mp4 patent expires.
(In reply to Ralph Giles (:rillian) from comment #241)
> My opinion is that we shouldn't land these patches at this time.

I think your main points are:
1) Holding the line for royalty-free codecs
2) Security issues

I'm not sure whether using gstreamer would HAVE to result in mp4 support. You could probably limit gstreamer to specific container/codec combinations on purpose. Also, including another library always brings security risks.

However, I'm not convinced that Firefox does the best it can to play back videos. Especially on older systems, I expected HTML5 to be faster than Flash. It turns out HTML5 on youtube is worse than Flash - at least in Firefox. Firefox uses slightly more CPU and the X server process consumes more than twice as much CPU as when using Flash. Using a native player consumes less than half the CPU time flash needs, and of course much less than the HTML5 player on youtube.

As far as I understand, it was also raised in this thread that using gstreamer can have advantages in those areas as well (vaapi, vdpau, opengl?). I leave it up to you, whether improvements in these areas are to be achieved with or without gstreamer.
(In reply to Oleg Romashin (:romaxa) from comment #242)
> This patch was hacked by 3-4 different people. and it is a bit outdated.
> Right way to add gstreamer support is open interface which would allow to
> use external codecs (HW accelerated, gstreamer et.c.) so community could
> maintain media plugins separately. and IIUC Bug 714408 exactly implementing
> that interface.

The gstreamer patch is only "outdated" because Mozilla let it rot in the bugtracker rather than committing it a year ago. Bug 714408 may have a better API but it is entirely unclear if anyone will take on the task of porting it a non-mobile platform in the foreseeable future. And when someone does it will probably face the same political barriers that are stopping the patch from being ready now.
(In reply to Connor Behan from comment #245)
> (In reply to Ralph Giles (:rillian) from comment #241)
> > Running codecs which other browsers don't support doesn't help web authors.
> > Our goal is not to inspire web pages 'Best viewed in Firefox'. Quite the
> > opposite. So adding the wide range of formats gstreamer offers doesn't
> > advance the web as a platform.
> 
> But these are formats that other browsers DO support. It's the main reason
> that Firefox has been losing so much ground to Chrome. Users aren't happily
> taking up your mission of resisting H.264 and most of them don't even know
> why H.264 videos are failing to play for them. I've heard many people shrug
> it off as "firefox is buggy with video, use chrome."

http://blog.chromium.org/2011/01/html-video-codec-support-in-chrome.html
(In reply to Josh Triplett from comment #248)
> (In reply to Connor Behan from comment #245)
> > (In reply to Ralph Giles (:rillian) from comment #241)
> > > Running codecs which other browsers don't support doesn't help web authors.
> > > Our goal is not to inspire web pages 'Best viewed in Firefox'. Quite the
> > > opposite. So adding the wide range of formats gstreamer offers doesn't
> > > advance the web as a platform.
> > 
> > But these are formats that other browsers DO support. It's the main reason
> > that Firefox has been losing so much ground to Chrome. Users aren't happily
> > taking up your mission of resisting H.264 and most of them don't even know
> > why H.264 videos are failing to play for them. I've heard many people shrug
> > it off as "firefox is buggy with video, use chrome."
> 
> http://blog.chromium.org/2011/01/html-video-codec-support-in-chrome.html

Ok, I stand corrected.

Nevertheless, a chromium user who is serious about HTML5 video can download chromium-codecs-ffmpeg from a (most likely unofficial) repository. A firefox user who wants the same thing has to download gstreamer-ffmpeg from an unofficial repository and recompile a patched firefox. So ground is probably still being lost.
(In reply to Ralph Giles (:rillian) from comment #241)
> My opinion is that we shouldn't land these patches at this time.
> 
> This is essentially for policy reasons:
[..]
> We have for some years been holding the line against for royalty-free codecs
> in HTML against very strong commercial and market pressure, because the
> patent terms available for otherwise popular formats like mp3 and mp4 are
> incompatible with our principles of user freedom. 

Hasn't it occurred to you that people would like to make their own choices about codec support? Is it because you think that your own userbase is not principled enough for you that you force your choice on all of us?
(In reply to Lars Olafsen from comment #250)
> (In reply to Ralph Giles (:rillian) from comment #241)
> > My opinion is that we shouldn't land these patches at this time.
> > 
> > This is essentially for policy reasons:
> [..]
> > We have for some years been holding the line against for royalty-free codecs
> > in HTML against very strong commercial and market pressure, because the
> > patent terms available for otherwise popular formats like mp3 and mp4 are
> > incompatible with our principles of user freedom. 
> 
> Hasn't it occured to you that people would like to make their own choices
> about codec support? Is it because you think that your own userbase is not
> principled enough for you that you force your choice on all of us?

Mozilla's policy on codec support has nothing to do with what users choose; it affects what *sites* choose.  Currently, sites provide WebM video as well, because several browsers only support WebM.  If most or all browsers support H.264, sites will only provide H.264 video.
(In reply to Ralph Giles (:rillian) from comment #241)
> Running codecs which other browsers don't support doesn't help web authors.
> Our goal is not to inspire web pages 'Best viewed in Firefox'. Quite the
> opposite. So adding the wide range of formats gstreamer offers doesn't
> advance the web as a platform.

Why not? Nowadays the web is becoming a platform for new OSs. And although we can play unsupported web video in a media player (which is what I do with h264 videos from the web) on a traditional PC, on a web OS the browser itself becomes the multimedia player. I think supporting as many video formats as possible is a good idea in that case. It even helps increase the popularity of the new html5 video technology against proprietary flash. 
 
> We have for some years been holding the line for royalty-free codecs
> in HTML against very strong commercial and market pressure, because the
> patent terms available for otherwise popular formats like mp3 and mp4 are
> incompatible with our principles of user freedom. Requiring an alternative
> while flash adoption is falling is an important check against those
> commercial interests. Hooking into platform-level support for those codecs
> would greatly weaken that stance.

Why do you think adding gstreamer support will kill the free WebM codec? There are still a lot of people who cannot use non-free formats in gstreamer, so a web developer who wants a page playable in every browser will still have to provide WebM/Ogg media. The other side of the coin is that a huge number of people can use non-free media formats without any legal restrictions, and they will probably migrate to Chrome if you don't add the ability to play those formats.
The only practical disadvantage of landing this patch is the architecture. Maybe it should be modular: support for free formats in the browser itself, and something like a plug-in/extension for the others (the code supporting free formats could also be converted to a module, developed and supported by Mozilla). Such an API would be very useful: it would be possible to write a backend for native codec support on every OS and keep Firefox as a single project, with no branches like 'with gstreamer support', 'with Windows codecs support' and so on. Gstreamer support could serve as the first example of such a module. Is this possible?
(In reply to konstartyom from comment #252)
> Why do you think adding gstreamer support will kill free WebM codec? 

Because WebM sucks. It's far inferior to h.264.
This is really not the place for this discussion. Please take it elsewhere.
> (In reply to Ralph Giles (:rillian) from comment #241)

> As far as I understand, it was also raised in this thread that using
> gstreamer can have advantages in those areas as well (vaapi, vdpau,
> opengl?). I leave it up to you, whether improvements in these areas are to
> be achieved with or without gstreamer.

It's true that gstreamer could let us adopt technologies like that more quickly, since the maintenance burden would be shared. Especially if we were able to use Gstreamer on more than traditional GNU/Linux systems.

However, one of the reasons playback in the browser is slower than a dedicated player is that one can do more with the output, such as css transformations, html rendered on top, etc. We'd like to make that faster, and I expect that will happen over time, but right now it's slower because it's not just decoding straight to the application's compositor buffer.
Yes, Flash chose to make itself slow in that regard (see http://swfdec.freedesktop.org/wiki/FAQ). Supporting these relatively useless features came at a large performance cost. I was kind of hoping that HTML5 wouldn't make the same mistake :S
tl;dr: I'm attaching a new, simpler patch that adds gst decoding
support to firefox, based on the recent(ish) gecko
nsBuiltinDecoder* interfaces. Hopefully this version will be
easier to review, merge and maintain.

I'm a gst developer and I deal with embedded devices a lot. I
recently started working on this patch as I'm interested in
porting b2g to new devices. Also, more generally, I'd like to see
firefox do media playback as efficiently as possible on embedded
devices, making use of gst's hw support.

Two weeks ago, after the GStreamer SDK effort was announced
(http://is.gd/tteunv), I decided to pick up the existing firefox
gst patch.  After getting familiar with it and the gecko media
code, and after a half-failed rebase attempt (content/media/ has
changed significantly and the patch has bitrotten a bit), I
decided to write a new gst decoder based on the recent
nsBuiltinDecoder* interfaces in gecko.

My approach is deliberately simpler than the older patch. My goal
was to just get hw decoding support through gst, delegating the
rest of the work (i/o, synchronization, rendering, state
management etc) to the existing media code like the WebM and OGG
decoders do.

The code is self contained and 99% of it is in the
nsGStreamerReader class in
content/media/gstreamer/nsGStreamerReader.cpp. In terms of lines
of code and complexity, it's comparable to nsWebMReader and
nsOggReader. 

Internally it uses the following gst pipeline: appsrc !
decodebin2 name=d d. ! ffmpegcolorspace !
video/x-raw-yuv,format=I420 ! appsink d. ! audioconvert !
audioresample ! appsink.

In English: it feeds data from gecko to gstreamer using appsrc,
decodes using the decodebin2 element and feeds a/v back to
firefox using appsink elements.
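
For reference, the same pipeline can be expressed as a single
gst_parse_launch() call. This is only a sketch of the idea; the actual
patch creates the appsrc/appsink elements by hand so it can install its
callbacks on them, and the element names below are made up:

  GError* error = NULL;
  GstElement* pipeline = gst_parse_launch(
      "appsrc name=source ! decodebin2 name=d "
      "d. ! ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)I420 ! "
      "appsink name=videosink "
      "d. ! audioconvert ! audioresample ! appsink name=audiosink",
      &error);
  if (!pipeline) {
    /* inspect error->message, then g_error_free(error) and bail out */
  }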

The patch adds an --enable-gstreamer opt-in switch to configure.
When enabled, it currently takes over WebM and OGG decoding and
adds H264 support (see my closing note before starting flames on
this, please). Eventually I think I'd like to make it
take over WebM and OGG decoding depending on runtime prefs.

I tested the patch on OSX. I've got accelerated h264 decoding
(supported by the gst applemedia plugin) working but it requires
a colorspace conversion currently, which I plan to fix soonish in
the plugin. After that, my goal is to make firefox mobile do hw
accelerated decoding on android using the gst-omx plugin. 

There are a few known bugs at this point, the major one being
that I need to implement efficient buffer allocation. Also the
the bytes-read, frames-parsed/decoded stats are not implemented
yet. Being my first firefox patch, there are probably also plenty
of stylistic fixes needed. Anyway, overall I think it's good for
a first review.

Please note: I don't want to get into the argument of whether
firefox should support h264 decoding or not. The patch currently
adds h264 decoding only so that I could test and demo accelerated
decoding on OSX. I think decoding using gst has merits regardless of
whether h264 ends up being supported or not. I'm more than
willing to remove h264 support from the patch if that makes it
easier to get it merged.
Comment on attachment 605705 [details] [diff] [review]
nsBuiltinDecoder* based implementation

Alessandro, did you want someone to review this patch? If so, you'll need to request review from a module owner or peer on the patch's details page.
Attachment #605705 - Flags: review?
Attachment #605705 - Flags: review? → review?(cpearce)
Comment on attachment 605705 [details] [diff] [review]
nsBuiltinDecoder* based implementation

>+  "video/3gpp"

Does IE9 support this MIME type? If not, we probably shouldn't, either, to avoid format proliferation.

(Thank you for the patch. Very topical for the discussion in dev-platform.)
Comment on attachment 605705 [details] [diff] [review]
nsBuiltinDecoder* based implementation

Passing review to Chris Double; he's much more knowledgeable about GStreamer than I am.
Attachment #605705 - Flags: review?(cpearce) → review?(chris.double)
Comment on attachment 605705 [details] [diff] [review]
nsBuiltinDecoder* based implementation

Review of attachment 605705 [details] [diff] [review]:
-----------------------------------------------------------------

To build on linux the patch needs some gstreamer header files added to config/system-headers:

diff --git a/config/system-headers b/config/system-headers
index c0f0221..1486194 100644
--- a/config/system-headers
+++ b/config/system-headers
@@ -1055,3 +1055,7 @@ ogg/os_types.h
 nestegg/nestegg.h
 cubeb/cubeb.h
 #endif
+gst/gst.h
+gst/app/gstappsink.h
+gst/app/gstappsrc.h
+gst/video/video.h

The patch looks great. I've made some comments. The 'r-' is mostly minor stuff, and because you've got more to do to complete it. I'm definitely happy with the direction and approach of the patch though. You'll want to add some .mp4 specific files in content/media/tests btw, and make sure tests pass. This might be challenging if gstreamer is handling the webm/ogg/wave, so I'm wondering if it's better that these are handled by the internal code even if gstreamer is enabled.

::: content/media/gstreamer/Makefile.in
@@ +31,5 @@
> +# and other provisions required by the GPL or the LGPL. If you do not delete
> +# the provisions above, a recipient may use your version of this file under
> +# the terms of any one of the MPL, the GPL or the LGPL.
> +#
> +# ***** END LICENSE BLOCK *****

MPL boilerplate for new files should use the new MPL 2 text. This should be done for all new files that you've added. Don't change the boilerplate of existing files you're just modifying. http://www.mozilla.org/MPL/headers/

::: content/media/gstreamer/nsGStreamerReader.cpp
@@ +116,5 @@
> +}
> +
> +nsresult nsGStreamerReader::Init(nsBuiltinDecoderReader* aCloneDonor)
> +{
> +  gst_init_check(0, 0, NULL);

Check return value and maybe disable gstreamer support if it fails.
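
Something along these lines would do (just a sketch):

  GError* error = NULL;
  if (!gst_init_check(0, 0, &error)) {
    LOG(PR_LOG_ERROR, ("gst_init_check() failed: %s", error->message));
    g_error_free(error);
    return NS_ERROR_FAILURE;
  }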

@@ +124,5 @@
> +        "stream-type", GST_APP_STREAM_TYPE_SEEKABLE,
> +        "max-bytes", 51200,
> +        NULL));
> +  gst_app_src_set_callbacks(mSource, &mSrcCallbacks, (gpointer) this, NULL);
> +  mDecodebin = gst_element_factory_make("decodebin2", NULL);

Is it needed to handle the failure case of gst_element_factory_make (and similar functions) when they return NULL, or does GStreamer gracefully handle the later use of NULL elements?

@@ +224,5 @@
> +      GST_FORMAT_TIME, timestamp);
> +  timestamp = GST_TIME_AS_USECONDS(timestamp);
> +  PRInt64 duration;
> +  if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer)))
> +    duration = GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));

If this fails, duration will be uninitialized?

@@ +226,5 @@
> +  PRInt64 duration;
> +  if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer)))
> +    duration = GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));
> +  PRInt32 frames = (GST_BUFFER_SIZE(buffer) / 4) / mInfo.mAudioChannels;
> +  PRInt64 offset = 0;

Might need to check for mInfo.mAudioChannels being zero here, or when it is initially set in the preroll callback later.

@@ +231,5 @@
> +  
> +  LOG(PR_LOG_ERROR, ("decoded audio buffer %"GST_TIME_FORMAT,
> +        GST_TIME_ARGS(timestamp * GST_USECOND)));
> +
> +  AudioDataValue *data = new AudioDataValue[frames * mInfo.mAudioChannels];

Check against the possibility of overflow. In particular some gcc versions have a bug whereby operator new doesn't check for overflow if the allocation count times the size of the object overflows. We use PR_STATIC_ASSERT for this type of thing. See content/media/wave/nsWaveReader.cpp for example which has:

+  PR_STATIC_ASSERT(PRUint64(BLOCK_SIZE) < UINT_MAX / sizeof(AudioDataValue) / MAX_CHANNELS);
+  const size_t bufferSize = static_cast<size_t>(frames * mChannels);
+  nsAutoArrayPtr<AudioDataValue> sampleBuffer(new AudioDataValue[bufferSize]);

Also use nsAutoArrayPtr here (as the wave example does).

@@ +238,5 @@
> +      frames, data, mInfo.mAudioChannels);
> +
> +  mAudioQueue.Push(audio);
> +
> +  gst_buffer_unref(buffer);

Is it worth creating some safe ref/unref class that unrefs in the destructor for Gst objects?

@@ +244,5 @@
> +  return true;
> +}
> +
> +bool nsGStreamerReader::DecodeVideoFrame(bool &aKeyFrameSkip,
> +                                      PRInt64 aTimeThreshold)

Align PRInt64 with bool in the line above. Ditto with any similar alignment in function arguments later in the file.

@@ +277,5 @@
> +    timestamp = gst_segment_to_stream_time(&mVideoSegment,
> +        GST_FORMAT_TIME, timestamp);
> +    timestamp = nextTimestamp = GST_TIME_AS_USECONDS(timestamp);
> +    if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer))) {
> +        nextTimestamp += GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));

Is there danger of overflow here? Should it be checked?

@@ +324,5 @@
> +                                   nextTimestamp,
> +                                   b,
> +                                   isKeyframe,
> +                                   -1,
> +                                   mPicture);

Align with 'mInfo' in line 320.

@@ +587,5 @@
> +  GstCaps *caps = gst_pad_get_negotiated_caps(sinkpad);
> +  GstStructure *s = gst_caps_get_structure(caps, 0);
> +  gst_structure_get_int(s, "rate", (gint *) &mInfo.mAudioRate);
> +  gst_structure_get_int(s, "channels", (gint *) &mInfo.mAudioChannels);
> +  mInfo.mHasAudio = true;

Might need to sanity check these values to ensure they're within a valid range (to prevent overflows, etc. later). Ditto with the video parameters later.

::: content/media/gstreamer/nsGStreamerReader.h
@@ +73,5 @@
> +  }
> +
> +private:
> +  static void NewDecodedPadCb(GstElement *aDecodebin, GstPad *aPad,
> +    gboolean last, gpointer aUserData);

s/last/aLast

@@ +103,5 @@
> +  void NewAudioBuffer();
> +
> +  static void EosCb(GstAppSink *aSink, gpointer aUserData);
> +  void Eos(GstAppSink *aSink);
> +

Can you add comments to the functions above with a brief explanation of when/why they're called.

@@ +121,5 @@
> +  GstAppSinkCallbacks mSinkCallbacks;
> +  mozilla::ReentrantMonitor mGstThreadsMonitor;
> +  GstSegment mVideoSegment;
> +  GstSegment mAudioSegment;
> +  bool mReachedEos;

Some comments for what these are used for, threading requirements (if any), etc if possible.

::: js/src/ctypes/libffi/Makefile.am
@@ +82,4 @@
>  
>  MAKEOVERRIDES=
>  
> +ACLOCAL_AMFLAGS= -I m4

Was this included in the patch by mistake?
Attachment #605705 - Flags: review?(chris.double) → review-
I withdraw my rejection in comment 255, and agree with the plan to review and land Alessandro Decina's patch.
Thanks for the review. I'm attaching a new patch that addresses most of the issues you pointed out (and more, see inline comments below). I also started pushing my git repository to https://github.com/alessandrod/mozilla-central which hopefully helps to understand what's changed. 


(In reply to Chris Double (:doublec) from comment #261)
> Comment on attachment 605705 [details] [diff] [review]
> nsBuiltinDecoder* based implementation
> 
> Review of attachment 605705 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> To build on linux the patch needs some gstreamer header files added to
> config/system-headers:
> 
> diff --git a/config/system-headers b/config/system-headers
> index c0f0221..1486194 100644
> --- a/config/system-headers
> +++ b/config/system-headers
> @@ -1055,3 +1055,7 @@ ogg/os_types.h
>  nestegg/nestegg.h
>  cubeb/cubeb.h
>  #endif
> +gst/gst.h
> +gst/app/gstappsink.h
> +gst/app/gstappsrc.h
> +gst/video/video.h

Done


> The patch looks great. I've made some comments. The 'r-' is mostly minor
> stuff, and because you've got more to do to complete it. I'm definitely
> happy with the direction and approach of the patch though. You'll want to
> add some .mp4 specific files in content/media/tests btw, and make sure tests
> pass.

I started running the mochi tests in content/media/ and fixed a good number of issues already. Not all the tests pass, but most of those that don't pass fail because gstreamer is reporting slightly different durations (the diff is in the order of milliseconds) from what the tests expect.

I'm going to add h264 clips to the test suite now and make sure that all the tests pass.


> This might be challenging if gstreamer is handling the webm/ogg/wave,
> so I'm wondering if it's better than these are handled by the internal code
> even if gstreamer is enabled.

Eventually yeah, it might make sense to enable gst only for h264. I left it enabled for everything for now though so that I can find and fix more issues running all the existing tests. 


> ::: content/media/gstreamer/Makefile.in
> @@ +31,5 @@
> > +# and other provisions required by the GPL or the LGPL. If you do not delete
> > +# the provisions above, a recipient may use your version of this file under
> > +# the terms of any one of the MPL, the GPL or the LGPL.
> > +#
> > +# ***** END LICENSE BLOCK *****
> 
> MPL boilerplate for new files should use the new MPL 2 text. This should be
> done for all new files that you've added. Don't change the boilerplate of
> existing files you're just modifying. http://www.mozilla.org/MPL/headers/

Done


> ::: content/media/gstreamer/nsGStreamerReader.cpp
> @@ +116,5 @@
> > +}
> > +
> > +nsresult nsGStreamerReader::Init(nsBuiltinDecoderReader* aCloneDonor)
> > +{
> > +  gst_init_check(0, 0, NULL);
> 
> Check return value and maybe disable gstreamer support if it fails.

Done
 
> @@ +124,5 @@
> > +        "stream-type", GST_APP_STREAM_TYPE_SEEKABLE,
> > +        "max-bytes", 51200,
> > +        NULL));
> > +  gst_app_src_set_callbacks(mSource, &mSrcCallbacks, (gpointer) this, NULL);
> > +  mDecodebin = gst_element_factory_make("decodebin2", NULL);
> 
> Is it needed to handle the failure case of gst_element_factory_make (and
> similar functions) when they return NULL, or does GStreamer gracefully
> handle the later use of NULL elements?

Needs to be handled. I switched to using playbin2 instead of decodebin2 and added error handling for gst_element_factory_make.


> @@ +224,5 @@
> > +      GST_FORMAT_TIME, timestamp);
> > +  timestamp = GST_TIME_AS_USECONDS(timestamp);
> > +  PRInt64 duration;
> > +  if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer)))
> > +    duration = GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));
> 
> If this fails, duration will be uninitialized?

Yes, fixed.


> @@ +226,5 @@
> > +  PRInt64 duration;
> > +  if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer)))
> > +    duration = GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));
> > +  PRInt32 frames = (GST_BUFFER_SIZE(buffer) / 4) / mInfo.mAudioChannels;
> > +  PRInt64 offset = 0;
> 
> Might need to check for mInfo.mAudioChannels being zero here, or when it is
> initially set in the preroll callback later.

The channels={1,2} caps restriction on the audio sink takes care of this. So
either audio is not present, or channels is set to a meaningful value.
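
For reference, the restriction amounts to setting caps roughly like this on the audio appsink when it's created (a sketch; mAudioAppSink is just a placeholder name for the sink member):

  // Only accept mono or stereo audio, so by the time a buffer prerolls
  // mInfo.mAudioChannels can only ever be 1 or 2.
  GstCaps* caps = gst_caps_new_simple("audio/x-raw-float",
                                      "channels", GST_TYPE_INT_RANGE, 1, 2,
                                      NULL);
  gst_app_sink_set_caps(GST_APP_SINK(mAudioAppSink), caps);
  gst_caps_unref(caps);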


> @@ +231,5 @@
> > +  
> > +  LOG(PR_LOG_ERROR, ("decoded audio buffer %"GST_TIME_FORMAT,
> > +        GST_TIME_ARGS(timestamp * GST_USECOND)));
> > +
> > +  AudioDataValue *data = new AudioDataValue[frames * mInfo.mAudioChannels];
> 
> Check against the possiblity of overflow. In particular some gcc versions
> have a bug whereby operator new doesn't check for overflow if the allocation
> count times the size of the object overflows. We use PR_STATIC_ASSERT for
> this type of thing. See content/media/wave/nsWaveReader.cpp for example
> which has:
> 
> +  PR_STATIC_ASSERT(PRUint64(BLOCK_SIZE) < UINT_MAX / sizeof(AudioDataValue)
> / MAX_CHANNELS);
> +  const size_t bufferSize = static_cast<size_t>(frames * mChannels);
> +  nsAutoArrayPtr<AudioDataValue> sampleBuffer(new
> AudioDataValue[bufferSize]);

I can't use PR_STATIC_ASSERT as the size of the buffer is not fixed. I can avoid the multiplication though and did so.
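
Roughly, the idea is to size the allocation from the buffer size directly, so no multiplication is involved (a sketch; the exact code in the new patch may differ):

  // frames * channels is by construction <= the number of samples in the
  // buffer, so allocating from the buffer size cannot overflow.
  PRInt32 frames = (GST_BUFFER_SIZE(buffer) / sizeof(AudioDataValue)) /
                   mInfo.mAudioChannels;
  nsAutoArrayPtr<AudioDataValue> data(
      new AudioDataValue[GST_BUFFER_SIZE(buffer) / sizeof(AudioDataValue)]);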

 
> Also use nsAutoArrayPtr here (as the wave example does).

Done

 
> @@ +238,5 @@
> > +      frames, data, mInfo.mAudioChannels);
> > +
> > +  mAudioQueue.Push(audio);
> > +
> > +  gst_buffer_unref(buffer);
> 
> Is it worth creating some safe ref/unref class that unrefs in the destructor
> for Gst objects?

I don't think so. The code that does gst refcounting is self-contained and it's done in just a handful of places. I don't have a strong opinion/preference on this though.


 
> @@ +244,5 @@
> > +  return true;
> > +}
> > +
> > +bool nsGStreamerReader::DecodeVideoFrame(bool &aKeyFrameSkip,
> > +                                      PRInt64 aTimeThreshold)
> 
> Align PRInt64 with bool in the line above. Ditto with any similar alignment
> in function arguments later in the file.

Done


> 
> @@ +277,5 @@
> > +    timestamp = gst_segment_to_stream_time(&mVideoSegment,
> > +        GST_FORMAT_TIME, timestamp);
> > +    timestamp = nextTimestamp = GST_TIME_AS_USECONDS(timestamp);
> > +    if (GST_CLOCK_TIME_IS_VALID(GST_BUFFER_DURATION(buffer))) {
> > +        nextTimestamp += GST_TIME_AS_USECONDS(GST_BUFFER_DURATION(buffer));
> 
> Is there danger of overflow here? Should it be checked?

Here timestamp is something that goes (roughly) from zero for the first frame, to the duration of the stream for the last, so in normal circumstances this won't overflow.

It can overflow I guess if GST_BUFFER_DURATION is specifically forged. I'm going to see if I can create a test about this.


> @@ +324,5 @@
> > +                                   nextTimestamp,
> > +                                   b,
> > +                                   isKeyframe,
> > +                                   -1,
> > +                                   mPicture);
> 
> Align with 'mInfo' in line 320.

Done


> 
> @@ +587,5 @@
> > +  GstCaps *caps = gst_pad_get_negotiated_caps(sinkpad);
> > +  GstStructure *s = gst_caps_get_structure(caps, 0);
> > +  gst_structure_get_int(s, "rate", (gint *) &mInfo.mAudioRate);
> > +  gst_structure_get_int(s, "channels", (gint *) &mInfo.mAudioChannels);
> > +  mInfo.mHasAudio = true;
> 
> Might need to sanity check these values to ensure they're within a valid
> range (to prevent overflow's, etc later). Ditto with the video parameters
> later.

Gst already does this, since we have caps restrictions on the sinks. I added some NS_ASSERTIONs in addition.


> > +  static void EosCb(GstAppSink *aSink, gpointer aUserData);
> > +  void Eos(GstAppSink *aSink);
> > +
> 
> Can you add comments to the functions above with a brief explanation of
> when/why they're called.
> 
> @@ +121,5 @@
> > +  GstAppSinkCallbacks mSinkCallbacks;
> > +  mozilla::ReentrantMonitor mGstThreadsMonitor;
> > +  GstSegment mVideoSegment;
> > +  GstSegment mAudioSegment;
> > +  bool mReachedEos;
> 
> Some comments for what these are used for, threading requirements (if any),
> etc if possible.

Done. If needed, I guess I can add more general comments about how threading synchronization works between gst threads and the decoder state machine threads.


> ::: js/src/ctypes/libffi/Makefile.am
> @@ +82,4 @@
> >  
> >  MAKEOVERRIDES=
> >  
> > +ACLOCAL_AMFLAGS= -I m4
> 
> Was this included in the patch by mistake?

Oops, yes.
Attachment #611716 - Flags: review?(chris.double)
(In reply to Alessandro Decina from comment #263)
> I started running the mochi tests in content/media/ and fixed a good number
> of issues already. Not all the tests pass, but most of those that don't pass
> fail because gstreamer is reporting slightly different durations (the diff
> is in the order of milliseconds) from what the tests expect.

That could be bug 646331.
(In reply to Matthew Gregan [:kinetik] from comment #265)
> (In reply to Alessandro Decina from comment #263)
> > I started running the mochi tests in content/media/ and fixed a good number
> > of issues already. Not all the tests pass, but most of those that don't pass
> > fail because gstreamer is reporting slightly different durations (the diff
> > is in the order of milliseconds) from what the tests expect.
> 
> That could be bug 646331.

You mean when GStreamer is playing the Ogg/WebM files rather than Firefox's built-in decoders? If so, GStreamer must be returning durations different from the built-in decoders, otherwise the built-in decoders would be failing the tests too. If we set the durations in manifest.js to the durations the built-in decoders find, do the tests still fail?
Attachment #611716 - Attachment is patch: true
(In reply to Alessandro Decina from comment #264)
> Created attachment 611716 [details] [diff] [review]
> New patch addressing 1st review comments + misc fixes. See comment #263.

I get an error when trying to play a video/mp4 with this patch applied:

GLib-GObject-WARNING **: /build/buildd/glib2.0-2.26.1/gobject/gsignal.c:2275: signal `source-setup' is invalid for instance `0x7fa4debfc000

The previous patch worked fine on this video. Is this something to do with the change to playbin?
source-setup was added to playbin2 in 0.10.33. The gst check in configure.in should check for gstreamer core and gst-plugins-base >= 0.10.33. I'll fix this in the next patch.
Comment on attachment 611716 [details] [diff] [review]
New patch addressing 1st review comments + misc fixes. See comment #263.

Review of attachment 611716 [details] [diff] [review]:
-----------------------------------------------------------------

Looks good, just minor comments. r+ with those, and the configure change you mentioned to address the gstreamer version. I don't think it's required that all mochitests pass to land your patch since it's disabled by default - it isn't built without --enable-gstreamer. Followup bugs can be raised to address the failing tests and additional functionality. Additional tests would be good for H.264 encoded files and anything else specific to the backend that you think would be useful.

Have you tried the reftests that compare the result of the decoded video frames with a reference image?

::: content/media/gstreamer/nsGStreamerReader.cpp
@@ +185,2 @@
>  
> +  /* We do 3 attemtps here: decoding audio and video, decoding video only,

Minor: Spelling of 'attempts'.

@@ +322,5 @@
> +  mDecoder->NotifyBytesConsumed(mByteOffset - mLastReportedByteOffset);
> +  mLastReportedByteOffset = mByteOffset;
> +}
> +
> +bool nsGStreamerReader::WaitForDecodedData(int *counter)

Minor: Change 'counter' to 'aCounter'.

@@ +613,5 @@
>  
>  void nsGStreamerReader::NeedData(GstAppSrc *aSrc, guint aLength)
>  {
> +  if (aLength == -1)
> +    aLength = 50 * 1024;

Can you make this magic number (50*1024) a define/constant somewhere.
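
Something like this would do; the name is only a suggestion:

  // Fallback chunk size used when appsrc doesn't say how much data it wants.
  static const guint kDefaultSourceReadSize = 50 * 1024;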
Attachment #611716 - Flags: review?(chris.double) → review+
AFAIU, this code is yet to be merged, right?
If so, has anyone checked this code for the changes required by the gstreamer 1.0 candidates?
Just so that, if gstreamer upstream releases before you do, it could be used with either 0.10 or 1.0, depending on build options.
Can you update your github repo with the new patch?
New patch checking for gst >=0.10.33 and dealing with the
comments in your last review. The patch is not incremental but
contains all the changes so far, which should ease merging.

I pushed everything to my github repo so you can look at the individual
commits here https://github.com/alessandrod/mozilla-central/commits/bugs/422540

I also ran the reftests as you suggested. These are the results for webm:

REFTEST INFO | Result summary:
REFTEST INFO | Successful: 26 (26 pass, 0 load only)
REFTEST INFO | Unexpected: 2 (2 unexpected fail, 0 unexpected pass, 0 unexpected asserts, 0 unexpected fixed asserts, 0 failed load, 0 exception)
REFTEST INFO | Known problems: 5 (0 known fail, 0 known asserts, 5 random, 0 skipped, 0 slow)


These are the results for ogg:

REFTEST INFO | Result summary:
REFTEST INFO | Successful: 27 (27 pass, 0 load only)
REFTEST INFO | Unexpected: 0 (0 unexpected fail, 0 unexpected pass, 0 unexpected asserts, 0 unexpected fixed asserts, 0 failed load, 0 exception)
REFTEST INFO | Known problems: 5 (0 known fail, 0 known asserts, 5 random, 0 skipped, 0 slow)
REFTEST INFO | Total canvas count = 4

Now I'm going to check what those two webm failures are about and
I'm going to add H264-specific mochitests.

I agree that things are good enough to be merged now. I flagged the
patch for checkin, do I need to do anything else?
Attachment #605705 - Attachment is obsolete: true
Attachment #611716 - Attachment is obsolete: true
Attachment #615622 - Flags: checkin?(chris.double)
Great, waiting for someone with commit access to merge then \o/
Comment on attachment 615622 [details] [diff] [review]
GStreamer backend for audio/video decoding (including all the post review fixes)

Chris Double is on vacation. Flagging to land in his stead.
Attachment #615622 - Flags: checkin?(chris.double) → checkin+
(In reply to Michael Monreal [:monreal] from comment #275)
> As GSteamer 1.0 is just around the corner...
> 
> http://gstreamer.freedesktop.org/wiki/ZeroPointEleven
> http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/random/porting-to-
> 0.11.txt

Hehe, yeah that's on the todo list already :)
Attachment #430270 - Attachment is obsolete: true
Attachment #555879 - Attachment is obsolete: true
Attachment #615622 - Flags: approval-mozilla-central?
Comment on attachment 615622 [details] [diff] [review]
GStreamer backend for audio/video decoding (including all the post review fixes)

Ralph, just use the checkin-needed flag. The checkin? flag is mainly used when landing on multiple branches or landing only a subset of patches. Also, obsoleting old patches is appreciated. Finally, with mozilla-central/inbound currently in lockdown mode, this will need approval before landing.
Attachment #615622 - Flags: checkin+
Ok, thanks. I just wanted to be clear which patch we intended to land.
Try run for 1e0e98d406ac is complete.
Detailed breakdown of the results available here:
    https://tbpl.mozilla.org/?tree=Try&rev=1e0e98d406ac
Results (out of 230 total builds):
    success: 196
    warnings: 34
Builds (or logs if builds failed) available at:
http://ftp.mozilla.org/pub/mozilla.org/firefox/try-builds/rgiles@mozilla.com-1e0e98d406ac
This is NPOTB (not part of the default build) and should land.

Try push at https://tbpl.mozilla.org/?tree=Try&rev=1e0e98d406ac is clean; it shows two Mochitest oranges, both of which are known, and a bunch of restarts after infrastructure or timeout issues.
Comment on attachment 615622 [details] [diff] [review]
GStreamer backend for audio/video decoding (including all the post review fixes)

[Triage Comment]
We've now given NPOTB patches blanket approval - a=npotb. https://wiki.mozilla.org/Tree_Rules#mozilla-central_.28Nightly_channel.29
Attachment #615622 - Flags: approval-mozilla-central? → approval-mozilla-central+
https://hg.mozilla.org/integration/mozilla-inbound/rev/c04a467c48ac
Assignee: kishore.arepalli → alessandro.d
Flags: in-testsuite-
Keywords: checkin-needed
Target Milestone: --- → mozilla14
https://hg.mozilla.org/mozilla-central/rev/c04a467c48ac
Status: ASSIGNED → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Keywords field has "mobile" but platform is "all."  So, is this still fennec only or should this work with Firefox, including on Windows?
Should work with Firefox on any platform, including Windows.
(In reply to IU from comment #285)
> Keywords field has "mobile" but platform is "all."  So, is this still fennec
> only or should this work with Firefox, including on Windows?

See https://bugzilla.mozilla.org/describekeywords.cgi
The "mobile" keyword means "important to mobile developmental work", not mobile-only.
Blocks: 747257
Thank you for your excellent patch. Do you plan to add support for standalone mp3 audio (in the <audio> tag)?
Nice to see this issue got resolved. Thanks to everyone who worked on this.

Does Firefox 12 already have GStreamer support? Or will GStreamer support come in a later Firefox version, like Firefox 13 or 14?

Thanks.
The patch landed on the Firefox 14 branch. (See the 'Target Milestone' field at the top of the  bug). Firefox 14 will become the stable release in July.

Note however that while GStreamer support is in the codebase, official builds do not have it turned on. At least, not yet.
(In reply to Ralph Giles (:rillian) from comment #290)
> The patch landed on the Firefox 14 branch. (See the 'Target Milestone' field
> at the top of the  bug). Firefox 14 will become the stable release in July.
> 
> Note however that which GStreamer support is in the codebase, Official
> builds do not have it turned on. At least, not yet.

Thanks a lot.
Could someone, please, compile a build for Windows with this special flag needed to include GStreamer support?
(In reply to Ralph Giles (:rillian) from comment #290)
> Note however that which GStreamer support is in the codebase, Official
> builds do not have it turned on. At least, not yet.

Is there a bug for turning it on? If not, should we file one? What are the current plans for this feature?

(In reply to Sean Newman from comment #292)
> Could someone, please, compile a build for Windows with this special flag
> needed to include GStreamer support?

Seconded.
(In reply to Sid from comment #293)

> Is there a bug for turning it on? If not, should we file one? What are the
> current plans for this feature?

If you think it should be on by default you should certainly file a bug so it can be discussed. As far as I know current plans aren't settled; the decision to change policy on the official builds depends on the codec issues, and we don't have consensus or working code there yet.

> (In reply to Sean Newman from comment #292)
> > Could someone, please, compile a build for Windows with this special flag
> > needed to include GStreamer support?

Note that for this to be useful, we'd have to build and include a copy of Gstreamer as well. It's generally available on Linux systems, but not Windows, Mac or Android. That's probably a separate bug as well: getting gstreamer building on non-linux archs either as part of our monolithic build, or by bundling a standalone  gstreamer + plugins build, such as the sdk from entropywave, or the one fluendo announced a few months ago.
(In reply to Ralph Giles (:rillian) from comment #294)
> As far as I know current plans aren't settled; the
> decision to change policy on the official builds depends on the codec
> issues, and we don't have consensus or working code there yet.

Yeah, I forgot about codec problems. Let's wait then. Anyway, if those issues are solved, it would be better to try and implement a native DirectShow or Media Foundation backend for Windows (bug 435339).

> It's generally available on Linux systems, but not Windows, Mac or Android.

There is a Windows port of GStreamer available on http://ossbuild.googlecode.com (although it seems abandoned at v0.10.7). Will it work with GStreamer-enabled Nightly build?
(In reply to Sid from comment #295)
> (In reply to Ralph Giles (:rillian) from comment #294)
> > As far as I know current plans aren't settled; the
> > decision to change policy on the official builds depends on the codec
> > issues, and we don't have consensus or working code there yet.
> 
> Yeah, I forgot about codec problems. Let's wait then. Anyway, if those
> issues are solved, it would be better to try and implement a native
> DirectShow or Media Foundation backend for Windows (bug 435339).

Why do you think that would be better?
 

> > It's generally available on Linux systems, but not Windows, Mac or Android.
> 
> There is a Windows port of GStreamer available on
> http://ossbuild.googlecode.com (although it seems abandoned at v0.10.7).
> Will it work with GStreamer-enabled Nightly build?

GStreamer has pretty good Windows support and it even supports some of the native codecs. Building it and shipping it on Windows is not as easy as it could be though. To fix that, as Ralph mentioned, there's an ongoing effort to produce an easy to use (and ship and build) GStreamer SDK for linux, mac and windows, see http://is.gd/tteunv. 

I'm waiting for the SDK to be ready (gonna be ready Any Day Now ;)), then my plan is to integrate it in the build and start a discussion on how and where to enable the gstreamer backend by default.
(In reply to Alessandro Decina from comment #296)

> I'm waiting for the SDK to be ready (gonna be ready Any Day Now ;)), then my
> plan is to integrate it in the build and start a discussion on how and where
> to enable the gstreamer backend by default.

Great! Can you open a bug for that? The discussion thread here is long enough already. :)
No longer blocks: 747257
Blocks: 747257
(In reply to Alessandro Decina from comment #296)

> I'm waiting for the SDK to be ready (gonna be ready Any Day Now ;)), then my
> plan is to integrate it in the build and start a discussion on how and where
> to enable the gstreamer backend by default.

Just in case, SDK is out.
http://gstreamer.com/
(In reply to Sid from comment #298)
> Just in case, SDK is out.
> http://gstreamer.com/

In the Windows installation instructions they recommend removing the client app's dependency on MSVC2010's runtime DLL and using the "'basic' C runtime which comes in every Windows system since Windows XP, and is named MSVCRT.DLL".

See:

"Removing the dependency with the Visual Studio runtime"
http://docs.gstreamer.com/display/GstSDK/Installing+on+Windows

We apparently ship with the MSVC 2010 RT, and switching runtimes makes me nervous... Were we to ship this on Windows, we may be best off building the SDK ourselves against MSVCRT2010 or getting Collabora to build one against MSVCRT2010.
There are several reasons why we decided to build the SDK against the system CRT msvcrt.dll, but the most important one is that if you decide to link against any of the VS ones you will be forced to distribute broken software.

According to the MS EULA you can't distribute this "system library" (e.g. msvcr100.dll) yourself, so your software depends on a third-party installer (the Microsoft Visual C++ 20XX Redistributable Package). The GPL also explicitly forbids the redistribution of System Libraries (http://www.gnu.org/licenses/gpl-faq.en.html#WindowsRuntimeAndGPL).

How is this handled in Firefox?

There is the option of rebuilding the SDK linking against msvcr100.dll. On our side this would require providing a gcc spec that links against moldnames100 and msvcr100, rebuilding gcc so that libgcc_s_sjlj-1.dll and libstdc++6.dll are linked against the new CRT, and using that gcc spec in the toolchain.
We ship the CRT files alongside our app. I don't think your reading of the EULA is correct. Those files are explicitly listed as "redistributable", and historically you have been allowed to ship the DLL files with your application.

The GPL wrinkle is tricky; that sounds like a big PITA. In any event, we ship Firefox under the MPL, so it's not an issue for us.
(In reply to Ted Mielczarek [:ted] from comment #301)
> We ship the CRT files alongside our app. I don't think your reading of the
> EULA is correct. Those files are explicitly listed as "redistributable", and
> historically you have been allowed to ship the DLL files with your
> application.

Apparently I was hit by this bug when I last read the EULA: http://archive.msdn.microsoft.com/KB956414. But indeed it's very clear here that you are allowed to redistribute them even without the installer (http://msdn.microsoft.com/en-us/library/ms235299.aspx)
Can we introduce GStreamer into the Mozilla source tree?

Because on Windows, GStreamer needs rebuilding using the user's compiler.
Blocks: 776838
(In reply to xunxun from comment #303)

> Can we introduce GStreamer to Mozilla source tree?

If you think that would be useful please open a separate bug for discussion. Especially if you can provide patches.

A simpler approach in the near term might be to add support for linking against one of the pre-built SDKs, e.g. from entropywave or fluendo. IIRC fluendo's had a problem with mismatched runtimes on windows, though.

> 
> Because on Windows, Gstreamer need rebuilding using the user's compiler.
Are there plans to enable this in Linux builds?  Should that be a separate Mozilla bug?
There are no plans to enable this in official builds. If you think there should be, please open a new bug for that, and mark it dependent on this one.
(In reply to Ralph Giles (:rillian) from comment #304)
> (In reply to xunxun from comment #303)
> 
> > Can we introduce GStreamer to Mozilla source tree?
> 
> If you think that would be useful please open a separate bug for discussion.
> Especially if you can provide patches.
> 
> A simpler approach in the near term might be to add support for linking
> against one of the pre-built SDKs, e.g. from entropywave or fluendo. IIRC
> fluendo's had a problem with mismatched runtimes on windows, though.
> 
> > 
> > Because on Windows, Gstreamer need rebuilding using the user's compiler.

I found that Opera had a GStreamer project for Windows. Clone: http://sourcecode.opera.com/gstreamer/
Depends on: 817015
Depends on: 894372
Blocks: 894372
No longer depends on: 894372