Closed Bug 422540 Opened 17 years ago Closed 13 years ago

GStreamer backend for HTML5 video element

Categories: (Core :: Audio/Video, enhancement, P1)
Status: RESOLVED FIXED in mozilla14
Tracking: fennec 1.0-
People: (Reporter: cajbir, Assigned: alessandro.d)
(Keywords: mobile)

Attachments (6 files, 57 obsolete files):
- 14.83 KB, text/plain
- 83.21 KB, text/plain
- 112.26 KB, text/plain
- 11.11 KB, text/plain
- 126.11 KB, image/png
- 48.41 KB, patch (akeybl: approval-mozilla-central+)
Bug 382267 is being restructured to not contain implementations of decoders for video codecs and to allow different backend decoder implementations. Implement a backend that uses GStreamer to decode audio/video, allowing support of all the video formats that the user has GStreamer plugins for.
Reporter
Comment 1•17 years ago

You'll need to apply the patch from bug 382267 first; this applies on top of that. Once applied, <video> will work with any video format that GStreamer has plugins for. Tested with the standard Ogg streamer plugin and the Fluendo H.264 plugin. Seeking works via changing the currentTime property of the video object, but there are no standard controls or progress events for it yet.
Reporter
Comment 2•17 years ago

Updated to work with the latest bug 382267 patch.
Attachment #313256 -
Attachment is obsolete: true
Comment 3•17 years ago

Marking wanted1.9.1? to get this into the triage queue. If this needs to be blocking1.9.1?, please mark it so.
Flags: wanted1.9.1?
Priority: -- → P1
Flags: wanted1.9.1? → wanted1.9.1+
Reporter
Comment 4•16 years ago

Updated to work with changes in bug 382267.
Attachment #322882 -
Attachment is obsolete: true
Comment 5•16 years ago

Hi Double, this is really interesting work! I applied this version 3 patch on Windows, built it in Cygwin, then installed the corresponding GStreamer components, and it works! Thanks for your hard work! However, there is a big performance problem on the Win32 platform: I found that the CPU load is very heavy even when I play a very low-density video, and I have no idea about performance on Linux. I read and tested the code and found that it is the frame->Invalidate(r, PR_FALSE) call that consumes most of the CPU. I wonder if there is some technique that could improve the performance? I am not really familiar with Mozilla's architecture and have just started to learn it, so can you give me some clues about the performance? Thanks!
Comment 6•16 years ago
Any plans on committing/enabling for Linux nightlies now that bug 382267 patches are in? Audio for me right now is very messed up, and I suspect this patch will sort those issues out. Thanks!
Updated•16 years ago
Component: DOM: Core & HTML → Video/Audio
QA Contact: general → video.audio
Reporter
Comment 7•16 years ago

This updates to trunk. Since trunk now has the Ogg decoder, some thought needs to go into how to support multiple backends. I punt on this issue for this patch: I always instantiate GStreamer for <video src="...">, but instantiate Ogg if there is a <source> with a type of application/ogg, etc.
Attachment #327910 -
Attachment is obsolete: true
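The fallback scheme described in comment 7 — pick the native Ogg decoder when a <source> declares an Ogg container type, otherwise fall through to GStreamer — can be sketched roughly as follows. The function and type names here are illustrative, not taken from the patch:

```c
#include <assert.h>
#include <string.h>

typedef enum { BACKEND_OGG, BACKEND_GSTREAMER } backend_t;

/* Hypothetical sketch of the selection rule from comment 7: a <source>
 * with an Ogg container type gets the native Ogg decoder, while a bare
 * <video src="..."> (no known type) falls through to GStreamer. */
static backend_t pick_backend(const char *mime_type)
{
  if (mime_type &&
      (strcmp(mime_type, "application/ogg") == 0 ||
       strcmp(mime_type, "video/ogg") == 0 ||
       strcmp(mime_type, "audio/ogg") == 0)) {
    return BACKEND_OGG;
  }
  /* Unknown or absent type: let GStreamer figure it out. */
  return BACKEND_GSTREAMER;
}
```

This matches the "punt" described above: only explicitly-declared Ogg sources bypass GStreamer.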
Comment 9•16 years ago

The patch fails to build here with:

nsHTMLMediaElement.cpp: In member function 'nsresult nsHTMLMediaElement::PickMediaElement(nsAString_internal&)':
nsHTMLMediaElement.cpp:599: error: cannot allocate an object of abstract type 'nsGStreamerDecoder'
../../../../dist/include/content/nsGStreamerDecoder.h:65: note: because the following virtual functions are pure within 'nsGStreamerDecoder':
../../../../dist/include/content/nsVideoDecoder.h:77: note: virtual nsIPrincipal* nsVideoDecoder::GetCurrentPrincipal()
nsHTMLMediaElement.cpp:639: error: cannot allocate an object of abstract type 'nsGStreamerDecoder'
../../../../dist/include/content/nsGStreamerDecoder.h:65: note: since type 'nsGStreamerDecoder' has pure virtual functions
make[6]: *** [nsHTMLMediaElement.o] Error 1
Reporter
Comment 10•16 years ago

Update to build and run on trunk. This patch needs some work, which it'll be getting post-3.1.
Attachment #336911 -
Attachment is obsolete: true
Updated•16 years ago
tracking-fennec: --- → 1.0+
Comment 11•16 years ago

Please contact me if you are going to look at this; there are a number of issues that you should be aware of - it is "work in progress".
Comment 12•16 years ago

Same comment as for the previous one: please contact me if you want to have a look at this, there are several issues :)
Attachment #376425 -
Attachment is obsolete: true
Comment 14•16 years ago

(In reply to comment #13)
> Is there a working try server build?

Sorry, no try server build. The status of the change is that it has value from a source-code perspective, not as code that actually runs on a real device.
Flags: wanted1.9.1+
Updated•16 years ago
Assignee: chris.double → mkristoffersen
Updated•16 years ago
Attachment #376428 -
Attachment is patch: true
Comment 15•16 years ago

If you actually try to watch a video with this code, your browser might hang.
Comment 16•16 years ago
Comment 17•16 years ago
Comment 18•16 years ago

This is still mostly interesting from a source-code point of view; the code doesn't do much good on the target.
Attachment #345597 -
Attachment is obsolete: true
Attachment #376428 -
Attachment is obsolete: true
Attachment #380178 -
Attachment is obsolete: true
Comment 19•16 years ago

(In reply to comment #18)
> Created an attachment (id=381129) [details]
> Dump where audio and video is played back when running fennec on the pc
> (doesn't work in target)
>
> This is still mostly interesting from a source code point of view - The code
> doesn't do much good in the target.

Oh yes, and still only expect a single video to be played between each launch of fennec. And... add "ac_add_options --enable-gstreamer" to your mozconfig in order to enable the code - thanks to blassey for providing the build system fix :)
Comment 20•16 years ago

This version has partial support for playback on the target; the usual stuff (no audio in the target version, only one playback per browser launch, etc.) applies.
Updated•16 years ago
Attachment #381476 -
Attachment is patch: true
Attachment #381476 -
Attachment mime type: application/octet-stream → text/plain
Comment 21•16 years ago

The usual stuff applies - e.g. no playback on the target, hard-coded to start playback - work in progress.
Attachment #381129 -
Attachment is obsolete: true
Attachment #381476 -
Attachment is obsolete: true
Comment 22•16 years ago

Updated the patch so it's easier to apply. Thanks for removing the bitrot, Mike.
Attachment #382679 -
Attachment is obsolete: true
Comment 23•16 years ago

Fubar'd the last patch; this one should apply everything needed now, and works against trunk.
Attachment #382994 -
Attachment is obsolete: true
Updated•16 years ago
Attachment #383183 -
Attachment is patch: true
Attachment #383183 -
Attachment mime type: application/octet-stream → text/plain
Comment 24•15 years ago

This patch is functionally like the others, except that an extra buffer has been added to the audio stream. The stream input and input buffer have been collected in a bin, and the audio and video sinks have moved to their own individual bins (it will now be easier to add a volume control element, etc.). The code has been split up into several individual classes to improve sanity and structure, making it less scary to fiddle with the individual pieces. Work in progress: I still need to eliminate about 7 functions and move about 16 before the refactoring is complete. Unless you have a need to actually modify the GStreamer integration code, please continue to use patch 383183 - I'll post an updated version to replace that patch once the refactoring is complete and I have moved to TOT.
Updated•15 years ago
Attachment #383996 -
Attachment is patch: true
Attachment #383996 -
Attachment mime type: application/octet-stream → text/plain
Comment 25•15 years ago

Added a buffer for the video stream; fixed a ref-count issue for the GStreamer elements.
Attachment #383183 -
Attachment is obsolete: true
Attachment #383996 -
Attachment is obsolete: true
Comment 26•15 years ago

Forgot to add the new files.
Attachment #384400 -
Attachment is obsolete: true
Comment 28•15 years ago

Applying the latest patch I get:

applying diff_nsMediaStream
unable to find 'content/media/video/public/Makefile.in' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/public/Makefile.in.rej
unable to find 'content/media/video/public/nsMediaDecoder.h' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/public/nsMediaDecoder.h.rej
unable to find 'content/media/video/src/Makefile.in' for patching
1 out of 1 hunks FAILED -- saving rejects to file content/media/video/src/Makefile.in.rej
patch failed, unable to continue (try -v)
content/media/video/public/Makefile.in: No such file or directory
content/media/video/public/nsMediaDecoder.h: No such file or directory
content/media/video/src/Makefile.in: No such file or directory
content/media/video/src/Makefile.in not tracked!
content/media/video/public/nsMediaDecoder.h not tracked!
content/media/video/public/Makefile.in not tracked!
patch failed, rejects left in working dir
Reporter
Comment 29•15 years ago

The content/media directory was restructured in bug 499880, which landed on trunk yesterday. The files have changed directory.
Comment 30•15 years ago

I'm testing a patch now to update to tip. I have created a gstreamer directory at content/media/gstreamer; it seemed to fit with the scheme you guys are adopting. Let me know if that's utter nonsense and I'll refactor. Otherwise I'll update with a patch when I know it's building/running right.
Comment 31•15 years ago

I have updated the patch. Check my previous comment regarding the directory structure; I'm not sure if I made the right choice here. Otherwise, this builds.
Attachment #384974 -
Attachment is obsolete: true
Directory structure looks good.
Comment 34•15 years ago

Fits the same changelist as the previous patch. It hasn't been tested on anything but a single MPEG, but it is nevertheless an important step :) (And by the way, it is skipping frames, even though the algorithm isn't quite right yet.) I have noticed a few times that the audio might start to play back for 0..n seconds before the first frame of the video is shown. Also, give it time - it can look stuck at times - and don't try it with anything but MPEG version 1 or 2 (or you will be stuck forever and ever and...)
Attachment #385894 -
Attachment is obsolete: true
Comment 37•15 years ago

Note that audio is down (again) on the N810 with this patch.
Attachment #388236 -
Attachment is obsolete: true
Updated•15 years ago
Attachment #390515 -
Attachment is patch: true
Attachment #390515 -
Attachment mime type: application/octet-stream → text/plain
Attachment #390515 -
Flags: review?(chris.double)
Comment 39•15 years ago

NOTE: The review requested should be seen as a preliminary one; Chris has received an e-mail with the details (known issues and specific questions).
Comment 40•15 years ago

This patch is not compliant with Mozilla coding standards - in particular, the GStreamer sink element is still close to its original non-Mozilla form. I would like to keep it this way for a little while to limit the merging task for me and the GStreamer developer who created the original element; when the element is stable I'll convert it to Mozilla style, and might even consider removing the gotos in there... I didn't mark this patch as obsoleting the previous one, as the previous one is significantly more stable than this one. This patch earns its place because it has the correct video sink element and a significantly higher framerate on the N810.
Comment 42•15 years ago

There are a lot of mode diffs in this diff - if anyone knows how to get rid of them, let me know :)
Attachment #390515 -
Attachment is obsolete: true
Attachment #394590 -
Attachment is obsolete: true
Attachment #390515 -
Flags: review?(chris.double)
Comment 43•15 years ago

The patch is working as expected on the N810; there is something fundamentally wrong on the N900 (as in "video doesn't work", at least on my device).
Attachment #400466 -
Attachment is obsolete: true
Updated•15 years ago
Attachment #403317 -
Flags: review?(chris.double)
Comment 44•15 years ago

Comment on attachment 403317 [details] [diff] [review]
Fits changeset: 33238:e915fafc9ba5 (morning of Sep 28)

Not expected to pass review; as agreed, review is requested to get feedback on what needs to be fixed before review can be approved for an initial submit.
Comment 45•15 years ago
Attachment #403317 -
Attachment is obsolete: true
Attachment #404113 -
Flags: review?
Attachment #403317 -
Flags: review?(chris.double)
Updated•15 years ago
Attachment #404113 -
Flags: review? → review?(chris.double)
Comment 46•15 years ago

(In reply to comment #45)
> Created an attachment (id=404113) [details]
> Suspend/resume partly working

Btw - sorry for the mode changes in the patch, they shouldn't be there.
Comment 47•15 years ago

Fits change: 33827:f1644b493ff5. There is still an issue resuming one specific video, where the audio playback fails during normal playback when I test on the PC.
Attachment #404113 -
Attachment is obsolete: true
Attachment #404113 -
Flags: review?(chris.double)
Updated•15 years ago
Attachment #406209 -
Flags: review?(chris.double)
Reporter
Comment 48•15 years ago

The patch in attachment 406209 [details] [diff] [review] fails to apply.
Reporter
Comment 49•15 years ago

There seem to be a bunch of whitespace-only changes in files like nsCanvasRenderingContext2D.cpp. Can you generate a patch without these whitespace diffs to make reviewing easier?
Reporter
Comment 50•15 years ago

Comment on attachment 406209 [details] [diff] [review]
Now with suspend-resume working for supported videos

// Part 1

+  // Check if any decoders knows about the mime type

Fix the grammar.

+  // We have not found a decoder yet, so make sure we don't think so

This comment isn't very well worded. Can you change it?

diff --git a/content/media/gstreamer/Makefile.in b/content/media/gstreamer/Makefile.in
+# Contributor(s):
+#   Chris Double <chris.double@double.co.nz>

Add your name as a contributor.

+PRBool nsGStreamerAudioBin::Create()
+{
...
+  mAudioBin = GST_BIN_CAST(gst_bin_new("mAudioBin"));

This can fail, and mAudioBin is later used a few lines down in gst_object_sink. Detect and handle the failure here. Same with the following gst_element_factory_make calls.

+  // Create a buffer for the audio => Running in separate thread

What does 'running in separate thread' mean? How does the call to gst_element_factory_make cause it to run in a different thread?

+  // Find the best possible sink
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("gconfaudiosink", "mAudioSink");
+  }
+  if (!mAudioSink) {
+    mAudioSink = gst_element_factory_make("pulsesink", "mAudioSink");
+  }
...etc...

Can we make these sinks, and the order they're tried, a pref? Maybe a comma-separated list of sink names in the order they should be tried.

+  // Now stuff gets a little complicated - cause on the N810 our only choice for
+  // audio sink is the "autoaudiosink" which we at this time can't add to
+  // our bin, as we need to set it's filter property before we enter the pause
+  // state.

"its" not "it's"

+  //
+  // So that's the reason we in the below code treat mAudioSink a little special

+  // Take ownership of the elements
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBuffer)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioConvert)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioSink)));
+  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioVolume)));

Can we take ownership of these in the same place that we create them?

+  guint64 max_buffer_time_in_ns = 10; // 10 seconds
+  max_buffer_time_in_ns *= 1000;
+  max_buffer_time_in_ns *= 1000;
+  max_buffer_time_in_ns *= 1000;
+  guint64 min_threshold_in_ns = 2; // 2 seconds
+//  min_threshold_in_ns *= 1000;
+//  min_threshold_in_ns *= 1000;
+//  min_threshold_in_ns *= 1000;

Why are these 3 lines for min_threshold_in_ns commented out, but the max_buffer_time_in_ns ones are not?

+  // TODO: Do error handling / cleanup

This needs to be done for review...

+void nsGStreamerAudioBin::Destroy()
+{
+  if (mAudioBin) {
+    gst_object_unref(GST_OBJECT(mAudioBin));
+  }
+  if (mAudioSink) {
+    gst_object_unref(GST_OBJECT(mAudioSink));
+  }
+  if (mAudioConvert) {
+    gst_object_unref(GST_OBJECT(mAudioConvert));
+  }
+  if (mAudioBuffer) {
+    gst_object_unref(GST_OBJECT(mAudioBuffer));
+  }
+  if (mAudioVolume) {
+    gst_object_unref(GST_OBJECT(mAudioVolume));
+  }

gst_object_unref is safe to call with a null object; you don't need the tests.

+  // Try to move the audio sink into the ready state
+  gst_element_set_state(mAudioSink, GST_STATE_PAUSED);

Handle or log failure if the state fails to be set.

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the audio bin

Add comments describing what components the audio bin is composed of and how they are linked together.
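The reviewer's suggestion of a pref listing sink names in preference order could look roughly like the sketch below. The pref name and the parsing helper are assumptions, and a stubbed factory stands in for gst_element_factory_make so the fallback logic can be shown self-contained:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Stand-in for gst_element_factory_make(): here we pretend that only
 * "pulsesink" exists on the system. */
static const char *try_make_sink(const char *name)
{
  return strcmp(name, "pulsesink") == 0 ? name : NULL;
}

/* Walk a comma-separated preference list (e.g. the value of a
 * hypothetical "media.gstreamer.audiosinks" pref) and return the first
 * sink name the factory can create, or NULL if none work. */
static const char *pick_audio_sink(const char *pref, char *buf, size_t buflen)
{
  while (pref && *pref) {
    const char *comma = strchr(pref, ',');
    size_t len = comma ? (size_t)(comma - pref) : strlen(pref);
    if (len < buflen) {
      memcpy(buf, pref, len);
      buf[len] = '\0';
      if (try_make_sink(buf))
        return buf;
    }
    pref = comma ? comma + 1 : NULL;
  }
  return NULL;
}
```

With the stub above, a pref of "gconfaudiosink,pulsesink,alsasink" would resolve to "pulsesink", replacing the hard-coded if-chain quoted in the review.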
Reporter
Comment 51•15 years ago

+#ifndef nsGStreamerAudioPropertiesInterface_h__
+#define nsGStreamerAudioPropertiesInterface_h__
+
+////////////////////////////////////////////////////////////////////////////////
+// Interface class to modify audio properties

Describe in more detail what this class is used for. Who should inherit from it? Who calls it? Does anyone maintain a reference to it - if so, who controls ownership of it? Will an instance of this class ever have delete called on it from an nsGStreamerAudioPropertiesInterface pointer? If so, does it need a virtual destructor?

+#ifndef nsGStreamerConfig_h__
...
+// Define the max size of video elements in pixels
+#define MAX_VIDEO_WIDTH 1024
+#define MAX_VIDEO_HEIGHT 1024

Where did these numbers come from? Does this mean 1080p HD video won't play?

+// Define how much data we should try to read at a time
+#define GST_BUFFER_READ_CHUNK_SIZE 1024*16

What's the origin of this size? Maybe add a comment about why you chose that value.

From nsGStreamerDecodeBin.cpp:

+nsGStreamerDecodeBin::nsGStreamerDecodeBin()
+  : mDecodeBin(nsnull)
+{
+}

Why is the other member variable, mPadDest, not set to an initial value?

+PRBool nsGStreamerDecodeBin::Create(nsGStreamerDecodeBinPadInterface *aPadDest)
+{
...
+  // If we got the decode bin, then configure it
+  if (mDecodeBin) {
+    // The decodebin won't make any output pads, until it knows something about
+    // it's input data since it only at that time can determine the kind of
+    // output, if any, it can generate

"its" not "it's"

+  // Constructor
+  // Destructor

Comments like this aren't really needed. It's obvious when it's a constructor or a destructor.
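For what it's worth, the effect of the quoted MAX_VIDEO_WIDTH/MAX_VIDEO_HEIGHT defines can be illustrated with a minimal check. The function name is made up; the 1024 caps are the values from the patch, which is exactly why the reviewer asks whether 1080p would be rejected:

```c
#include <assert.h>

#define MAX_VIDEO_WIDTH  1024
#define MAX_VIDEO_HEIGHT 1024

/* Illustrative: reject any frame size above the compile-time caps. */
static int video_size_allowed(int width, int height)
{
  return width > 0 && height > 0 &&
         width <= MAX_VIDEO_WIDTH && height <= MAX_VIDEO_HEIGHT;
}
```

With these values a 1920x1080 stream is indeed rejected, which supports the reviewer's concern about 1080p HD video.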
Reporter
Comment 52•15 years ago

// Part 3

+++ b/content/media/gstreamer/nsGStreamerDecoder.h
...
+ * Contributor(s):
+ *   Chris Double <chris.double@double.co.nz>

Add your name as a contributor.

+class nsGStreamerDecoder : public nsMediaDecoder
...
+  // Function to create "new" an instance

This comment needs rephrasing to be more readable.

+  // It's done in this way to allow GStreamer data types in the "real" class
+  static nsGStreamerDecoder *makeNewInstance();

I'm not sure what you mean by "to allow GStreamer data types in the 'real' class". Can you expand on this?

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of communication with the media element

Go into more detail about why this class exists. I presume it's something to do with thread safety?

+  // Constructor
+  // Virtual destructor

Comments like this aren't needed.

Add comments to the methods to explain what threads they can safely be called on. Add assertions to the methods that are main thread only to ensure they are called on the main thread only, e.g.:

NS_ASSERTION(NS_IsMainThread(), "Should be called on main thread");

+  nsHTMLMediaElement* mElement;

Add a comment about what threads can use mElement (i.e. main thread only?). This is not a counted pointer - how is ownership of mElement maintained? For an example of the type of comment needed, see the 'mElement' comment in nsMediaDecoder.h, which describes the thread usage and why it's not reference counted.
Reporter
Comment 53•15 years ago

+++ b/content/media/gstreamer/nsGStreamerInputBin.cpp
@@ -0,0 +1,268 @@
+nsGStreamerInputBin::nsGStreamerInputBin()
+  : mInputBin(nsnull),
+    mStreamSrc(nsnull),
+    mInputBuffer(nsnull),
+    mElementWrapper(nsnull)
+{
+}

mBufStatusInterface is missing from the list of things set to nsnull.

+PRBool nsGStreamerInputBin::Create(nsMediaStream *aMediaStream,
+                                   nsGStreamerInputBufferStatusInterface* aBufStatusInterface)
+{

Everything I've said about previous ::Create implementations should be addressed here too.

+      && mStreamSrc
+      && mStreamSrc

mStreamSrc listed twice.

+  const guint max_buffer_buffers = 0;
+  const guint max_buffer_size_in_bytes = 16*1024; // 512*1024
+  const guint64 max_buffer_time_in_ns = 0;
+  const guint min_threshold_in_bytes = 0; // 512*1024
+  const guint64 min_threshold_in_ns = 0;
+  const guint min_threshold_buffers = 0;

How did you decide on what numbers to use here?

+  gpointer gvStatusInterface = (nsGStreamerStreamStatusInterface *)this;

Use C++ casts.

diff --git a/content/media/gstreamer/nsGStreamerInputBin.h b/content/media/gstreamer/nsGStreamerInputBin.h

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the input buffer

Expand on this. Where is it used, and what for? How does it fit into the overall architecture of the GStreamer decoder?

+  // Function to create an inputbuffer
+  // aChannel [in]: channel for incoming data
+  // aStreamListener [out]: pointer to IStreamListener
+  PRBool Create(nsMediaStream *aMediaStream,
+                nsGStreamerInputBufferStatusInterface* aBufStatusInterface);

The comment makes no sense. Where are 'aChannel' and 'aStreamListener'?

+  nsGStreamerInputBufferStatusInterface* mBufStatusInterface;

Needs a comment to say what this is for, who owns it, what threads it can be used on, etc.

+  //Takes care of element com.
+  nsRefPtr<nsGStreamerElementWrapper> mElementWrapper;

What does this mean? What is 'com'?

+  // Called to inform that EOF has been reached
+  virtual void EOFReached();

Called by whom?

+++ b/content/media/gstreamer/nsGStreamerInputBufferStatusInterface.h

+  // Called to inform state machine of buffer status

What state machine?

+++ b/content/media/gstreamer/nsGStreamerLib.cpp
@@ -0,0 +1,243 @@

+void nsGStreamerLib::BringGStreamerDown()
+{
+  switch (mGStreamerLibStatus)

Reading mGStreamerLibStatus without taking the mGStreamerMutex lock. The switch also reads mNumberOfInstances.

+  // The gstreamer lib is now down
+  mGStreamerLibStatus = GSTREAM_GDOWN_LIB_DOWN;

This should be protected by the lock?

+////////////////////////////////////////////////////////////////////////////////
+// Class that handels the GStreamer library,
+// registration of plugins etc. MT safe

This class needs more comments to say which methods must have the lock obtained before calling them. For example, the implementation obtains the lock and then calls BringGStreamerDown(); the comment for BringGStreamerDown should say that it must be called with the lock obtained. Ideally the implementation would assert this fact.

+  // The last thing we should do, befor termination of the program
+  // is to terminate the GStreamer library
+  class Terminator
+  {
+  public:
+    ~Terminator();
+  };
+
+  static Terminator terminator;

Why not do this in the destructor of nsGStreamerLib instead of having the helper object?

+++ b/content/media/gstreamer/nsGStreamerLink.cpp

+  // Only shut it down ones

"once" not "ones"

+////////////////////////////////////////////////////////////////////////////////
+// Class that links everything GStreamer related together
+
+class nsGStreamerLink : public nsGStreamerDecoder

Why is this a derived class of nsGStreamerDecoder? Why not put all this in nsGStreamerDecoder itself? Nothing else derives from nsGStreamerDecoder. Please describe what members and methods are main thread only, and what can be called on any thread. Describe the thread safety requirements.
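The locking problem the reviewer points out in nsGStreamerLib (shared status read outside the mutex) is the classic refcounted init/teardown pattern. A minimal sketch, with made-up names and pthread primitives standing in for NSPR locking, looks like this; every read and write of the shared state happens with the lock held:

```c
#include <assert.h>
#include <pthread.h>

static pthread_mutex_t g_lib_mutex = PTHREAD_MUTEX_INITIALIZER;
static int g_instances = 0; /* stands in for mNumberOfInstances */
static int g_lib_up = 0;    /* stands in for mGStreamerLibStatus */

/* Called when a decoder is created; brings the library up on first use.
 * (A real version would call gst_init() where noted.) */
static void lib_addref(void)
{
  pthread_mutex_lock(&g_lib_mutex);
  if (g_instances++ == 0) {
    /* gst_init(NULL, NULL); */
    g_lib_up = 1;
  }
  pthread_mutex_unlock(&g_lib_mutex);
}

/* Called when a decoder is destroyed; tears the library down with the
 * lock held, avoiding the unlocked read the review flagged. */
static void lib_release(void)
{
  pthread_mutex_lock(&g_lib_mutex);
  if (--g_instances == 0) {
    /* gst_deinit(); */
    g_lib_up = 0;
  }
  pthread_mutex_unlock(&g_lib_mutex);
}
```

Folding the teardown into lib_release also answers the reviewer's question about the static Terminator helper: the last release can do the cleanup directly.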
Reporter
Comment 54•15 years ago

+++ b/content/media/gstreamer/nsGStreamerPipeline.cpp

+nsGStreamerPipeline::nsGStreamerPipeline()
+  : mStreamPipeline(nsnull),
+    mAudioPadSource(nsnull),
+    mAudioPadSink(nsnull),
+    mVideoPadSource(nsnull),
+    mVideoPadSink(nsnull)
+{
+}

Some member variables are not being set (e.g. mStreamerDecoder).

+gboolean nsGStreamerPipeline::BusCallBack(GstBus *aBus,
+                                          GstMessage *aMessage,
+                                          nsGStreamerPipeline *aMe)
+{

Is this called on a non-main thread? If so, how is thread safety assured when using member variables like:

+  aMe->mGStreamerState.PositionAtEOF();

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the GStreamer pipeline

Do any of the static member functions in this class get called on different threads? If so, how is access to methods/members done safely on the different threads? Please add comments about which members are main thread only, or safe to read and write on other threads.

+++ b/content/media/gstreamer/nsGStreamerPipelineState.h

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the GStreamer pipeline state

Expand on where this class is used, what it is for, what threads it can safely be used on, etc. Comment all methods/members that can or cannot be called on other threads.

+  // Destructor
+  ~nsGStreamerPipelineState();

Does this need to be virtual? Is it ever deleted using a pointer to one of its base classes?

+  // Tell state machine if it should play or stop(pause)
+  // bPlay = PR_TRUE => playback starts when conditions are right
+  // bPlay = PR_FALSE => playback is paused and will not start
+  virtual void PlayWhenPossible(PRBool aShouldPlay);

bPlay should be aShouldPlay.

+++ b/content/media/gstreamer/nsGStreamerVideoBin.cpp

+PRBool nsGStreamerVideoBin::Create(
+  nsGStreamerVideoPrerollCompleteInterface *aVideoPrerollComplete)
+{

Everything I've said about the other ::Create methods should be done here too.

+  aMe->mAspect = (pixel_aspect[0] == 0) ?
+    ((gdouble)aMe->mWidth/(gdouble)aMe->mHeight) :
+    ((gdouble)pixel_aspect[0] /
+     (gdouble)pixel_aspect[1] *
+     (gdouble)aMe->mWidth/(gdouble)aMe->mHeight);
...
+  aMe->mFramerate = (gdouble)pixel_aspect[0]/(gdouble)pixel_aspect[1];

Can pixel_aspect[1] ever be zero? If so, guard against division by zero.

+++ b/content/media/gstreamer/nsGStreamerVideoBin.h

+////////////////////////////////////////////////////////////////////////////////
+// Class that takes care of the video sink bin

Same documentation requirements as mentioned previously.

+  // Mutex to control the access to mIsScreenPreviouslyInvalidated
+  GMutex *mScreenInvalidMutex;
+
+  // Condition that is set when screen has been invalidated
+  GCond *mScreenInvalidatedCond;

Why not use NSPR locking types?
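The division-by-zero concern about pixel_aspect[1] can be addressed with a small guard. This sketch mirrors the aspect computation quoted in the review, falling back to a square pixel aspect when the fraction is missing or its denominator is zero (the fallback choice is mine, not the patch's):

```c
#include <assert.h>

/* Compute the display aspect from the frame size and the pixel-aspect
 * fraction, guarding both the "no aspect given" case (numerator 0, as
 * in the quoted code) and a zero denominator. */
static double display_aspect(int width, int height,
                             int par_num, int par_den)
{
  double frame = (double)width / (double)height;
  if (par_num == 0 || par_den == 0)
    return frame; /* assume square pixels */
  return ((double)par_num / (double)par_den) * frame;
}
```

The same denominator check would also protect the mFramerate computation quoted above.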
Reporter
Comment 55•15 years ago

Somewhere, perhaps in nsGStreamerDecoder.h, I'd like to see some comments describing the overall architecture of the GStreamer decoder: how all the classes work together, what the lifetimes of the various objects are, and what threads are used. As an example, see nsOggDecoder.h; something like the ASCII diagram in there that maps the various states of the decoder would be nice as well. I'll wait till the patch is cleaned up before reviewing further, since it's hard to tell in a 15,000-line patch where the actual code changes are vs whitespace changes.
Reporter
Updated•15 years ago
Attachment #406209 -
Flags: review?(chris.double) → review-
Reporter
Comment 56•15 years ago

There needs to be a way of testing this backend. If the mochitests are run, will this use only the Ogg backend for the Ogg files rather than the GStreamer backend? If so, you'll need to add test files that GStreamer will use. Do the tinderboxes have GStreamer installed?
It would be best if it was possible to run with GStreamer enabled for Ogg and Wave, so we can check that it passes all the existing tests.
Comment 58•15 years ago

As you know, the patch isn't complete yet and hence won't pass any thorough testing - but I'm fine with changing it so that it will be possible to use the existing tests to test it (at least partially). To use GStreamer to run the Ogg tests, I guess we could just disable the "native" Ogg decoder? I believe this can currently be done at compile time - I assume that won't be good enough. Would it be an idea to be able to turn the native Ogg decoder and GStreamer on and off through the runtime system as well as at compile time?
I think compile time is good enough. If --disable-ogg --disable-wave --enable-gstreamer are given, then the GStreamer backend should be able to play Ogg and Wave files.
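A sketch of how such compile-time backend selection typically looks follows; the macro names use Mozilla's MOZ_* convention but are assumptions here, not taken from the patch:

```c
#include <assert.h>
#include <string.h>

/* Hypothetical configure-driven defines: --enable-gstreamer would set
 * MOZ_GSTREAMER, while --disable-ogg leaves MOZ_OGG undefined. Here we
 * simulate the --disable-ogg --enable-gstreamer configuration. */
#define MOZ_GSTREAMER 1
/* #define MOZ_OGG 1 */

/* Decide which backend handles an Ogg stream under this build config:
 * the native decoder wins when compiled in, GStreamer is the fallback. */
static const char *backend_for_ogg(void)
{
#ifdef MOZ_OGG
  return "ogg";
#elif defined(MOZ_GSTREAMER)
  return "gstreamer";
#else
  return "none";
#endif
}
```

Under this scheme, the build flags alone determine whether the GStreamer backend picks up the Ogg and Wave test files, which is what makes the existing test suite reusable.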
Reporter
Comment 60•15 years ago

(In reply to comment #59)
> I think compile time is good enough. If --disable-ogg --disable-wave
> --enable-gstreamer are given, then the GStreamer backend should be able to play
> Ogg and Wave files.

I'm pretty sure this will work the way things are #if'd in the patch.
Comment 61•15 years ago

It is definitely made with the intention that disabling Ogg at compile time will enable GStreamer :) So if it doesn't work that way, it should be fixed to work that way. I'll not do anything about making it runtime-selectable, then. Come to think of it, is it fine that GStreamer is enabled by a define while Ogg and Wave are disabled by a define - or should it be similar for all? Another question: assuming the first version of this patch is going to be submitted without being functionally complete or perfect, which things are a "must do" before it can be submitted?
Comment 62•15 years ago

I tried the last patch on an N900 device:

<video autoplay="true" src="star_trek_720_400_2100kbps.avi" controls="true"/>

And it does not work very well. It ends with:

dsp_thread: failed waiting for events

Only the first frame is shown on the page, and nothing more happens.
(In reply to comment #61)
> Came to think of, is it fine that GStreamer is enabled by a define, and ogg and
> wave are disabled by a define - or should it be similar for all?

I think that's fine.

> Assuming the first version of this patch is going to be submitted without being
> functional complete or perfect - which things are a "must do" before it can be
> submitted?

That's a good question. Perhaps the best approach is to say that code that has been reviewed can land, even if it doesn't work.
Comment 64•15 years ago

(In reply to comment #62)
> Created an attachment (id=408675) [details]
> Output log for ms-video gstreamer play
>
> I tried last patch on N900 device,
> <video autoplay="true" src="star_trek_720_400_2100kbps.avi" controls="true"/>
>
> And it does not work very well.
> it is end with
> dsp_thread: failed waiting for events
> Only first frame shown on the page, and nothing more happening

Interesting... I had a look at the log; nothing springs out as an explanation. Are you running with the source file locally or from the web? Did you try to play the .avi file back with the built-in player on the device? Is the file generating this behaviour publicly available?
Comment 65•15 years ago

(In reply to comment #63)
> That's a good question. Perhaps the best approach is to say that code that has
> been reviewed can land, even if it doesn't work.

I'm not sure I understand your meaning with the above - I'm seeing two extreme interpretations:

1) The code can be submitted "as is", as long as we have a list of the things that must be fixed in it later (could be OK, as it won't be enabled unless the build options are changed to include --enable-gstreamer), but there could still be stuff that _must_ be fixed before a submit - like if I mess up "public" files...

2) Things caught in the review need to be fixed, but the review is not concerned with whether the patch works or not.

As I said, I didn't quite understand the comment, so the right answer could be something completely different...
Reporter
Comment 66•15 years ago
(In reply to comment #65)
> 2) Things caught in the review need to be fixed, but the review is not
> concerned with whether the patch works or not.

The patch needs to get r+, which means all review comments need to be addressed. Given that the gstreamer support will be disabled on landing, it doesn't need to be complete. I'd hope that videos can at least play, though (seeking, etc. is not so important).
(In reply to comment #65)
> 2) Things caught in the review need to be fixed, but the review is not
> concerned with whether the patch works or not.

This is what I meant.
Comment 68•15 years ago
(In reply to comment #67)
> (In reply to comment #65)
> > 2) Things caught in the review need to be fixed, but the review is not
> > concerned with whether the patch works or not.
>
> This is what I meant.

Thanks - I don't understand it, but I can comply with it :) I would be interested in discussing this to improve my understanding of the development process in Mozilla next time we meet in person - maybe in December?
There's no policy here, just one-off decision making. I think generally we want code that's in our tree to have been reviewed and not require more review before it gets enabled for real. Otherwise it's too difficult to track what reviews need to be done, and to actually get people to do the reviews and then address the review comments --- normally, getting your code on trunk is the reward that encourages you to go through the review process :-). However, apart from that, we want your code on trunk as soon as possible. So it's OK if it's incomplete, as long as we understand what's there and what's missing.
Comment 70•15 years ago
> Interesting... Had a look at the log, nothing springs out as an explanation -
> are you running with the source file locally or from the web?

The source file and the video.html file are both from the web.

> Did you try to play the .avi file back with the built-in player on the device?

Yes, it is one of the pre-installed videos, and it works fine with the built-in player.

> Is the file generating this behaviour publicly available?

Not sure; I think you can try the same with one of the official pre-installed videos from an N900 device.
Comment 71•15 years ago
(In reply to comment #69)
> There's no policy here, just one-off decision making.

Which must also reflect some kind of policy, but never mind that now...

> I think generally we want code that's in our tree to have been reviewed and
> not require more review before it gets enabled for real. Otherwise it's too
> difficult to track what reviews need to be done, and to actually get people
> to do the reviews and then address the review comments --- normally, getting
> your code on trunk is the reward that encourages you to go through the review
> process :-).

Yes, I agree with that; if we let anything sub-standard into the trunk, then there is a very real risk that it will stay that way and not be fixed - but it is not always what I think that counts, but what the "Mozilla way" is, and that's why I have to ask :)

> However, apart from that, we want your code on trunk as soon as possible. So
> it's OK if it's incomplete, as long as we understand what's there and what's
> missing.

I think what I don't understand is where that hair-fine line is - between what is "incomplete" and what is an "error" - but luckily, in this particular case, that is not my problem, as that will be up to the reviewer :)
Comment 72•15 years ago
(In reply to comment #70)

Thanks, I'll try to reproduce it and find an explanation; it might be a while though...
Comment 73•15 years ago
Also with the default Nokia_N900.mp4 video, behavior is not very good. For the whole video (1:15) it shows ~3-5 frames, i.e. ~0.06 FPS.
Comment 74•15 years ago
(In reply to comment #73)
> Also with default Nokia_N900.mp4 video, behavior is not very good.
> For whole video (1:15) it show ~3-5 frames, ~0.06 FPS.

Ahh, performance - it is... should we say... an issue :)

The GStreamer integration in the media framework is mainly a question of feeding it data at one end and removing decoded video frames at the other end, so there is not much room for optimization there. There might be some performance boost from better synchronisation between the decoding and the actual rendering, but my feeling is that the best path for optimisation is to optimise the surroundings. I have tried creating a small stand-alone program that has only the GStreamer integration, pushing the data directly to an X window and using a file source for the data - this has significantly better performance on target than what is seen when we run it with the overhead of Fennec :( - The plan is to be feature complete before I spend time optimizing the surroundings of the integration.
I don't see how the "browser overhead" can account for a 400X slowdown. Something must be fundamentally broken.
Reporter
Comment 76•15 years ago
What is the CPU usage of fennec when it's playing a video on the N900?
Comment 77•15 years ago
(In reply to comment #75)
> I don't see how the "browser overhead" can account for a 400X slowdown.
> Something must be fundamentally broken.

No, one shouldn't think browser overhead could account for that... and yes, something seems to be broken :)
Comment 78•15 years ago
(In reply to comment #76)
> What is the CPU usage of fennec when it's playing a video on the N900?

I'm not spending time on optimization at this point, so I won't be able to tell :)
Comment 79•15 years ago
I think we should also request the list of supported MIME types from GStreamer, update Gecko-Content-Viewers, and make it possible to open video supported by GStreamer as a MediaDocument.
Reporter | ||
Comment 80•15 years ago
If getting the list of supported MIME types from GStreamer is possible, can it be used to properly implement canPlayType? In the current patch that's set to return 'maybe' for GStreamer, iirc.
Comment 81•15 years ago
You can get a list of GStreamer's "caps types" from GStreamer, if you want. It is of dubious utility to do so. These look like, and in some cases are the same as, MIME types. They are NOT in general MIME types, and should NOT be used as MIME types. There is no way to get a list of supported MIME types from GStreamer.

You could hard-code a mapping table from caps types to real MIME types, and in canPlayType() respond appropriately if the type is in your table. Of course, this too is of dubious utility, since MIME types on media files typically describe the container format, and "what you can play" is more typically tied to the codecs used inside that container.
RFC 4281 describes how to include codec information in MIME types. We support this for our built-in codecs.
Comment 83•15 years ago
Ah, right, I'd forgotten about that. You could certainly do something where you query GStreamer for demuxers, then decoders, then have some big messy table for mapping a mixture of that info into MIME-types-with-codec information. There's currently no way to find out complete detail about what a gst decoder supports - e.g. for an MPEG-4 part 2 decoder, it won't tell you what profiles/levels are available. That could probably be added to GStreamer in the future, though. Realistically, nobody's going to provide that level of detail when using canPlayType anyway, so this is not a big loss.
Comment 84•15 years ago
(In reply to comment #50)

This post is a reply to comments #50 and #51. These are responses to the review comments; the things I'm not commenting on have either been fixed as requested, or have been changed in a way that renders the comment irrelevant.

-----

It would be helpful if filename/line number were included for every review comment.

-----

nsHTMLMediaElement.cpp (1243)

  // Check if any decoders knows about the mime type

Fix gramma

^^^^ Do you mean "know" instead of "knows"?

-----

nsGStreamerAudioBin.cpp (70)

  + mAudioBin = GST_BIN_CAST(gst_bin_new("mAudioBin"));
  +

This can fail and mAudioBin is later used a few lines down in gst_object_sink. Detect and handle the failure here. Same with the following gst_element_factory_make calls.

^^^^ It is true that lines like:

  gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));

use the members before they are checked - it should really not happen in the normal case, and the above sequence can handle a 0 as input (it will complain to the console, though). So the postponing of the error handling is intentional... Adding checks before these lines would complicate the error handling without any benefit, and decrease the readability of the function significantly, so as there is no real benefit to doing the check early, I'll leave it as is - unless I'm told a good reason not to.

-----

nsGStreamerAudioBin.cpp (73)

  + // Create a buffer for the audio => Running in separate thread

What does the 'running in separate thread' mean? How does the call to gst_element_factory_make cause it to run in a different thread?

^^^^ It means that it will run the elements in the bin in its own thread. If you want two things to run in separate threads in GStreamer, you add a queue (buffer) between them.
-----

nsGStreamerAudioBin.cpp (78)

  + // Find the best possible sink
  + if (!mAudioSink) {
  +   mAudioSink = gst_element_factory_make("gconfaudiosink", "mAudioSink");
  + }
  + if (!mAudioSink) {
  +   mAudioSink = gst_element_factory_make("pulsesink", "mAudioSink");
  + }
  ...etc...

Can we make these sinks, and the order they're tried, a pref? Maybe a comma separated list of sink names in the order they should be tried.

^^^^ Sure we can; do you wish this to be done before the initial submit of the code?

-----

nsGStreamerAudioBin.cpp (115)

  + // Take ownership of the elements
  + gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBin)));
  + gst_object_sink(gst_object_ref(GST_OBJECT(mAudioBuffer)));
  + gst_object_sink(gst_object_ref(GST_OBJECT(mAudioConvert)));
  + gst_object_sink(gst_object_ref(GST_OBJECT(mAudioSink)));
  + gst_object_sink(gst_object_ref(GST_OBJECT(mAudioVolume)));

Can we take ownership of these in the same place that we create them?

^^^^ Yes we can - but do you think it will improve readability (aka how easy the code is to understand for a third party)? Consider:

  // Create elements
  A a = new A;
  B b = new B;
  C c = new C;
  // Do something to elements
  foo(a);
  foo(b);
  foo(c);
  // Return something using all elements
  return a+b+c;

against:

  // Create and do something to A
  A a = new A;
  a.foo();
  // Create and do something to B
  B b = new B;
  b.foo();
  // Create and do something to C
  C c = new C;
  c.foo();
  // Return something using all elements
  return a+b+c;

-----

nsGStreamerAudioBin.cpp (202)

  +void nsGStreamerAudioBin::Destroy()
  +{
  +  if (mAudioBin) {
  +    gst_object_unref(GST_OBJECT(mAudioBin));
  +  }
  +  if (mAudioSink) {
  +    gst_object_unref(GST_OBJECT(mAudioSink));
  +  }
  +  if (mAudioConvert) {
  +    gst_object_unref(GST_OBJECT(mAudioConvert));
  +  }
  +  if (mAudioBuffer) {
  +    gst_object_unref(GST_OBJECT(mAudioBuffer));
  +  }
  +  if (mAudioVolume) {
  +    gst_object_unref(GST_OBJECT(mAudioVolume));
  +  }

gst_object_unref is safe to call with a null object; you don't need the tests.
^^^^ Yes, it's true that gst_object_unref can handle a 0 pointer, like the gst_object_ref in the create function can (with similar logs, btw) - but the difference is that the destroy function will be called with these being 0 under normal conditions - and that would clutter the log - whereas in the create function they won't be 0 under normal conditions. That's the reason they are protected by "if"s in the destroy function and not in the create function.

-----

nsGStreamerAudioBin.cpp (268)

  + gst_element_set_state(mAudioSink, GST_STATE_PAUSED);

Handle or log failure if the state fails to be set.

^^^^ Is there a special reason that you would like a test on the return value for this particular function? I don't see it as being more fatal for the decoder than most of the other function calls that it makes.

-----

nsGStreamerAudioPropertiesInterface.h (40)

  +#ifndef nsGStreamerAudioPropertiesInterface_h__
  +#define nsGStreamerAudioPropertiesInterface_h__
  +
  +////////////////////////////////////////////////////////////////////////////////
  +// Interface class to modify audio properties

Describe in more detail what this class is used for. Who should inherit from it? Who calls it? Does anyone maintain a reference to it - if so, who controls ownership of it? Will an instance of this class ever have delete called on it from an nsGStreamerAudioPropertiesInterface pointer? If so, does it need a virtual destructor?

^^^^ I have added a description about lifetime etc. and the purpose of it - but with regards to who should inherit from it... is what you actually want a design document? - where you can see all the classes and their relationships? Such a document could either be in a stand-alone file with pretty graphics or embedded in one of the header files with square ASCII graphics. I mean, the only one that should inherit from it is the class that does - this is not an interface that has any relevance outside of the GStreamer media decoder integration - none of the code has...
I don't think it will add any value, except make a maintenance nightmare if we start to duplicate, out of context, what is obvious from the code in header files.

-----

nsGStreamerConfig.h (47)

  +// Define the max size of video elements in pixels
  +#define MAX_VIDEO_WIDTH 1024
  +#define MAX_VIDEO_HEIGHT 1024

Where did these numbers come from? Does this mean 1080P HD video won't play?

^^^^ These are magic numbers; their sole purpose is an attempt to make it impossible to kill the browser by embedding an insane size of video on a page (1001 ways to kill Fennec). I have no preference for these numbers over others - I would love to have a dynamic solution that estimated the max size based on available memory, but that is probably for an advanced implementation. I can see your point with the HD video you mentioned - I have increased the values to 2048 in the patch - which is pretty much for a handheld device anyway. Maybe this is a candidate for about:config? Or at least a compile time switch?

-----

nsGStreamerConfig.h (50)

  +// Define how much data we should try to read at a time
  +#define GST_BUFFER_READ_CHUNK_SIZE 1024*16

What's the origin of this size? Maybe comment about why you chose that value.

^^^^ No origin, I just needed a size - you might remember that I asked you about a good value for this one during our review session? But anyway - it's one of those values that can be manipulated during an optimizing session, to see what gives the best performance. I have added a comment to the value that changing it could change performance in both directions.

-----

nsGStreamerDecodeBin.cpp (47)

  +nsGStreamerDecodeBin::nsGStreamerDecodeBin()
  +  : mDecodeBin(nsnull)
  +{
  +}

Why is the other member variable, mPadDest, not set to an initial value?
^^^^ The reason mDecodeBin is initialised to null is that an uninitialised pointer could create problems when a call to the destroy function is made, which would be normal if anything in the creation of the pipeline fails. mPadDest doesn't have this sensitivity, so there is no need to initialise it. However, if you insist, I can initialise it to 0.

-----

unknown (??)

  + // Constructor
  + // Destructor

Comments like this aren't really needed. It's obvious when it's a constructor or a destructor.

^^^^ Yes, it is obvious. But I don't judge for obviousness when I write comments on my prototypes, and I prefer to have all my functions documented.

-----
Reporter
Comment 85•15 years ago
(In reply to comment #84)
> Do you mean know instead of knows?

Yes. Or you can reword it however you want so it's readable.

> So the postponing of the error handling is intentional...

If it's safe, that's fine.

> Sure we can, do you wish this to be done before the initial submit of the code?

I'd prefer that, but if not, raise a bug for this to be done and that'll be fine.

> Yes we can - but do you think it will improve readability (aka how easy the
> code is to understand for a 3'rd party)?

I personally find it more readable, but it's your call if you're going to be the one maintaining it. If you strongly prefer your way, go for it.

> Yes, it's true that gst_object_unref can handle a 0 pointer, like the
> gst_object_ref in the create function can (with simelar logs btw) - but the
> difference is that the destroy function will be called with these being 0 under
> normal conditions - and that will clutter the log - whereas they in the create
> function won't be 0 under normal conditions - thats the reason they are
> protected by "if"'s in the destroy function and not in the create function.

I have no problem with this reasoning. That's fine.

> Is there a special reason that you would like a test on the return value for
> this particular function? I don't see it as being more fatal for the decoder
> than most of the other function calls that it make.

A couple of the third party libraries in the Ogg code don't check return values of supposedly safe calls in other third party libraries that they use, and this has been a constant source of problems and vulnerabilities. I'd like to err on the side of caution and check return values of third party libraries where they document that they can fail.

> I have added a description about lifetime etc. and the purpose of it - but
> with regards to who should inherit from it... is what you actually want a
> design document? - where you can see all the classes and their relationship?
> Such a document could either be in a stand-alone file with pretty graphics or
> embedded in one of the header files with square ascii graphics.

I just want better comments. Feel free to include what you want and I'll let you know if it's enough.

> These are magic numbers, there sole purpose is an attempt not to be able to
> kill the browser by embedding an insane size of video on a page (1001 ways to
> kill Fennec) - I have no preference for these numbers over others - would love
> to have a dynamic solution that did estimates of max size based on available
> memory, but that is probably for an advanced implementation.

The Ogg decoder has similar magic numbers for detecting 'too large' videos. Maybe move these into nsMediaDecoder.h and use them so the different backends are at least consistent.

> - maybe this is a candiate for about:config ? or at least a compile time
> switch?

Or this. Up to you.

> I have added a comment to the value that changing it could change performance
> in both directions.

Great, that's fine.

> However if you insist I can initialise it to 0.

Thanks, please do so.
Comment 86•15 years ago
(In reply to comment #85)
> (In reply to comment #84)
> > Is there a special reason that you would like a test on the return value for
> > this particular function? I don't see it as being more fatal for the decoder
> > than most of the other function calls that it make.
>
> A couple of the third party libraries in the Ogg code don't check return values
> of supposedly safe calls in other third party libraries that they use and this
> has been a constant source of problems and vulnerabilities. I'd like to err on
> the side of caution of checking return values of third party libraries where
> they document that they can fail.

So, what you are asking is not specific to this function, but that for all functions that can return an error value you want at least a log entry if the call fails (alternatively, the error needs to be handled)? Note that the current implementation is not ignoring failure; rather, it assumes that a failure in one place will lead to a failure in another - where it is either handled, or ignored if there is no sensible way to react to the error.

> > I have added a description about lifetime etc. and the purpose of it - but
> > with regards to who should inherit from it... is what you actually want a
> > design document? - where you can see all the classes and their relationship?
> > Such a document could either be in a stand-alone file with pretty graphics or
> > embedded in one of the header files with square ascii graphics.
>
> I just want better comments. Feel free to include what you want and I'll let
> you know if it's enough.

Ok, so you want an unspecified level of comments. I'll add some more generic comments to help get a bird's-eye view of the code; it might take some iterations before we agree on a level, but let's see how it goes.
> > These are magic numbers, there sole purpose is an attempt not to be able to
> > kill the browser by embedding an insane size of video on a page (1001 ways to
> > kill Fennec) - I have no preference for these numbers over others - would love
> > to have a dynamic solution that did estimates of max size based on available
> > memory, but that is probably for an advanced implementation.
>
> The Ogg decode has similar magic numbers for detecting 'too large' videos.
> Maybe move these into nsMediaDecoder.h and use them so the different backends
> are at least consistent.

That solution is fine with me - and yes, there is no reason the different decoders should use different values.
Comment 87•15 years ago
Reply to comment #52

-----

  + // It's done in this way to allow GStreamer data types in the "real" class
  + static nsGStreamerDecoder *makeNewInstance();

I'm not sure what you mean by "to allow GStreamer data types in the 'real' class". Can you expand on this?

^^^^ The new comment now reads:

  // Function to create a new instance of the GStreamer decoder
  //
  // NOTE: If we did it by letting the user instantiate an element of
  // this class we would have to drag a lot of GStreamer specific
  // header files in together with this header file.

-----

nsGStreamerElementWrapper.h (multiple)

Add comments to the methods to explain what threads they can safely be called on. Add assertions to the methods that are main thread only to ensure they are called on the main thread only, e.g.:

  NS_ASSERTION(NS_IsMainThread(), "Should be called on main thread");

  + nsHTMLMediaElement* mElement;

Add a comment about what threads can use mElement (i.e. main thread only?). This is not a counted pointer - how is ownership of mElement maintained? For an example of the type of comment needed, see the 'mElement' comment in nsMediaDecoder.h, which describes the thread usage and why it's not reference counted.

^^^^ I have added some new comments that hopefully make your concerns disappear.

-----

Reply to comment #53

There are some repetitions of patterns here that have been explained previously; these won't be repeated.

-----

nsGStreamerLib.cpp (209)

  +void nsGStreamerLib::BringGStreamerDown()
  +{
  +  switch (mGStreamerLibStatus)

Reading mGStreamerLibStatus without taking the mGStreamerMutex lock. The switch also reads mNumberOfInstances.

  + // The gstreamer lib is now down
  + mGStreamerLibStatus = GSTREAM_GDOWN_LIB_DOWN;

This should be protected by the lock?
^^^^ No, as it is the caller's responsibility to take the lock before calling the function, as you noted yourself later - but that wasn't clear from the comments. A comment has been added to the header file to clarify this, as you suggest.

-----

nsGStreamerLib.h (87)

  +
  + // The last thing we should do, befor termination of the program
  + // is to terminate the GStreamer library
  + class Terminator
  + {
  + public:
  +   ~Terminator();
  + };
  +
  + static Terminator terminator;

Why not do this in the destructor of nsGStreamerLib instead of having the helper object?

^^^^ Hehe ;) I'm very happy you asked that question. The reason I need a static object to take the GStreamer library down is that it can't be restarted after being shut down. It is actually documented in the GStreamer documentation - though not as clearly as it could be, as one of the kind souls at #gstreamer pointed out to me. For reference, the current GStreamer docs say: "After this call GStreamer (including this method) should not be used anymore." - and that includes the init function (and yes, I learned that the hard way)! So taking the GStreamer lib down needs to be the last thing that is done in the browser - at least it needs to be done at a time when we are sure we won't need the GStreamer lib any more. I can think of two cases of this: at the time global objects are destroyed, and after something so fatal has happened that we won't even try to run the GStreamer lib any more - which are the two cases in which it is called. (I have now stated in the comment to nsGStreamerLib::Terminator that the GStreamer lib can't be restarted.)

-----

Reply to comment #54

In this comment we have the issue of uninitialised members again - it's not a functional problem, as they will always be set before being used - but I fully understand the reasoning as to why there is a wish to initialise everything (maintenance over performance - I won't go into that discussion).
But if we are serious about initialising everything - what is the reason, then, that we don't do it by default through the memory allocation system? (This would even improve security.)

I'll write some kind of documentation on all the classes, their relationships, thread safety etc., so I'll not comment further on this below.

-----

  +++ b/content/media/gstreamer/nsGStreamerPipeline.cpp
  +
  +nsGStreamerPipeline::nsGStreamerPipeline()
  +  : mStreamPipeline(nsnull),
  +    mAudioPadSource(nsnull),
  +    mAudioPadSink(nsnull),
  +    mVideoPadSource(nsnull),
  +    mVideoPadSink(nsnull)
  +{
  +}

Some member variables are not being set (e.g. mStreamerDecoder).

^^^^ Are there others? I can only spot the one.

-----

nsGStreamerPipelineState.h (63)

  + // Destructor
  + ~nsGStreamerPipelineState();

Does this need to be virtual? Is it ever deleted using a pointer to one of its base classes?

^^^^ No and no.

-----

nsGStreamerPipelineState.h (77)

  + // Tell state machine if it should play or stop(pause)
  + // bPlay = PR_TRUE => playback starts when conditions are right
  + // bPlay = PR_FALSE => playback is paused and will not start
  + virtual void PlayWhenPossible(PRBool aShouldPlay);

bPlay should be aShouldPlay?

^^^^ Indeed it should - that should teach me not to have the "exclude comments" option on by default when I do search-and-replace.

-----

nsGStreamerVideoBin.cpp (342)

  + aMe->mAspect = (pixel_aspect[0] == 0) ?
  +   ((gdouble)aMe->mWidth/(gdouble)aMe->mHeight) :
  +   ((gdouble)pixel_aspect[0] /
  +    (gdouble)pixel_aspect[1] *
  +    (gdouble)aMe->mWidth/(gdouble)aMe->mHeight);
  ...
  + aMe->mFramerate = (gdouble)pixel_aspect[0]/(gdouble)pixel_aspect[1];

Can pixel_aspect[1] ever be zero? If so, guard against division by zero.

^^^^ That's plain evil! - nice spot btw.

-----

nsGStreamerVideoBin.hpp (124)

  + // Mutex to control the access to mIsScreenPreviouslyInvalidated
  + GMutex *mScreenInvalidMutex;
  +
  + // Condition that is set when screen has been invalidated
  + GCond *mScreenInvalidatedCond;

Why not use NSPR locking types?
^^^^ No good reason here - except that GStreamer depends on GLib, and all the examples in the GStreamer docs use GLib. I thought about changing it but never got around to it - will do :)
Updated•15 years ago
tracking-fennec: 1.0+ → 1.0-
Comment 88•15 years ago
In reply to comment #84: it is not safe to call gst_object_unref(NULL) - it triggers a GLib-CRITICAL, and the safeguard can even be compiled out.

Mike K., would it be possible for you to stick a

  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(mStreamPipeline, 0, "firefox")

somewhere where the whole pipeline is set up (e.g. gst_fennecvideosink_change_state, or ideally similar for the actual pipeline object) and then run

  GST_DEBUG_DUMP_DOT_DIR=$PWD ./firefox

You will end up with graphviz dot files for the actual gst pipeline. Convert them to png/svg:

  dot -Tpng xxx.dot -oxxx.png
  dot -Tsvg xxx.dot -oxxx.svg

and attach a good one here. That would help to get the big picture of how you actually do things.
Comment 89•15 years ago
What exactly is the status on this bug at this time? Reading through these comments doesn't exactly give a great picture of what the progress is on this feature.
Comment 90•15 years ago
It's being worked on, addressing the issues pointed out in the review and from discussions with roc. The major part of the work going on currently is related to coding style rather than functionality.

The most important remaining issue as I see it is the performance on the N900 - this seems to have degraded over time. Some months ago I ported the patch to a stand-alone program, which had significantly better performance (as in a useful frame rate). It remains to be investigated exactly why it is so slow when running within Fennec.
Comment 91•15 years ago
(In reply to comment #84)
> (In reply to comment #50)

nsGStreamerConfig.h (47)

  +// Define the max size of video elements in pixels
  +#define MAX_VIDEO_WIDTH 1024
  +#define MAX_VIDEO_HEIGHT 1024

Is that code only for handheld devices? There are monitors bigger than 2048 x 2048; see
http://www.nextag.com/Samsung-Syncmaster-305T-30-536319142/prices-html?nxtg=a9760a1c0514-A0AF65F162A79CAF
(2560x1600 resolution) and
http://www.wired.com/gadgetlab/2007/11/toshiba-shows-c/ (3840 x 2400 pixels).
You may want this to be made a preference.
Comment 93•15 years ago
(In reply to comment #90)
> The most important remaining issue as I see it is the performance on the N900 -
> this seem to have degraded over time - some months ago I ported the patch to a
> stand alone program, this had significantly better performance (as in a useful
> frame rate). It remains to be investigated exactly why it is so slow when
> running within Fennec.

Does this problem also exist when GStreamer is the engine used for HTML5 video/audio in regular desktop Firefox? Or is this issue specific to Fennec?
Comment 94•15 years ago
(In reply to comment #93)
> Does this problem also exist when GStreamer is the engine used for HTML5
> video/audio in regular desktop Firefox? Or is this issue specific to Fennec?

The framerate is fine when running a Fennec build on my laptop (Intel Core 2 Duo, 2.13 GHz), so on my PC I don't see any reason why a regular Firefox build should have any problems. But I'm sure if you try on a slow enough computer you will eventually get an unacceptable frame rate...

However, I don't think that is an answer to your question - but I don't have any data to give a better one.
Comment 95•15 years ago
(In reply to comment #94)
> The framerate is fine when running a Fennec build on my laptop (Intel Core 2
> Duo 2.13 GHz) - So on my PC, I don't see any reason why a regular Firefox build
> should have any problems. But I'm sure if you try on a slow enough computer
> you will eventually get an unacceptable frame rate...

Then it sounds like the fault lies in Fennec. How many resources does Fennec consume on the N900?
Comment 96•15 years ago
(In reply to comment #95)
> Then it sounds like the fault lies in Fennec. How many resources does Fennec
> consume on the N900?

Hmm... I haven't been too concerned with performance so far, but I just did a test run with "top" running while playing a small video on the N900. The outcome was (top 3 on CPU usage):

  fennec/fennec                   ~16% Mem  ~50% CPU
  /usr/bin/Xorg -logfile /tmp/X   ~13% Mem  ~24% CPU
  /usr/bin/pulseaudio --system     ~2% Mem  ~17% CPU

May I ask how you come to the conclusion that the fault is in Fennec?
Comment 97•15 years ago
(In reply to comment #96)
> (In reply to comment #95)
<snip>
> hmm... I haven't been too concerned with performance so far - but I just did a
> test run with "top" running while playing a small video on the N900. The
> outcome was (top 3 on CPU usage):
>
> fennec/fennec:                  ~16% Mem  ~50% CPU
> /usr/bin/Xorg -logfile /tmp/X:  ~13% Mem  ~24% CPU
> /usr/bin/pulseaudio --system:    ~2% Mem  ~17% CPU
>
> May I ask how you come to the conclusion that the fault is in Fennec?

Well, you said that Fennec ran fine on a faster machine, and the N900 is a lot slower. And you also mentioned that the performance was a lot better when you used a standalone program. GStreamer wouldn't change between using Fennec and the standalone application, so the bottleneck has to be in Fennec.

Of course, that is rather flimsy if GStreamer itself had really bad performance. How many resources does GStreamer consume on the N900?
Comment 98•15 years ago
(In reply to comment #97)
> (In reply to comment #96)
<snip>
> Well, you said that Fennec ran fine on a faster machine, and the N900 is a lot
> slower. And you also mentioned that the performance was a lot better when you
> used a standalone program. GStreamer wouldn't change between using Fennec and
> the standalone application, so the bottleneck has to be in Fennec.
>
> Of course, that is rather flimsy if GStreamer itself had really bad
> performance. How many resources does GStreamer consume on the N900?

The corresponding numbers for the stand-alone program (gstreamer_info) are:

./gstreamer_info:               ~7% Mem  ~26% CPU
/usr/bin/pulseaudio --system:   ~2% Mem  ~19% CPU
/usr/bin/Xorg -logfile /tmp/X:   6% Mem   ~7% CPU

So significantly less CPU usage for the stand-alone program, even though it can produce a higher framerate. (Note: it's an older version of the GStreamer integration that is used, but I wouldn't expect the changes made to influence the numbers in any significant way.)

---

When you said Fennec was at fault, I was not sure if you meant vs. Firefox; you didn't, so I agree with you.
Comment 99•15 years ago
Mike, could it be that you use xvimagesink in the standalone and ffmpegcolorspace + ximagesink in the Fennec version?

As I said earlier, having a pipeline layout (dot -> png/svg) would be helpful.
Comment 100•15 years ago
I was not able to generate a .dot file on the N900, so the attached is generated with a PC build of Fennec.

Note that when playing back on the N900, the GStreamer integration performs better than the "native" Ogg decoder in Fennec (in the sense that with my test file the native decoder has gaps in the audio, whereas when using GStreamer the audio is "fluent" without gaps - both decoders play the video part more like a series of still images than a video).
Comment 101•15 years ago
(In reply to comment #99)
> Mike, could it be that you use xvimagesink in the standalone and
> ffmpegcolorspace + ximagesink in the fennec version?
>
> As I said earlier having a pipeline layout (dot->png/svg) would be helpful.

The standalone program uses a GStreamer "filesrc" to provide data, and a "ffmpegcolorspace", "xvimagesink" pair for playback - some time ago I tried this same setup within Fennec; it gave better performance (more FPS) but still significantly worse than what I see in the standalone app.

The pipeline layout is attached in the previous post.
Comment 102•15 years ago
(In reply to comment #101)
> (In reply to comment #99)
<snip>
> The standalone program uses a GStreamer "filesrc" to provide data, and a
> "ffmpegcolorspace", "xvimagesink" pair for playback - some time ago I tried
> this same setup within Fennec; it gave better performance (more FPS) but still
> significantly worse than what I see in the standalone app.

If you use xvimagesink, you won't need the ffmpegcolorspace. At least on the N900 I would strongly advise not to use it unless needed. What formats can FennecVideoSink consume? Does it do software composition?

On the audio side:
- Remove volume and use the volume property on pulseaudio (that is, on the N900 I would suggest configuring pulsesink as the audio sink and not using gconfaudiosink - maybe a configure option).
- Remove audioconvert; pulse can take all raw formats.
- You might not even need the queue in mAudioBin.

Otherwise this is fine. One small question though: do you need the app-src - can't you use gst_element_make_from_uri() and let gst fetch the data directly?

> The pipeline layout is attached in the previous post.
Comment 103•15 years ago
I am somewhat confused as to whether this patch is only meant for Fennec, or if it is also supposed to land in mainstream (desktop) Firefox. If so, is it targeted for a specific release of Firefox? I would greatly appreciate the ability to not have to resort to Ogg video in combination with HTML5 video (MPEG-4/H.264 comes to mind).
Comment 104•15 years ago
I agree with Sjoerd Tieleman. If the N900 mobile device is supported and Nokia gives money or support in return, that's nice. Nevertheless, I see the main purpose of this patch in the mainstream (desktop) Firefox. While many people want to make you believe Ogg video quality is good, the fact is, it's nowhere near a good H.264 encoding (x264). I am looking forward to GStreamer support and hope it's going to reduce Flash dependency. =)
Comment 105•15 years ago
My exclusive focus for this GStreamer integration has been to make it work with Fennec. Not enabling it in Firefox on the platforms GStreamer has been ported to is, as far as I know, a political decision.

My personal view is that we should enable it in Firefox too, to give the users more choices (I don't want to see Firefox support fewer formats than any other browser; I'd rather see that if a user has some content, then he knows he can rely on Firefox/Fennec to play it back), but I have heard good arguments for not doing so. Maybe someone can repeat the arguments against enabling GStreamer in the mainstream Firefox?
Comment 106•15 years ago
On the contrary, I think using GStreamer on Linux, even with some limitations like only allowing it to read Ogg files, would be beneficial because it would solve a lot of headaches with sound configuration (ALSA vs. OSS vs. PulseAudio vs. ..., correct selection of the proper sound device when there are several, etc.).
Comment 107•15 years ago
(In reply to comment #105)
> My exclusive focus for this GStreamer integration has been to make it work
> with Fennec.

Oh really? Why doesn't this surprise? When will someone from Mozilla confess that nothing has really happened there for 18 months that isn't focused on some mobile browser that can only be run on very few phones? I live in Australia and largely engage with Mozilla merely via Planet Mozilla; however, even with that limited exposure it has been drop-dead obvious how much focus has gone onto Fennec.

It's bad enough that Mozilla doesn't innovate, rather it lets extension developers innovate for it, but now it's hardly even working on Firefox. I realise that supposedly anything coded for Fennec can be compiled into Firefox but ...

> Not enabling it in Firefox on the platforms GStreamer has been ported to is,
> as far as I know, a political decision.

Typical! I argued the case for DirectShow support for H.264 on Windows given that Microsoft delivers that codec for free via Win 7. This was before the current tacky video support was released. Nobody had enough fortitude to think beyond the same old dogma for not making such a move. Then Google went and delivered H.264 video codecs as well! Mozilla, I love Firefox but you're making a lot of decisions that are compromising its future.

> My personal view is that we should enable it in Firefox too, to give the users
> more choices (I don't want to see Firefox support fewer formats than any other
> browser; I'd rather see that if a user has some content, then he knows he can
> rely on Firefox/Fennec to play it back), but I have heard good arguments for
> not doing so.

Arguments against giving Firefox multi-codec support are one thing. However, when the Firefox marketing spin then cranks up the hyperbole about 3.5 "enabling video for the web", I feel a rush of semi-digested food forcing its way up my esophagus.

Oops, wait, it is usually about now when people reading my comments start to think that it's a good idea to tell me not to include non-technical commentary in Bugzilla. I better shut up.

> Maybe someone can repeat the arguments against enabling GStreamer in the
> mainstream Firefox?

Careful Mike, that sounds dangerously like you are encouraging discussion in a bug-fixing posting. Arrgh, watch out for the Bugzilla thought-police email you'll get any second :)
Comment 108•15 years ago
(In reply to comment #107)
> (In reply to comment #105)
<SNIP>
> > Maybe someone can repeat the arguments against enabling GStreamer in the
> > mainstream Firefox?
>
> Careful Mike, that sounds dangerously like you are encouraging discussion in a
> bug-fixing posting. Arrgh, watch out for the Bugzilla thought-police email
> you'll get any second :)

*LOL* Yes, I was considering the discussion to be moved to the newsgroup; we really need a unified system... Arggghhh... wrong place to state such things - "All is well with the system we have", sorry. Anyone up for starting a discussion in the newsgroups?

Meanwhile I will try to figure out why my performance measurements on the integration fluctuate 40-50%, which makes it very hard to establish a baseline to see how much the suggestions by Stefan Kost are improving the FPS....
Comment 109•15 years ago
(In reply to comment #107)
> (In reply to comment #105)
<SNIP>
> I realise that supposedly anything coded for Fennec can be compiled into
> Firefox but ...

I think the main reason for the hyperfocus on Fennec is because Gecko is a huge memory hog, and it will take a lot of work to whip Gecko into shape for it to be decent to use on mobile devices. The side effect of this is that desktop Firefox memory and CPU usage is likely to go way down.

> Typical! I argued the case for DirectShow support for H.264 on Windows given
> that Microsoft delivers that codec for free via Win 7.
<SNIP>

I personally don't think supporting DirectShow in Windows Firefox directly is a good idea because many codecs for Windows DirectShow aren't designed very well to handle streaming. Or not well overall.

In any case, GStreamer can bridge Windows DirectShow into itself, so if GStreamer was enabled for Windows Firefox, you could still get to use the Windows H.264 codec.

> Arguments against giving Firefox multi-codec support are one thing.
<SNIP>

I really don't agree with not enabling GStreamer for Windows, Mac, and Linux. On Linux, GStreamer is always there, and the work required to get HTML5 video and audio to work is greatly reduced with GStreamer used (Firefox doesn't always seem to play nice with PulseAudio, but that's only expected, since it really only works with raw ALSA). On Windows, GStreamer would allow users to plug in their own codecs, even if Mozilla themselves only shipped the Ogg codecs. And on Mac, GStreamer would allow QuickTime codecs to plug into Firefox.

Additionally, it would probably be easier to support subtitles and the myriad of containers that movies come in. For example, I have a collection of anime video in the Matroska container that uses the Ogg Theora/Vorbis codecs along with providing subtitles. It would be very nice if Firefox could support Matroska, subtitles, and multi-audio-stream videos. English isn't the only language in the world, as Mozilla knows very well! ;)

> Maybe someone can repeat the arguments against enabling GStreamer in the
> mainstream Firefox?

I think the main reasons against enabling GStreamer in mainstream Firefox were political rather than technical, because the original patch for the GStreamer backend in HTML5 <video> and <audio> was designed for desktop Firefox.

(In reply to comment #108)
> *LOL* Yes, I was considering the discussion to be moved to the newsgroup; we
> really need a unified system...
<SNIP>

Newsgroups... :( Don't we have an IRC channel or something we could discuss this in? Realtime discussion is so much better than newsgroups, and mailing lists are EVIL!!!!
Comment 110•15 years ago
(In reply to comment #109)
> Additionally, it would probably be easier to support subtitles and the myriad
> of containers that movies come in. For example, I have a collection of anime
> video in the Matroska container that uses the Ogg Theora/Vorbis codecs along
> with providing subtitles. It would be very nice if Firefox could support
> Matroska, subtitles, and multi-audio-stream videos. English isn't the only
> language in the world, as Mozilla knows very well! ;)
>
> I think the main reasons against enabling GStreamer in mainstream Firefox were
> political rather than technical, because the original patch for the GStreamer
> backend in HTML5 <video> and <audio> was designed for desktop Firefox.

Absolutely. There is no reason for Mozilla to reinvent the wheel. Just use the existing GStreamer backend. Furthermore, Mozilla can live up to its political agenda to only support the "free" codecs natively, while H.264, AAC and so forth (in short, all the good stuff with awesome compression ratio, popularity, and hardware support) are decoded with GStreamer.

> Newsgroups... :(
> Don't we have an IRC channel or something we could discuss this in? Realtime
> discussion is so much better than newsgroups, and mailing lists are EVIL!!!!

See http://irc.mozilla.org/
How about irc://irc.mozilla.org/mozdev ?
Perhaps we can kick off a discussion :)

Lg kwinz
Comment 111•15 years ago
(In reply to comment #110)
> Absolutely. There is no reason for Mozilla to reinvent the wheel. Just use the
> existing GStreamer backend. Furthermore, Mozilla can live up to its political
> agenda to only support the "free" codecs natively, while H.264, AAC and so
> forth (in short, all the good stuff with awesome compression ratio,
> popularity, and hardware support) are decoded with GStreamer.

Exactly. Though Mozilla still wouldn't have to distribute the "other" codecs with their GStreamer package. The "free" codecs could be included in the GStreamer distribution shipped with Firefox. Opera is taking that route too: http://my.opera.com/core/blog/2009/12/31/re-introducing-video However, they have no plans to include the DirectShow and QuickTime bridges for GStreamer-win32 and GStreamer-OSX.

> See http://irc.mozilla.org/
> How about irc://irc.mozilla.org/mozdev ?
> Perhaps we can kick off a discussion :)

I think irc://irc.mozilla.org/firefox would be a better choice for discussion...
Reporter
Comment 112•15 years ago
I think the first step towards evaluating the suitability of GStreamer across all platforms should be to get it working, reviewed and landed. Then we can see how well it works on other platforms and codecs.

What would be the plan for other platforms like OS/2, AIX and Solaris? Do they have GStreamer ports?
Comment 113•15 years ago
(In reply to comment #112)
> I think the first step towards evaluating the suitability of GStreamer across
> all platforms should be to get it working, reviewed and landed. Then we can
> see how well it works on other platforms and codecs.
>
> What would be the plan for other platforms like OS/2, AIX and Solaris? Do they
> have GStreamer ports?

GStreamer has support for Windows, UNIX OSes (such as Mac OS X, Solaris, AIX, *BSD, etc.), and Linux.

For Windows, the binaries haven't been updated in some time, but it still does build for Windows. Opera is using it and they are providing the source code of their modifications: http://sourcecode.opera.com/gstreamer/

For Mac, I haven't really seen any builds of GStreamer for Mac other than in Songbird, but it is because of Songbird that I know GStreamer does work on Mac OS X.

It should be quite obvious that GStreamer works quite well on Linux, given the numerous GNOME applications that use it.

I have observed support for GStreamer on Solaris/OpenSolaris and *BSD, but I haven't used other UNIX OSes. However, an IBM document about GStreamer indicates that it is supported on AIX: http://www.ibm.com/developerworks/aix/library/au-gstreamer.html

As far as I know, GStreamer isn't supported on OS/2 currently.

I have already mentioned this before, but I'm reiterating for the sake of completeness: GStreamer has a DirectShow bridge for Windows, and a QuickTime bridge for Mac OS X, which allows it to use A/V input and output devices as well as codecs from the aforementioned frameworks within GStreamer and all GStreamer-enabled applications.
Spamming this bug with advocacy comments is actually going to make it harder to track the technical issues, which will make it harder to get GStreamer support landed. I don't think that's what you want. I suggest the mozilla.dev.tech.platform newsgroup.
Comment 115•15 years ago
(In reply to comment #102)
<SNIP>
> If you use xvimagesink, you won't need the ffmpegcolorspace. At least on the
> N900 I would strongly advise not to use it unless needed. What formats can
> FennecVideoSink consume? Does it do software composition?

The FennecVideoSink is a "pass-through" element, which has the purpose of moving data out of the GStreamer pipeline and into the MediaDecoder - it doesn't have any intelligent handling of the data (actually I'm wondering if AppSink would be more appropriate). Its caps are set to GST_VIDEO_CAPS_BGRx, as this is the format the MediaDecoder expects the data in (this might be changing, btw).

> On the audio side:
> Remove volume and use the volume property on pulseaudio (that is, on the N900
> I would suggest configuring pulsesink as the audio sink and not using
> gconfaudiosink - maybe a configure option).

Actually gconfaudiosink doesn't work per default on the N900 - so it falls back to using autoaudiosink. Do you think this gives any performance penalties? The reasoning behind it is:

1) If the system has a GStreamer configuration, then use that one (gconfaudiosink).

2) If there is no configuration, or the audio stream can't link with it, then use the best sink (autoaudiosink).

> Remove audioconvert; pulse can take all raw formats.

I don't assume all platforms will have a PulseAudio sink (at least the N810 didn't per default). We could make it dependent on the formats supported by the sink actually used...

> You might not even need the queue in mAudioBin.

I tried removing the queue; it meant I saw a tendency for the fps to go down, but more importantly the audio started to get choppy:

queue+convert+volume+autoaudiosink - 1.61 fps
                     autoaudiosink - 1.58 fps (choppy audio)
queue               +autoaudiosink - 1.69 fps (only two measurements)
queue        +volume+autoaudiosink - 1.63 fps
queue+convert       +autoaudiosink - 1.67 fps

(The above measurements are an average of 3 runs on a ~44 s video - the biggest difference I saw between measurements in a given setup was < 0.05 fps.)

> Otherwise this is fine. One small question though: do you need the app-src -
> can't you use gst_element_make_from_uri() and let gst fetch the data directly?

As far as I understand, using gst_element_make_from_uri() would mean bypassing the caching, security etc. that Firefox takes care of?
(In reply to comment #115)
> (In reply to comment #102)
> > Otherwise this is fine. One small question though: do you need the app-src -
> > can't you use gst_element_make_from_uri() and let gst fetch the data directly?
>
> As far as I understand, using gst_element_make_from_uri() would mean bypassing
> the caching, security etc. that Firefox takes care of?

Right. HTTP caching, cookies, HTTP auth, HTTPS, HTML5 offline appcache, stream throttling, media cache ... they all depend on Gecko handling the load.
Comment 117•15 years ago
(In reply to comment #116)
> > As far as I understand, using gst_element_make_from_uri() would mean
> > bypassing the caching, security etc. that Firefox takes care of?
>
> Right. HTTP caching, cookies, HTTP auth, HTTPS, HTML5 offline appcache, stream
> throttling, media cache ... they all depend on Gecko handling the load.

On the other hand, that could be some unexpected overhead. At least a benchmark using gst_element_make_from_uri() could be worth doing, to get an idea.
That might provide useful data, but in the long run we have to use Gecko to do the load.
Comment 119•15 years ago
I was not suggesting to actually use it. Only to get an idea if Gecko adds a significant overhead on this part. If it appears it doesn't, at least we'll know there's nothing to be done on that part in Gecko.
Comment 120•15 years ago
(In reply to comment #115)
> (In reply to comment #102)
<SNIP>
> The FennecVideoSink is a "pass-through" element, which has the purpose of
> moving data out of the GStreamer pipeline and into the MediaDecoder - it
> doesn't have any intelligent handling of the data (actually I'm wondering if
> AppSink would be more appropriate). Its caps are set to GST_VIDEO_CAPS_BGRx,
> as this is the format the MediaDecoder expects the data in (this might be
> changing, btw).

This is not good. If you force RGB caps, you request colorspace conversion from ffmpegcolorspace. First, ffmpegcolorspace adds setup overhead (that's why the N900 media player avoids it). Second, the colorspace conversion is not even NEON-optimized. Third, the colorspace conversion is doing a memory-to-memory operation. Also, FennecVideoSink is doing another copy in gst_fennecvideosink_put(). If there is an alternative to SetRGBData() to get the location, it would be better to implement pad_alloc() so that the colorspace converter can write the video into the target buffer directly.

Please remove these from gst_fennecvideosink_put():

+  gst_buffer_ref(aFennecvideo);
+  gst_buffer_unref(aFennecvideo);

> Actually gconfaudiosink doesn't work per default on the N900 - so it falls
> back to using autoaudiosink. Do you think this gives any performance
> penalties?

Only marginal ones, from the deep bin nesting.

> I tried removing the queue; it meant I saw a tendency for the fps to go down,
> but more importantly the audio started to get choppy:
>
> queue+convert+volume+autoaudiosink - 1.61 fps
>                      autoaudiosink - 1.58 fps (choppy audio)
> queue               +autoaudiosink - 1.69 fps (only two measurements)
> queue        +volume+autoaudiosink - 1.63 fps
> queue+convert       +autoaudiosink - 1.67 fps

Okay, this tells us that audio is not the big CPU consumer anyway, as even dropping convert and volume does not help much.

> > Otherwise this is fine. One small question though: do you need the app-src -
> > can't you use gst_element_make_from_uri() and let gst fetch the data
> > directly?
>
> As far as I understand, using gst_element_make_from_uri() would mean bypassing
> the caching, security etc. that Firefox takes care of?

As others commented, it might be worth trying it. Anyway, I believe the problem is the video rendering part. Whatever happens behind fennecvideosink, if you can make it accept YUV data and if you can avoid the memcpy, you should see a big performance gain. On the N900 the memory bandwidth could be better, and doing two copies of raw video buffers in a row is definitely impacting the fps badly.

Thanks for testing all the suggestions!
Comment 121•15 years ago
One more comment: gst_fennecvideosink_buffer_alloc() is nonsense as it stands - please disable it. Providing a buffer_alloc only makes sense if you *don't* need to do a new[] in there - like if you have a pool of, let's say, 4 buffers and recycle them. It makes even more sense if those buffers come from whatever composes the final view.
It would probably be best to benchmark in a standalone XUL app instead of Fennec, for now. We are setting up infrastructure so that the YUV conversion can be done in hardware and the resulting browser window be composited together with OpenGL. See bug 538323.
Comment 123•15 years ago
(In reply to comment #119)
> I was not suggesting to actually use it. Only to get an idea if Gecko adds a
> significant overhead on this part. If it appears it doesn't, at least we'll
> know there's nothing to be done on that part in Gecko.

The measured results are (playing back from a local file on the device):

app_src                        : 1.62 fps
gst_element_make_from_uri      : 1.31 fps
gst_element_make_from_uri+queue: 1.65 fps

NOTE: I have changed the fps-calculating algorithm slightly (more correct now) - this is the reason why the "baseline" has moved from 1.61 fps to 1.62 fps.

I'm continuing trying your other suggestions.
Comment 124•15 years ago
(In reply to comment #120)
<SNIP>
> As others commented, it might be worth trying it. Anyway, I believe the
> problem is the video rendering part. Whatever happens behind fennecvideosink,
> if you can make it accept YUV data and if you can avoid the memcpy, you should
> see a big performance gain. On the N900 the memory bandwidth could be better,
> and doing two copies of raw video buffers in a row is definitely impacting the
> fps badly.

For the sole purpose of testing the penalty of having the YUV->RGB conversion done by ffmpegcolorspace, I modified the fennecvideosink to accept YUV and push this unmodified to the screen (the output on screen is more or less nonsense). As the YUV is 16 bits per pixel and SetRGBData expects 32 bits, I ran the test with half the normal width (so the measurements can't be compared to the others that I have posted):

with ffmpegcolorspace:    2.44 fps
without ffmpegcolorspace: 2.77 fps

So a significant difference, > 10%, but not exactly earth-shaking.

For reference, the original video is 25 fps.
Comment 125•15 years ago
(In reply to comment #124)
> (In reply to comment #120)
> <SNIP>
> > As others commented, it might be worth trying it. Anyway, I belive the problem
> > is the video rendering part. Whatever happens behind fennecvideosink, if you
> > can make that accept YUV data and if you can avoid the memcpy you should see a
> > big performance gain. On n900 memory bandwidth could be better and doing 2
> > copies of raw video buffers in a row is definitely impacting the fps badly.
>
> For the sole purpose of testing the penalty of having the YUV->RGB conversion
> done by ffmpegcolorspace - I modified the fennecvideosink to accept YUV and
> pushing this unmodified to the screen (the output on screen is more or less
> nonsense) as the YUV is 16-bit pr pixel and the SetRGBData expects 32-bit I ran
> the test with half the normal width (so the measurements can't be compared to
> the others that I have posted):
>
> with ffmpegcolorspace: 2.44 fps
> without ffmpegcolorspace: 2.77 fps
>
> So a significant difference > 10% but not like earth shaking
>
> for reference the original video is 25 fps

What's the resolution of the video? Could you also modify FennecVideosink to just drop the received buffer (gst_buffer_unref()) right away (so that nothing is seen)? If that doesn't change the numbers significantly, then it is not the rendering.
Comment 126•15 years ago
We know rendering to the screen is slow, and the layers work will make it massively better. I wouldn't spend too much time measuring it at this point.
Comment 127•15 years ago
(In reply to comment #125)
<SNIP>
> Whats the resolution of the video? Could you also modify FennecVideosink to
> just drop the received buffer (gst_buffer_unref()) right away (so that nothing
> is seen). If that is not changing the numbers significantly then it is not the
> rendering.

The resolution is 360 x 288.

It's an excellent idea to test the framerate without actually redrawing the screen - now why didn't I think of that? Anyway, the results are:

Without drawing                             : 22.84 fps
Without drawing and without ffmpegcolorspace: 23.54 fps

(In reply to comment #126)
> We know rendering to the screen is slow, and the layers work will make it
> massively better. I wouldn't spend too much time measuring it at this point.

I understand your worry, but without measuring it we wouldn't know if the rendering was the main problem for the GStreamer integration. If something else took the resources, there would be fewer frames to render, hence a slow rendering would have less of an impact. Having the above measurements, we can't say that the rendering is the only thing slowing it down, but we can say that the rendering is the main thing we have found so far and that the color conversion is not that big a problem (< 5% *). What could be interesting to see now is how many frames per second the current rendering engine can handle, or more ideally, how big a percentage of the time is spent decoding and how much is spent moving the decoded frames to the display - something I don't know how to measure on the N900.

*) I have noted the difference between the < 5% here and the > 10% impact measured at the low frame-rate, but can't come up with a definitive explanation of why. I can theorize that it has to do with the system at times during playback being limited by the 25 fps frame-rate of the video rather than by the available resources, but I can't prove it at this point (btw, I'm not looking to prove it either).
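The decode-versus-render split asked for above can be approximated by accumulating the wall-clock time spent in each phase and reporting each phase's share of the total. A hedged sketch follows; the struct name and the choice of only two phases are assumptions, and where the timestamps come from (e.g. a monotonic clock read around each phase) is left to the caller.

```cpp
// Accumulates time spent in two phases of a playback pipeline so the share
// of each phase can be reported, e.g. the fraction of measured time that
// goes to moving decoded frames to the display.
struct PhaseTimer {
  double decodeSec = 0.0;
  double renderSec = 0.0;
  void AddDecode(double s) { decodeSec += s; }
  void AddRender(double s) { renderSec += s; }
  // Fraction of the measured time spent rendering, in [0, 1].
  double RenderShare() const {
    double total = decodeSec + renderSec;
    return total > 0.0 ? renderSec / total : 0.0;
  }
};
```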
Comment 128•15 years ago
Could somebody update the patch to work with the Firefox 3.6 source tree? The current patch fails to be applied to it.
See Also: → https://launchpad.net/bugs/412647
Comment 129•15 years ago
Patch updated to ToT. Applies on top of:

(mozilla-central):
changeset:   37404:97745a2b2de9
tag:         tip
user:        Dão Gottwald <dao@mozilla.com>
date:        Fri Jan 22 09:44:16 2010 +0100
summary:     Bug 482065 - Allow rearranging of thumbnails in the all tabs panel. r=enn

(mobile):
changeset:   1318:0f1676f43e39
tag:         tip
user:        Brad Lassey <blassey@mozilla.com>
date:        Thu Jan 21 12:48:08 2010 -0500
summary:     bug 537181 - Add required prefs and set proper config options to support app updating r=mfinkle a=stuart
Attachment #406209 -
Attachment is obsolete: true
Comment 130•15 years ago
(In reply to comment #128)
> Could somebody update the patch to work with the Firefox 3.6 source tree? The
> current patch fails to be applied to it.

Sorry, can't help you there, but I have just updated it to something more recent.
Comment 131•15 years ago
I have a couple of questions about this implementation:

- Will this support using GST plugins for output (i.e. using alsa/oss/pulse/etc. for audio and gl/xvideo/XvMC/VDPAU/etc. for video)? Or will it only be used for demuxing and decoding?

- Will it supersede the already built-in Ogg/Theora/Vorbis/etc. support if available (there are GST plugins)? This is really only meaningful if GST output plugins are used.
Comment 132•15 years ago
(In reply to comment #131)
> I have a couple questions about this implantation:
>
> - Will this support using GST plugins for output (i.e. using alsa/oss/puls/etc.
> for audio and gl/xvideo/XvMC/VDPAU/etc. for video)? Or will it only be used for
> demuxing and decoding?

The current integration first tries "gconfaudiosink" (I believe that means whatever the user's system is configured to use), and in case that hasn't been set up or can't link with the specific audio stream, it uses "autoaudiosink", which lets the GStreamer library choose the audio sink.

> - Will it supersede the already built-in OGG/Theora/Vorbis/etc. support if
> available (there are GST plugins)? This is really only meaningful if GST output
> plugins are used.

No, the system will first try whether any of the built-in codecs can handle the decoding; if they don't support the format, then GStreamer will try to decode it (using "decodebin2"). The integration does not limit or control which decoder decodebin2 chooses, so if there is a GStreamer decoder installed in the system it should be a candidate.

Note that, as you are probably aware, this integration is initially only meant for Maemo devices running Fennec; discussion of suggestions for other platforms should take place in the newsgroups and not here.
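Stripped of the real gst_element_factory_make() calls, the sink selection described here is a simple fallback chain: try the user's configured sink first, fall back to automatic selection. In this sketch MakeSink is a hypothetical stand-in for the factory, hard-coded to simulate a system where gconf has not been set up; it is not the patch's actual code.

```cpp
#include <cstring>

// Stand-in for gst_element_factory_make(): pretend only "autoaudiosink"
// can be created on this system, as when gconf has not been configured.
const char* MakeSink(const char* name) {
  return std::strcmp(name, "autoaudiosink") == 0 ? name : nullptr;
}

// Try the user's configured sink first and fall back to automatic selection
// when it cannot be created.
const char* PickAudioSink() {
  if (const char* s = MakeSink("gconfaudiosink")) return s;
  return MakeSink("autoaudiosink");
}
```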
Comment 133•15 years ago
(In reply to comment #132)
> The current integration, first tries "gconfaudiosink" (I believe that means,
> what ever the users system is configured to use) and in case that hasn't been
> setup or can't link with the specific audio stream it uses "autoaudiosink"
> which will let the GStreamer library choose the audio sink.

Sorry, I should have looked closer at the implementation. I noticed that the video is being handled with a Fennec video sink, which Roc mentioned would be composited and output as OpenGL.

While this is probably much better than the current software rendering, I think GST's own video sinks (i.e. a more direct path to hardware-accelerated video) should be considered. I think that would be a definite boost to the video watching experience as well as to XULRunner's performance and resource usage. This should definitely be possible, as many GTK/Qt media applications do this. Though, at this point, I'm not sure how useful it would be to Fennec.

> No, the system will first try if any of the build in codecs can handle the
> decoding, if they don't support the format, then GStreamer will try to decode
> it (using "decodebin2"), the integration does not limit or control which
> decoder the decodebin2 chooses, so if there is a GStreamer decoder installed in
> the system it should be a candidate.

Again, not sure how pertinent this is to Fennec, but on other systems this could lead to undesirable outcomes. IIRC XULRunner only supports ALSA (on Linux). If a user configured GST to use pulseaudio, different audio back-ends would be used for different media. The same would be true if native GST video sinks were ever used. IMHO if GST is supported (and initialized) on a platform, it should take precedence.

> Note that as you are probably aware: Initially this integration is only meant
> for Maemo devices running Fennec - and a discussion of suggestion for other
> platforms should take place in the newsgroups and not here.

I was not aware of that. That seems like an odd decision. It seems like this would get a much larger testing audience if it were on a first-class platform (i.e. Linux).

Another thing I forgot to ask: does this cover GST support for the Audio element as well? Also, forgive my ignorance, but where is the newsgroup where this should be discussed?
Comment 134•15 years ago
(In reply to comment #133)
> Sorry, I should have looked closer at the implementation. I noticed that the
> video is being handled with a Fennec video sink, which Roc mentioned would be
> composited and output as OpenGL.
>
> While this is probably much better than the current software rendering, I think
> that the use of GST's video sinks (i.e. a more direct path to hardware
> accelerated video). I think that would be a definite boost to the video
> watching experience as well as XULRunner's performance and resource usage.

If we look at raw performance, it will probably be hard to beat a solution where everything is kept inside GStreamer, with a GStreamer video sink optimized for the given platform, as you suggest. However, it will probably simplify things a bit if all the video output from the different decoders is done in the same way (I guess the solution Roc has told you about). It is relatively simple to change the output sink in the integration I have done, so when we have the generic solution it should be easy to measure and compare the performance of a pure GStreamer solution against one with a more generic way of displaying the individual video frames, to see if it's worth the extra work.

<SNIP>
> Again, not sure how pertinent this is to Fennic, but on other systems this
> could lead to undesirable outcomes. IIRC XULRunner only supports ALSA (on
> Linux). If a user configured GST to use pulseaudio, different audio back-ends
> would be used for different media.

If it is at some point decided that we should use GStreamer on other platforms, the requirements for the integration will change, and hence the solution might change...

<SNIP>
> Also, forgive my ignorance, but where is the newsgroup where this should be
> discussed?

I think mozilla.dev.platform (server: news.mozilla.org) was suggested previously? (Where, btw, I would be happy to comment on your valid points about where this should be used, which I have left out above.)
Comment 135•15 years ago
(In reply to comment #121)
> One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> recycle them. It makes even more sense if those buffers come from whatever
> composes the final view.

Sorry, I missed giving an answer to this one previously. The reason for using the buffer_alloc function is that ownership of the buffer needs to be passed on to the media framework, which will use delete[] to release the memory; hence it needs to be allocated with new[]. GStreamer uses alloc/free, which are not compatible with new/delete.
Comment 136•15 years ago
(In reply to comment #135)
> (In reply to comment #121)
> > One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> > disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> > need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> > recycle them. It makes even more sense if those buffers come from whatever
> > composes the final view.
>
> Sorry missed giving an answer to this one previously - the reason for using the
> buffer_alloc function is that ownership of the buffer needs to be passed on to
> the media framework, which will use delete [] to release the memory, hence it
> needs to be allocated with new [] - GStreamer uses alloc/free which are not
> compatible with new/delete.

Btw, I know of the free_func member of the GstBuffer type, but it was not introduced until 0.10.22, which is more recent than what our current build chain for Fennec supports; that chain was made with support of the N810 in mind (which has now been dropped as far as GStreamer video is concerned), hence the alternative "risky" way of handling it that currently exists in the code. I'm considering alternatives...
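Without a free_func on the buffer (pre-0.10.22), the only clean way to hand malloc-owned data to code that will release it with delete[] is to copy it into a new[] allocation first. This is a minimal sketch of that hand-off, assuming the extra copy is acceptable; the function name is invented for illustration.

```cpp
#include <cstddef>
#include <cstring>

// Copies framework-allocated (malloc/free) data into a new[] allocation so
// that the receiver may legally release it with delete[]. The extra memcpy
// is the price paid for mixing malloc/free and new/delete ownership.
unsigned char* CopyForDeleteOwnership(const unsigned char* src, std::size_t len) {
  unsigned char* dst = new unsigned char[len];
  std::memcpy(dst, src, len);
  return dst; // caller takes ownership and must delete[] it
}
```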
(In reply to comment #133)
> Sorry, I should have looked closer at the implementation. I noticed that the
> video is being handled with a Fennec video sink, which Roc mentioned would be
> composited and output as OpenGL.

... soon, hopefully, with the layer system, but not yet ...

> While this is probably much better than the current software rendering, I
> think that the use of GST's video sinks (i.e. a more direct path to hardware
> accelerated video). I think that would be a definite boost to the video
> watching experience as well as XULRunner's performance and resource usage.
>
> This should definitely be possible, as many GTK/Qt media applications can do
> this.

It's essential for browser <video> rendering to participate fully in the browser's rendering pipeline. This means that we need to be able to draw HTML content over the <video> (including a Web page's video controls --- even the browser's video controls --- and arbitrary transparent/translucent content), we need to be able to clip <video> to arbitrary rectangles or even paths (using SVG's clip-path), we need to be able to transform <video> using CSS and SVG transforms, we need to be able to apply arbitrary pixel operations using SVG filters, we need to be able to draw video to an HTML5 <canvas>, and we need to be able to take snapshots of <video> frames for printing and print preview. Can you do all that with GStreamer video sinks? I guess we could if they all support readback.

> Another thing I forgot to ask. Does this cover GST support for the Audio
> element as well?

Yes, it should.

> Also, forgive my ignorance, but where is the newsgroup where this should be
> discussed?

mozilla.dev.platform on news.mozilla.org would be a good place.
The plan in bug 538323 for the built-in Theora codec boils down to, for a GL layers backend: decode Theora to a YUV layer and then, if the layer is not nested in a tricky container (clip-path, filters, transforms, etc), composite it together with the other layers (including separate layer(s) for content over the video) using GL, with YUV conversion happening somewhere along the way, possibly only at draw time using a GL shader, or possibly earlier using bc-cat texture streaming on the N900 for example. If the video is in a container the layer system can't handle yet (clip-path, filters, transforms, etc), we'll have to read back RGB data from the video layer and draw that using cairo. That will be slow, but we'll incrementally fix the slowness by teaching layers how to handle the various container effects. It seems to me that the easiest way to integrate GStreamer here is to get YUV buffers from GStreamer so that we can reuse that rendering infrastructure. The next easiest way would be to get back a GL texture that the GL compositor can use. In the general case, where we have to composite video together with other layers, I think we'll need the video to be available as something equivalent to a GL texture.
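The "YUV conversion happening somewhere along the way" mentioned above is, per pixel, just a fixed matrix multiply. A scalar sketch of the standard BT.601 video-range conversion follows; a GL shader or SIMD path would perform the same arithmetic. The coefficients are the well-known BT.601 ones, not values taken from the patch.

```cpp
// Convert one BT.601 video-range YUV sample to 8-bit RGB.
// Y is nominally in [16, 235] and U/V in [16, 240]; the result is
// rounded and clamped to [0, 255].
static int Clamp255(double v) {
  return v < 0.0 ? 0 : v > 255.0 ? 255 : static_cast<int>(v + 0.5);
}

void YuvToRgb(int y, int u, int v, int* r, int* g, int* b) {
  double c = 1.164 * (y - 16);                       // scale video-range luma
  *r = Clamp255(c + 1.596 * (v - 128));              // red from V
  *g = Clamp255(c - 0.813 * (v - 128) - 0.391 * (u - 128));
  *b = Clamp255(c + 2.018 * (u - 128));              // blue from U
}
```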
Comment 139•15 years ago
--- Comment #132 from Mike Kristoffersen <mkristoffersen@mozilla.com> 2010-01-25 00:27:08
> Note that as you are probably aware: Initially this integration is only meant
> for Maemo devices running Fennec - and a discussion of suggestion for other
> platforms should take place in the newsgroups and not here.

The bug summary/title doesn't say anything about this being specific to Maemo or Fennec. I'd certainly argue that the reasons it would be a good idea for Fennec on Maemo would apply outside that platform, and that the bug/feature/task tracking system here is the perfect place for that discussion. Even if it's really not, the bug should be updated to indicate the narrow scope of the discussion here.
Comment 140•15 years ago
(In reply to comment #139) As directed, this conversation belongs in the newsgroups, not in the bug.
Comment 141•15 years ago
(In reply to comment #136)
> (In reply to comment #135)
> > (In reply to comment #121)
> > > One more comment. gst_fennecvideosink_buffer_alloc() is nonsense. Please
> > > disable that as it is. Providing a buffer_alloc only makes sense if you *don't
> > > need to do a new[] in there. Like if you have a pool of lets say 4 buffers and
> > > recycle them. It makes even more sense if those buffers come from whatever
> > > composes the final view.
> >
> > Sorry missed giving an answer to this one previously - the reason for using the
> > buffer_alloc function is that ownership of the buffer needs to be passed on to
> > the media framework, which will use delete [] to release the memory, hence it
> > needs to be allocated with new [] - GStreamer uses alloc/free which are not
> > compatible with new/delete.
>
> Btw, I know of the free_func member of the gst_buffer type - but it was not
> introduced until 0.10.22 which is more recent than our current build chain for
> Fennec supports and was made with the support of the N810 is mind (which has
> now been dropped as far as GStreamer video is concerned), hence the alternative
> "risky" way of handling it that currently exists in the code. I'm considering
> alternatives...

Sorry from my side too. I did not see that gst_fennecvideosink_put() is taking ownership of the passed buffers. Not sure how easy this can be changed, but if you can change it, we could enable buffer pools on the dspdecoder. This would reduce the mmap overhead, as the DSP would reuse the very same buffers. The effects would not be dramatic, though...
Comment 143•15 years ago
Note: when building for Maemo you need at least the FREMANTLE SDK (see http://developer.mikek.dk/#post50 for how to set it up); this is due to the use of GStreamer features that aren't found in our normal Maemo build chain.

I'll be off-line for the next couple of weeks. If anyone wants to do work on this, searching for TODOs will give an initial starting point. The main areas that need work are:

1) Ensuring thread safety; it's currently being re-done, half-done, faulty, etc.
2) Unifying the assertions; replace with NS_ABORT_IF_FALSE as appropriate.
3) In case the audio pipeline doesn't support volume control, do on-off volume control.
4) Call start/stop progress at the correct places.
5) Send time-updated events at the correct places.
6) Fix seeking.
7) Make the patch work for audio-only streams.
8) Implement missing functions in nsGStreamerLink.
9) Fix the assert in nsPresContext::NotifyInvalidation.
Attachment #422961 -
Attachment is obsolete: true
Comment 144•15 years ago
(In reply to comment #129)
> Created an attachment (id=422961) [details]
> Patch updated to TOT as described in the comment section

Hi guys, a quick introduction: I'm a developer for the Fremantle browser. In our MicroB configuration we have that patch (id=422961) applied, and with it we ran into a nasty deadlock situation with a test page (http://jumi.lut.fi/~vanhoja/www2/misc/audio_dom_play_pause2.html). The situation basically was that an nsGStreamerInputBin was blocked in mNsMediaStream->Read while a shutdown was done. Calling mGStreamerState.Shutdown() in nsGStreamerPipeline::Shutdown resulted in a deadlock because of that unfinished read. I'll attach a patch showing how I ensured this kind of situation would not happen.
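The deadlock pattern described here (a shutdown waiting on a read that will never complete) is usually broken by making the blocking wait cancellable, so that Shutdown() wakes the blocked reader instead of waiting for it. This is a condition-variable sketch of that idea with hypothetical names; it is not the patch that was attached to the bug.

```cpp
#include <condition_variable>
#include <mutex>

// A blocking wait-for-data that can be interrupted by Shutdown(), so that
// pipeline teardown never has to wait on data that will never arrive.
class CancellableSource {
public:
  // Blocks until data arrives or the source is shut down.
  // Returns true if data arrived, false if the source was shut down.
  bool WaitForData() {
    std::unique_lock<std::mutex> lock(mMutex);
    mCond.wait(lock, [this] { return mHaveData || mShutdown; });
    return mHaveData && !mShutdown;
  }
  void SignalData() {
    { std::lock_guard<std::mutex> g(mMutex); mHaveData = true; }
    mCond.notify_all();
  }
  void Shutdown() {
    { std::lock_guard<std::mutex> g(mMutex); mShutdown = true; }
    mCond.notify_all();  // wakes any blocked reader instead of deadlocking
  }
private:
  std::mutex mMutex;
  std::condition_variable mCond;
  bool mHaveData = false;
  bool mShutdown = false;
};
```

The key design point is that both SignalData() and Shutdown() notify the same condition variable, so a reader parked in WaitForData() is always released by teardown.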
Comment 145•15 years ago
Comment 146•15 years ago
Seems like the makefile slipped out of my previous patch. It will take some time to generate an updated patch, so in the meantime this makefile goes into .../content/media/gstreamer
Comment 147•15 years ago
FYI: latest Opera 10.51 Linux builds now have GStreamer backend for HTML5 video. Perhaps a parity-opera flag for this bug now?
Comment 148•15 years ago
The identity of the backend seems irrelevant from an end user point of view (new features supported, or features lost, are what's relevant), so I don't think this should be parity-anything.
Comment 149•15 years ago
Considering that it seems there is no way Mozilla will support any format other than Ogg Vorbis and Theora in the **** Ogg container, unless it can disclaim support for other formats through something like GStreamer, it is definitely important to make it as likely as possible that GStreamer support gets integrated. If Mozilla doesn't want to support anything other than Theora and Vorbis in the official builds, then it could just include those codecs with a GStreamer bundled with the Firefox builds. Also, why shouldn't we take advantage of the work of the GStreamer people to offer high-quality codec and container support? GStreamer is a lot better than the current system for Firefox...
Comment 150•15 years ago
(In reply to comment #149)
> If Mozilla doesn't want to support anything other than Theora and Vorbis in the
> official builds, then you could just include those codecs with GStreamer
> bundled with the Firefox builds.

GStreamer is GPL, right? The GPL is not more permissive than the MPL/GPL/LGPL tri-license Mozilla currently uses, so we couldn't ship it.
Comment 152•15 years ago
Yes, GStreamer is LGPL, but many of the plugins are GPL. And regardless, I doubt that Mozilla would include any of the interesting plugins (i.e. gst-plugins-ffmpeg), so users would still need to install plugin packages. Most Linux distros already have it in their repositories, so I don't see any need to ship it on those platforms. Packaging is only really an issue on Windows systems, if Mozilla even wanted to include GST support on Windows. Builds are available from the GStreamer WinBuilds project ( http://www.gstreamer-winbuild.ylatuya.es ), so I don't think it's unreasonable to ask users to download and install it themselves. I'm not really sure of the state of GST packages on Mac OS X, though it is a supported platform.
Comment 153•15 years ago
(In reply to comment #152)
> Yes, GStreamer is LGPL, but many of the plugins are GPL. And regardless, I
> doubt that Mozilla would include any of the interesting plugins (i.e.
> gst-plugins-ffmpeg). So users would still need to install plugin packages.
>
> include GST support on Windows. Builds are available from the GStreamer
> WinBuilds project ( http://www.gstreamer-winbuild.ylatuya.es ), so I don't
> think it's unreasonable to ask users to download and install it themselves.

From GStreamer WinBuilds we also provide an LGPL build, which includes an LGPL version of ffmpeg (built without the -gpl option). This GStreamer build can decode many of the most common formats, since ffmpeg decoders are part of ffmpeg itself and don't rely on the external GPL libraries that are the ones triggering the GPL license. Therefore Mozilla could ship GStreamer binaries on Windows with (almost) full decoding support. The only problem is the H.264 decoder, for patent reasons, but ffmpeg can also be built without this decoder if needed.
Comment 155•15 years ago
I would ask everyone involved not to discuss and plan ahead of time. Let the technical problems be solved first; then the decision of including GStreamer support in the default builds can be discussed. As it stands, this is only adding noise to this bug (which is anyway not for non-technical discussion). Thank you.
I want to point out here that Mozilla doesn't want to ship libraries that are LGPL-only. We only ship libraries that are at least LGPL and MPL compatible. I don't really like this policy, but that's what it is. If you want to debate it, go to the mozilla.governance newsgroup, don't bother debating it here.
Comment 157•15 years ago
(In reply to comment #156)
> I want to point out here that Mozilla doesn't want to ship libraries that are
> LGPL-only. We only ship libraries that are at least LGPL and MPL compatible.
>
> I don't really like this policy, but that's what it is. If you want to debate
> it, go to the mozilla.governance newsgroup, don't bother debating it here.

Why ship the libraries? Do you ship GTK+ libraries with official Mozilla builds, for example?
Comment 158•15 years ago
(In reply to comment #157)
> Why ship the libraries ? Do you ship GTK+ libraries with official mozilla
> builds for example ?

No.
Comment 159•15 years ago
IMHO, I would rather use an unstable GStreamer backend that can be fixed over time than stick with the **** Flash implementation for video playback on Linux right now. For Windows, I think it is a bad idea to use GStreamer, since Windows has DirectShow, which is native and has an architecture like GStreamer's, if I am not wrong. Plus it would be easier to integrate with the Direct2D hardware acceleration that is now on trunk.
Description