Closed
Bug 1279885
Opened 9 years ago
Closed 9 years ago
(MSE) appendBuffer does not throw QuotaExceededError if SourceBuffer is full
Categories
(Core :: Audio/Video: Playback, defect, P1)
RESOLVED
INVALID
People
(Reporter: xqq, Unassigned)
References
Details
Attachments
(1 file: 92.35 KB, image/png)
User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2756.0 Safari/537.36
Steps to reproduce:
I implemented an MSE player, which transmuxes a media file into fragmented MP4(s) from a continuous HTTP stream.
The generated MP4 fragments are appended in sequence via the SourceBuffer.appendBuffer() method.
When the append speed (determined by network speed) is extremely high compared to the playback speed, SourceBuffer eviction may fail (because the leading data is still being played back), and the SourceBuffer may transition into the buffer-full state.
At the same time, MP4 fragments are still being generated and appended.
Actual results:
The progress bar's buffered area never updates after the SourceBuffer becomes full.
The appendBuffer() method is still called cyclically by the player, but no exceptions are received, even though the SourceBuffer is full.
The buffered ranges (video.buffered) become scattered and discontinuous, which can be viewed in about:media (see attached picture).
Expected results:
According to the Prepare Append algorithm in the MSE spec [https://w3c.github.io/media-source/#sourcebuffer-prepare-append],
the SourceBuffer.appendBuffer() method should throw a QuotaExceededError if the target SourceBuffer is full, instead of silently accepting the data.
Also, the video.buffered ranges should form a single continuous range.
The player implementation should be notified when the browser's SourceBuffer is full, so the player can abort the fetch-transmux-append process.
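The behaviour the reporter expects can be sketched as a small wrapper around appendBuffer(). This is only an illustration of the expected contract; the function name and return convention are not from the reporter's player:

```javascript
// Sketch (illustrative names, not from the reporter's player).
// Wrap appendBuffer() so a full SourceBuffer surfaces as a distinct signal
// the player can react to, instead of failing silently.
function tryAppend(sourceBuffer, chunk) {
  try {
    sourceBuffer.appendBuffer(chunk);
    return true; // append accepted
  } catch (e) {
    // Per the spec, a full buffer should surface as QuotaExceededError.
    if (e.name === "QuotaExceededError") {
      return false; // caller should pause the fetch-transmux-append loop
    }
    throw e; // anything else is a genuine error
  }
}
```

A player loop could then stop transmuxing as soon as `tryAppend` returns false, rather than appending blindly.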
Severity: normal → major
Component: Untriaged → Audio/Video: Playback
Depends on: MSE
Priority: -- → P1
Product: Firefox → Core
Sorry about my user agent; I wrote this issue in Chrome.
This problem can be reproduced on Firefox 46, on Windows 10 / Ubuntu 15.10.
Comment 2•9 years ago
You're reading the spec wrong.
Step 5 "Run the coded frame eviction algorithm."
This will run as per
https://w3c.github.io/media-source/#sourcebuffer-coded-frame-eviction
This determines what can be evicted and runs the coded frame removal algorithm, which states in step 4:
"If buffer full flag equals true and this object is ready to accept more bytes, then set the buffer full flag to false."
Now, following this, we return to the prepare append algorithm, which states in step 5:
"If the buffer full flag equals true, then throw a QuotaExceededError exception and abort these steps."
So, because we've been able to evict data, and that there's space to add new data. The quota exceeded error will. To be fired.
If you continue to append at this stage, you will no longer get a continuous buffered range because data got evicted.
As per spec, and mentioned in the coded frame eviction algorithm:
"Implementations may use different methods for selecting removal ranges so web applications should not depend on a specific behavior. The web application can use the buffered attribute to observe whether portions of the buffered data have been evicted."
The eviction policy we use is to first attempt to evict data that has already been played. If none can be evicted, we then evict future data, further than 30s ahead of the current position.
It is up to the player to check on the buffered range after calling appendBuffer to determine if eviction occurred and adjust accordingly.
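The post-append check described above can be sketched as a small range-diff helper. As an assumption for illustration, buffered ranges are represented here as plain [start, end] pairs; a real player would read them from sourceBuffer.buffered, a live TimeRanges object:

```javascript
// Sketch of the post-append eviction check. Assumption: ranges are plain,
// sorted [start, end] pairs (TimeRanges are always sorted and disjoint).
// Returns the portions of "before" that no longer appear in "after".
function findEvictedRanges(before, after) {
  const evicted = [];
  for (const [start, end] of before) {
    let cursor = start;
    for (const [aStart, aEnd] of after) {
      if (aEnd <= cursor || aStart >= end) continue; // no overlap
      if (aStart > cursor) evicted.push([cursor, aStart]); // gap before this range
      cursor = Math.max(cursor, aEnd);
    }
    if (cursor < end) evicted.push([cursor, end]); // tail not covered
  }
  return evicted;
}
```

Snapshotting the ranges before appendBuffer() and diffing afterwards tells the player exactly which time spans were evicted and may need re-fetching.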
For what it's worth, I can't think of any browser that behaves as you're suggesting, so your player will misbehave rather consistently, regardless of the browser.
Comment 3•9 years ago
Fwiw, the YouTube MSE conformance tests perform the following verification.
They continuously append data until a gap is detected in the buffered range. From this they calculate the maximum size of a SourceBuffer.
http://yt-dash-mse-test.commondatastorage.googleapis.com/unit-tests/2016.html?timestamp=1465820916804
Comment 4•9 years ago
Rather than "The quota exceeded error will. To be fired." I meant to write "the quota exceeded error will not be fired"
Thanks for your reply.
Actually, my player works well on Chrome/IE11/Edge. On those platforms, a QuotaExceededError is always thrown by the appendBuffer() call if the SourceBuffer is full and no data can be evicted, so I can detect the SourceBuffer state directly and abort the transmuxing process.
You said that if no played data can be evicted, then future data, further than 30s from the current position, will be evicted. In my opinion, this is a strange and incorrect policy. Prepared media data that has not yet been played should not be evicted silently; it will be played in the near future.
From the player's side, I just appended the media data for future playback, but you evict it soon after, without telling me about the lack of space. Obviously the player will not keep any backup source data, because the fragmented MP4 has already been generated and appended to the browser. The MP4 fragments will not be retained either, or there would be a large waste of memory.
Under this strange policy, it seems the player needs to re-fetch the source media data for the evicted time range, re-transmux it, and re-append it to that area to ensure seamless playback.
But it is hard to fetch media data for an arbitrary time range. Not all media file formats/protocols support this, especially in a network environment. This leads to messy implementations.
In short, the eviction policy of Firefox is strange and problematic.
Comment 6•9 years ago
AFAIK, Chrome has the same future-data eviction policy. This can be verified in the MSE test I mentioned above: test number 20 "dash latency" or 29 "videobuffersize".
We actually had to implement it that way and reduce the point at which we throw an error, as throwing an error always caused an irrecoverable error with YouTube. And this is the case for many DASH players: throw an exception and they enter an error mode.
The default SourceBuffer size is 100 MiB for video and 30 MiB for audio (it used to be the same for both until recently).
In any case, our implementation is 100% spec compliant.
While I understand your concern, feeding data indefinitely until you can't is a bad approach. Basing the choice of stream solely on network speed is often not sufficient: it may well be that the user agent is unable to play the stream without dropping frames, etc.
As such, you're better off loading in advance only what you know you will need and be able to play. That is, limit the buffer ahead to a few seconds, rather than downloading as much as you can.
I have resolved the issue with lazy loading in the player, which limits the buffer ahead to about 3 minutes.
The connection is then aborted, and resumes when playback reaches near the end of the buffered area.
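The lazy-loading scheme the reporter settled on can be sketched as a small hysteresis check. The 180 s cap and 10 s resume margin are illustrative numbers (the reporter mentions roughly 3 minutes); in a real player, bufferedEnd would come from video.buffered.end(...):

```javascript
// Sketch of the lazy-loading decision. Assumptions: the thresholds are
// illustrative; bufferedEnd would be read from video.buffered in practice.
// Returns the new fetching state, with hysteresis so the player doesn't
// flap between pausing and resuming the connection.
function nextFetchState(fetching, currentTime, bufferedEnd,
                        maxAheadSec = 180, resumeMarginSec = 10) {
  const ahead = bufferedEnd - currentTime;
  if (fetching && ahead >= maxAheadSec) return false; // enough buffered: abort connection
  if (!fetching && ahead <= resumeMarginSec) return true; // near buffered end: resume
  return fetching; // in between: keep current state
}
```

Calling this on each timeupdate event would drive the abort/resume behaviour described above.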