[spawned from bug 1168040] The bipbop-lateaudio file has two tracks:
- A video track that is 2.4017s long and starts at 0s.
- An audio track that is 1.950476s long and starts at 1.147982s.

The last video sample has a time of 2.368333s and a duration of 0.033367s. The last audio sample has a time of 3.052018s and a duration of 0.046440s.

Our code assumes that the media duration is max(audio_duration, video_duration), so in this case 2.4017s. But when we finish playing this video, our currentTime is 3.098458s, and as such our duration is set to 3.098458s.

What should the duration be? Should we continue playing the audio after reaching the end of the video track?

Here is such a video: http://people.mozilla.org/~jyavenard/tests/bipbop.html
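To make the discrepancy concrete, here is a small sketch (illustrative only, not Gecko code) of the two ways of computing the duration from the track metadata above:

```python
# Track metadata for bipbop-lateaudio (from the bug description).
video_start, video_duration = 0.0, 2.4017
audio_start, audio_duration = 1.147982, 1.950476

# What our code currently assumes: the longer of the two track durations,
# ignoring when each track starts.
assumed_duration = max(audio_duration, video_duration)  # 2.4017

# What we actually observe at the end of playback: the end of the media
# timeline, i.e. the latest track end time.
video_end = video_start + video_duration  # 2.4017
audio_end = audio_start + audio_duration  # 3.098458
observed_duration = max(video_end, audio_end)  # 3.098458

print(assumed_duration, round(observed_duration, 6))
```

The 0.696758s difference between the two values is exactly the portion of the audio track that plays after the video track has ended.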
http://dev.w3.org/html5/spec-preview/media-elements.html#dom-media-duration "The duration attribute must return the time of the end of the media resource, in seconds, on the media timeline." I think the correct duration should be 3.098458s and we should play to the end of media resource no matter which track ends first.
(In reply to JW Wang [:jwwang] from comment #1)
> http://dev.w3.org/html5/spec-preview/media-elements.html#dom-media-duration
>
> "The duration attribute must return the time of the end of the media
> resource, in seconds, on the media timeline."
>
> I think the correct duration should be 3.098458s and we should play to the
> end of media resource no matter which track ends first.

There's more than one way to see this, and I agree that under some circumstances it feels like that's the way to do it. For MediaSource, however, the spec does state that if you have audio data but no video data, you're supposed to stall, which would make the behaviour inconsistent.

There's also the issue of what duration we should report after loadedmetadata. While the audio track is 1.95s long, it only starts at 1.14s. So shouldn't we report then that the media duration is 3s too? The information on when the audio or video track starts isn't found in the metadata, so we must demux the first frame to get it.

And finally, our mochitests (in particular test_playback.html) expect the duration read after loadedmetadata to match the duration found at the end of playback.
http://dev.w3.org/html5/spec-preview/media-elements.html#event-media-durationchange

durationchange is used to update the duration when we have more data to determine it more accurately. For MSE, things can get a little trickier. However, for a non-MSE case like test_playback.html, we are supposed to get a duration from metadata that matches that of the end of playback.
(In reply to JW Wang [:jwwang] from comment #3)
> However for non-MSE case like
> test_playback.html, we are supposed to get a duration from metadata that
> matches that of the end of playback.

In this particular example, this won't happen.
For non-MSE media, the approach I thought we took was: take the start time to be min(audio_start_time, video_start_time) and the end time to be max(audio_end_time, video_end_time); the duration should then be (end_time - start_time), and we should play silence or the first/last video frame of the not-yet-started/finished stream in order to play across the gap.

For MSE media... I'm not sure. Does the spec also say that if we have video but a gap in the audio stream we should also stall?
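The non-MSE approach above can be sketched as follows (a minimal illustration with hypothetical names, not actual Gecko identifiers):

```python
def media_time_range(audio_start, audio_end, video_start, video_end):
    """Return (start_time, duration) for the union of both tracks'
    time ranges, per the approach described above."""
    start_time = min(audio_start, video_start)
    end_time = max(audio_end, video_end)
    return start_time, end_time - start_time

# bipbop-lateaudio: video covers [0, 2.4017], audio covers
# [1.147982, 3.098458] on the media timeline.
start, duration = media_time_range(
    audio_start=1.147982, audio_end=3.098458,
    video_start=0.0, video_end=2.4017)

# During [0, 1.147982] we would play silence for the not-yet-started
# audio track; during [2.4017, 3.098458] we would hold the last video
# frame while the audio finishes.
print(start, round(duration, 6))
```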
Component: Audio/Video → Audio/Video: Playback