Open Bug 1222874 Opened 4 years ago Updated 1 year ago
Drop concept of frame's duration in raw and decoded samples
In most playback code, we currently require knowing the duration of a decoded frame in advance. Almost no container other than MP4 has a concept of frame duration, so determining the frame duration in other containers (e.g. MP3, Ogg, WebM) forces us to jump through hoops such as demuxing ahead to find the start time of the next frame, and causes problems for the last frame, whose duration we can't determine accurately. A video frame should be displayed until a new one is available or until we've reached EOS, regardless of its duration. For audio, we can determine the duration simply by looking at the number of frames inside the decoded sample block. Handling the last frame would then amount to displaying it until the last audio sample has actually been played. This would likely simplify our code and provide more intuitive behaviour. I'm also hoping it would let us share more code between A/V playback and WebRTC (which requires a frame to be displayed until the next one).
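The audio case described above can be sketched as follows. This is a hypothetical illustration, not actual Gecko code: the function name and units are assumptions, but the idea matches the comment, the block's duration falls out of its decoded frame count and sample rate, with no demuxer-reported duration needed.

```python
# Hypothetical sketch (not Gecko code): derive an audio block's duration
# from its decoded frame count and sample rate, rather than trusting a
# per-sample duration supplied by the demuxer.

from fractions import Fraction

def audio_block_duration_us(num_frames: int, sample_rate: int) -> int:
    """Duration in microseconds of a decoded audio block.

    num_frames: number of PCM frames (samples per channel) in the block.
    sample_rate: frames per second.
    """
    # Exact rational arithmetic avoids drift when summing many blocks.
    return int(Fraction(num_frames, sample_rate) * 1_000_000)

# Example: a typical MP3 frame holds 1152 PCM frames; at 44100 Hz that
# is roughly 26122 microseconds.
```

Using exact rational arithmetic (rather than floating point) matters when durations of many consecutive blocks are summed, since rounding drift would otherwise accumulate into A/V desync.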
Sounds reasonable. Currently MSG needs to know frame durations, but that should go away in bug 1201363.
4 years ago
Priority: -- → P2
(In reply to Jean-Yves Avenard [:jya] from comment #0)
> A video frame should be displayed until a new one is available or until
> we've reached EOS, regardless of its duration.

I wonder how long the last video frame should last under this scenario, since EOS doesn't carry a timestamp indicating the end of the previous frame.
I assumed it would be displayed until the audio stops. We don't know the duration of the last frame accurately for anything but MP4 anyway.
Mass change P2 -> P3
Priority: P2 → P3