Closed
Bug 898771
Opened 11 years ago
Closed 11 years ago
Media Recording - The second piece of blob data retrieved from ondataavailable is unplayable media content during a recording with no timeslice
Categories
(Core :: Audio/Video: Recording, defect)
RESOLVED
INVALID
People
(Reporter: jsmith, Unassigned)
STR
1. Go to http://mozilla.github.io/qa-testcase-data/webapi/mediarecorder/index.html
2. Check microphone and hit setup
3. In the console, type mediaRecorderList[0].start();
4. Wait a few seconds and generate some sound into your mic
5. In the console, type mediaRecorderList[0].requestData();
6. Wait a few seconds and generate different sounds into your mic
7. In the console, type mediaRecorderList[0].requestData();
8. Try playing each piece of blob media generated by selecting "download data available" for each link available

Expected
Both blobs should be playable media content containing the sound generated from the time the recording started to the time the requestData call was made.

Actual
The first piece of blob data is playable, but the second is not. The second piece of blob data reports "video can't be played because the file is corrupt."

Note - this will also happen with a third call to requestData. It seems that only the first piece of blob data generated from ondataavailable is playable right now; any follow-up ondataavailable events carry unplayable media content.
Reporter
Updated•11 years ago
Blocks: MediaRecording
Comment 1•11 years ago
Hi Roc, we only generate the Ogg header on the first blob. Is it possible to support playback on every blob? Also, since the Blob object allows slice operations, how can we keep the header in the right position?
Flags: needinfo?(roc)
Comment 2•11 years ago

My understanding of the spec is that this bug is invalid; i.e., individual blobs are not individually decodable in general; they must be concatenated to produce a usable resource. And that's what we implemented.

Unfortunately I can't find anything in the spec which says that explicitly. Discussions on the list imply it, and if we had to make each individual Blob playable, that would severely hurt performance if the author requests low-latency dataavailable events. So I'm going to mark this bug INVALID and follow up on the list to get the spec clarified.
Status: NEW → RESOLVED
Closed: 11 years ago
Flags: needinfo?(roc)
Resolution: --- → INVALID
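The behavior comment 2 describes (only the first blob is independently decodable) implies a usage pattern: collect every dataavailable chunk and concatenate them all into a single Blob before playback or download. A minimal sketch of that concatenation, using simulated byte chunks in place of real encoder output (the "OggS" bytes and chunk contents here are illustrative, not real Ogg pages):

```javascript
// Chunks from successive dataavailable events are not individually playable;
// only the concatenation of all chunks forms a usable resource.
const chunks = [];

// Stand-in for mediaRecorder.ondataavailable; e.data would be a Blob.
function onDataAvailable(e) {
  chunks.push(e.data);
}

// Simulate two dataavailable events. The first chunk carries the container
// header ("OggS" magic bytes, illustrative only); follow-up chunks do not.
onDataAvailable({ data: new Blob([new Uint8Array([0x4f, 0x67, 0x67, 0x53])]) });
onDataAvailable({ data: new Blob([new Uint8Array([0x01, 0x02, 0x03])]) });

// Concatenate everything into one Blob for playback/download.
const full = new Blob(chunks, { type: "audio/ogg" });
full.arrayBuffer().then((buf) => {
  console.log(buf.byteLength); // 4 header bytes + 3 data bytes = 7
});
```

With a real recorder the same pattern applies: push each `event.data` into an array, and build `new Blob(chunks, { type: recorder.mimeType })` when you need a playable resource.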
Reporter
Updated•11 years ago
No longer blocks: MediaRecording
Reporter
Comment 3•11 years ago
(In reply to Robert O'Callahan (:roc) (Mozilla Corporation) from comment #2)
> My understanding of the spec is that this bug is invalid; i.e., individual
> blobs are not individually decodable in general; they must be concatenated
> to produce a usable resource. And that's what we implemented.

I could see that making sense for the timeSlice use case (e.g. mediaRecorder.start(1000)), as that slices the recording into a series of blobs of which only the first carries the Ogg header. But I'm puzzled why this would affect the no-parameters case - each blob sent to ondataavailable should be media playback from the time the recording started to when ondataavailable fired. Meaning:

1. start() <-- time = 0 seconds
2. wait five seconds
3. call requestData() <-- time = 5 seconds
4. wait five seconds
5. call requestData() <-- time = 10 seconds

[3] should encompass 0 - 5 seconds. [5] should encompass 0 - 10 seconds. Why wouldn't [5] include the Ogg header in the blob?

> Unfortunately I can't find anything in the spec which says that explicitly.
> Discussions on the list imply it, and if we had to make each individual Blob
> playable, that would severely hurt performance if the author requests
> low-latency dataavailable events. So I'm going to mark this bug INVALID and
> follow up on the list to get the spec clarified.

Agreed on the timeSlice case here, although I'd like clarification on why this would affect the no-timeSlice case. Won't reopen, but would be curious to know why.
Blocks: MediaRecording
Reporter
Comment 4•11 years ago
Disregard; I was incorrect in my comment above. The spec indicates that a new blob is started after requestData is called, so in the above example [3] encompasses 0 - 5 seconds and [5] encompasses 5 - 10 seconds, since a new blob was created at the call of requestData.
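The corrected reading above can be sketched with a toy model. FakeRecorder is a hypothetical stand-in for illustration, not the real MediaRecorder: requestData() hands out the data gathered since the previous dataavailable event and then starts a new blob, so successive blobs cover disjoint time ranges.

```javascript
// Toy model of the spec behavior described above: requestData() emits what
// has accumulated since the previous dataavailable event, then resets the
// buffer. FakeRecorder is illustrative only; a real MediaRecorder fires
// dataavailable events carrying Blobs of encoded media.
class FakeRecorder {
  constructor() {
    this.buffered = []; // data gathered since the last dataavailable
    this.chunks = [];   // "blobs" handed to ondataavailable so far
  }
  record(sample) {
    this.buffered.push(sample);
  }
  requestData() {
    this.chunks.push(this.buffered); // fire dataavailable with current blob
    this.buffered = [];              // start a new blob, per the spec
  }
}

const r = new FakeRecorder();
r.record("audio 0-5s");
r.requestData(); // blob [3]: covers 0 - 5 seconds
r.record("audio 5-10s");
r.requestData(); // blob [5]: covers 5 - 10 seconds, NOT 0 - 10 seconds

console.log(r.chunks); // [ [ 'audio 0-5s' ], [ 'audio 5-10s' ] ]
```

This matches why only the first blob is playable: the container header is written once, into the first blob, and later blobs start mid-stream.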
Reporter
Updated•11 years ago
No longer blocks: MediaRecording
Updated•10 years ago
Component: Video/Audio → Video/Audio: Recording