Video element made with getUserMedia cannot have its audio captured with the Web Audio API

Status: NEW
Assignee: Unassigned
Product: Core
Component: Audio/Video: Playback
Priority: P3
Severity: normal
Reported: 2 years ago
Modified: a year ago
Reporter: brad
Version: 47 Branch
Points: ---
Firefox Tracking Flags: (Not tracked)
Attachments: 1 attachment

Description (Reporter, 2 years ago)
Created attachment 8722103 [details]
ff-getusermedia-video-audio-bug-test.html

User Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36

Steps to reproduce:

Open the attached page, and allow Firefox to capture audio and video from your microphone/camera. Two video elements will appear: one pre-recorded, and one from your camera.

Tested on v47.0a1.


Actual results:

When the Web Audio API uses the MediaStream from the pre-recorded video, that video's audio is redirected into the Web Audio API as expected. If you comment out line 32 (`vPreRecorded`), uncomment line 33 (`vUserMedia`), and reload the page, the video from your camera should then have its audio routed into the Web Audio API. However, it is not re-routed as expected: the audio continues to play through the default sound output device, and no audio comes through the node.


Expected results:

Regardless of the video's source, I would expect the behavior to be the same: when a video's audio is used with the Web Audio API, that audio should be re-routed into the Web Audio API graph and effectively muted from the default output.

From https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaElementSource:

> Note: As a consequence of calling createMediaElementSource(), audio playback from the HTMLMediaElement will be re-routed into the processing graph of the AudioContext.
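For context, the reported behavior could be reproduced with a page along these lines. This is a hedged sketch, not the actual attachment (ff-getusermedia-video-audio-bug-test.html); element IDs and variable names are illustrative.

```javascript
// Minimal sketch of the reported scenario (hypothetical; the real test
// page is in the attachment). Runs in a browser, not in Node.
const audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then((stream) => {
    const vPreRecorded = document.querySelector('#prerecorded'); // <video src="clip.webm">
    const vUserMedia = document.querySelector('#live');
    vUserMedia.srcObject = stream;
    vUserMedia.play();

    // Case 1 (works): audio is re-routed into the graph and muted
    // from the speakers, per the MDN note quoted above.
    const source = audioCtx.createMediaElementSource(vPreRecorded);

    // Case 2 (the bug): with the gUM-backed element instead, audio
    // keeps playing through the default output and nothing reaches
    // the node.
    // const source = audioCtx.createMediaElementSource(vUserMedia);

    source.connect(audioCtx.destination);
  });
```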

Updated (2 years ago)
Component: Untriaged → WebRTC: Audio/Video
Product: Firefox → Core
This appears to be a bug in how HTMLMediaElement deals with MediaStream sources (note the example doesn't use srcObject = stream, which would be better, but instead src = URL.createObjectURL(stream)).  I imagine this would occur with a stream via stream = vPreRecorded.captureStreamUntilEnded(); it likely has nothing to do with getUserMedia per se.
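The two ways of attaching a MediaStream mentioned above can be sketched as follows (a hedged sketch; variable names are illustrative, and `captureStreamUntilEnded` is quoted from the comment above rather than verified against a current API surface):

```javascript
// Older pattern, as the test page reportedly does it. Passing a
// MediaStream to URL.createObjectURL() was later deprecated and
// removed from browsers.
// vUserMedia.src = URL.createObjectURL(stream);

// Preferred pattern: assign the stream directly.
vUserMedia.srcObject = stream;
vUserMedia.play();

// The comment speculates the same bug would occur with a stream
// captured from a media element, e.g.:
// const stream2 = vPreRecorded.captureStreamUntilEnded();
```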
Status: UNCONFIRMED → NEW
Component: WebRTC: Audio/Video → Audio/Video: Playback
Ever confirmed: true
Randell - is this issue important?
Flags: needinfo?(rjesup)
I'm pretty sure one can work around this (by feeding the gUM stream into WebAudio directly, instead of from the media element), but it may not be obvious to someone running into the problem that such a workaround is needed and exists.

I'd say not important, but also not totally minor/irrelevant. More the "annoying, non-intuitive quirk/thorn-in-one's-side" sort of thing. Due to habit, or cargo-culting webaudio examples that start with a media element as a source, more than a few developers will just feed data in from a random media element; then it becomes "do they get frustrated and quit, or do they figure out/find the workaround?" If I'd coded it this way, and wasn't a MSG/gUM/WebAudio expert, I'm not sure I would figure it out.
Flags: needinfo?(rjesup)
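The workaround described above (feeding the gUM stream into Web Audio directly, rather than via the media element) might look like this. A hedged sketch with illustrative names; it uses only standard Web Audio calls (`createMediaStreamSource`, `createAnalyser`).

```javascript
// Workaround sketch: bypass the media element entirely and make the
// MediaStream itself the Web Audio source. Runs in a browser.
const audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true })
  .then((stream) => {
    // Unlike createMediaElementSource on a gUM-backed <video>, this
    // routes the microphone audio into the graph directly.
    const source = audioCtx.createMediaStreamSource(stream);
    const analyser = audioCtx.createAnalyser();
    source.connect(analyser);
    // Connect to destination only if you actually want to hear the
    // microphone (beware feedback):
    // analyser.connect(audioCtx.destination);
  });
```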
That translates to P2 then.
Priority: -- → P2
Mass change P2 -> P3
Priority: P2 → P3