(In reply to Matthew Gregan [:kinetik] from comment #2)
> I'm concerned this will lead to performance issues as we're now going to serialize every cubeb_stream callback on a single thread. It's simple to revert, though, so we can try it.
It should be fine. Most of the time those threads are not heavily loaded, and the stream callbacks don't all fire at exactly the same time. Down the line there is a mixer that has to mix everything and do its job anyway, and it doesn't run on multiple threads either. When/if we support multiple output devices, it might be different.
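To make the serialization concrete, here is a minimal sketch (not the actual cubeb or AudioIPC code; all names are hypothetical) of the pattern being discussed: callbacks from many streams are dispatched onto one stable worker thread and run strictly in order.

```cpp
#include <condition_variable>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>

// Hypothetical sketch: a tiny single-thread executor. Every stream's
// callback is enqueued here and runs sequentially on one stable thread,
// which is what "serialize every cubeb_stream callback" amounts to.
class SerialCallbackThread {
 public:
  SerialCallbackThread() : mWorker([this] { Run(); }) {}

  ~SerialCallbackThread() {
    {
      std::lock_guard<std::mutex> lock(mMutex);
      mDone = true;
    }
    mCond.notify_one();
    mWorker.join();  // drains remaining tasks before exiting
  }

  // Called from any stream's callback context.
  void Dispatch(std::function<void()> aTask) {
    {
      std::lock_guard<std::mutex> lock(mMutex);
      mQueue.push_back(std::move(aTask));
    }
    mCond.notify_one();
  }

 private:
  void Run() {
    for (;;) {
      std::function<void()> task;
      {
        std::unique_lock<std::mutex> lock(mMutex);
        mCond.wait(lock, [this] { return mDone || !mQueue.empty(); });
        if (mQueue.empty()) {
          return;  // mDone was set and the queue is drained
        }
        task = std::move(mQueue.front());
        mQueue.pop_front();
      }
      task();  // every callback runs here, on the same thread
    }
  }

  std::mutex mMutex;
  std::condition_variable mCond;
  std::deque<std::function<void()>> mQueue;
  bool mDone = false;
  std::thread mWorker;  // declared last so the other members exist first
};
```

The trade-off in the thread is visible here: the queue is trivially simple to revert, but it also caps throughput at one callback at a time, which is why the multi-thread-pool option matters for AudioIPC.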
> I'd prefer to retain the ability to use a multi-thread pool here, so if any upstream code starts depending on being called from a single stable thread that will cause problems for planned improvements in AudioIPC.
There is a hard requirement that an `AudioContext` running WASM in an `AudioWorklet` always runs it on the exact same thread, because SpiderMonkey relies on TLS. We had to come up with https://searchfox.org/mozilla-central/source/dom/media/GraphRunner.cpp#26 because this is not yet the case.
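The TLS constraint can be illustrated with a tiny sketch (illustrative only; the names are made up, not SpiderMonkey's API): per-thread state initialized on one thread is simply invisible from any other thread, so a callback that migrates threads loses its context.

```cpp
#include <thread>

// Illustrative only: a SpiderMonkey-like runtime keeps per-thread state
// in thread-local storage. A "context" initialized on thread A cannot be
// seen from thread B, so the code must keep running on thread A.
thread_local int* tlsContext = nullptr;

int gContextStorage = 42;  // stand-in for real runtime state

// Hypothetical setup step, analogous to binding a JS runtime to a thread.
void InitOnThisThread() { tlsContext = &gContextStorage; }

// Returns true iff the TLS set up by InitOnThisThread is visible here.
bool ContextIsVisible() { return tlsContext != nullptr; }
```

This is the failure mode GraphRunner guards against: if the graph's callbacks were allowed to hop between pool threads, `ContextIsVisible()` would return false on every thread except the one that ran the setup.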
This is, however, just for a specific `AudioContext`. Ideally there would be multiple system-level streams in the parent and multiple threads in each child, with a 1:1 relationship between them. We could certainly group together all the high-latency `AudioStream`s playing to the default device.