Closed Bug 1056458 Opened 7 years ago Closed 7 years ago

Expose automation events on the server for the web audio editor

Categories

(DevTools Graveyard :: Web Audio Editor, defect)

x86
macOS
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED FIXED
Firefox 37

People

(Reporter: ehsan.akhgari, Assigned: jsantell)

References

Details

Attachments

(1 file, 3 obsolete files)

It would be awesome if we built a visualizer that showed the value of an AudioParam over time.  The Web Audio spec defines how automation events are to be interpreted, and based on those rules we should be able to build a graph of the value changes over time.

If you agree, I can add a ChromeOnly API which gives you access to these automation events, because the content-visible AudioParam interface doesn't expose them.

The algorithm to compute the value at a given point in time is kind of complicated, but it's already implemented in C++ here: <http://mxr.mozilla.org/mozilla-central/source/content/media/AudioEventTimeline.h#263>.  We should be able to rewrite it in JS fairly easily.  However, that is suboptimal from a performance point of view, since the algorithm is not efficient for computing a large range of adjacent times, so we may want to develop an algorithm better suited to filling ranges of times.
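The interpolation rules for the simplest event types are spelled out in the Web Audio spec. A minimal JS sketch of the per-time computation might look like the following — the event object shape and `getValueAtTime` name are illustrative, this only handles three event types, and it is not a port of the actual `AudioEventTimeline.h` code:

```javascript
// Compute an AudioParam's value at time t from a time-sorted list of
// automation events ({ type, value, time }), using the linear and
// exponential ramp formulas from the Web Audio spec. Simplified sketch:
// setTargetAtTime, setValueCurveAtTime, etc. are not handled.
function getValueAtTime(events, defaultValue, t) {
  let prev = null, next = null;
  for (const e of events) {
    if (e.time <= t) {
      prev = e;
    } else {
      next = e;
      break;
    }
  }
  // Before the first event, the param still has its default value.
  if (!prev) {
    return defaultValue;
  }
  // If the next event is a ramp, interpolate between prev and next.
  if (next && next.type === "linearRampToValueAtTime") {
    const f = (t - prev.time) / (next.time - prev.time);
    return prev.value + (next.value - prev.value) * f;
  }
  if (next && next.type === "exponentialRampToValueAtTime") {
    const f = (t - prev.time) / (next.time - prev.time);
    return prev.value * Math.pow(next.value / prev.value, f);
  }
  // Otherwise the value simply holds at the previous event's value.
  return prev.value;
}
```

Note the linear scan for prev/next here runs for every queried time, which is exactly the inefficiency described above when sampling a large range of adjacent times.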

Jason, what do you think?
Flags: needinfo?(jsantell)
I was planning on hacking on this during the DevTools work week at the end of September :D My tentative strategy was to patch into the automation method calls, track them in the tools, and ultimately display something like the figure in the docs: http://webaudio.github.io/web-audio-api/images/audioparam-automation1.png

Naively, I was thinking we could easily compute the curves client side, but once there are multiple automation points overlapping and mixing, it might be more difficult. Another issue: since `currentTime` is constantly increasing even without new automation points, does everything just shift leftwards, or should the graph only render up until that point, etc.?

I previously filed a bug for this, covering listing automation points (types and times), with the possibility of graph drawing. (bug 1007876)

Wouldn't it be more performant to draw the "nodes" in a line graph and interpolate what kind of curve each segment should be? But again, that might be naive...
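The "patch into the automation method calls" strategy can be sketched in isolation. The names here (`recordAutomation`, `AUTOMATION_METHODS`, the stub param) are hypothetical, and the actual tool routes this through the DevTools call-watcher machinery rather than monkey-patching directly:

```javascript
// Wrap each automation method so every call is recorded (method name plus
// arguments) before being forwarded to the original implementation.
const AUTOMATION_METHODS = [
  "setValueAtTime",
  "linearRampToValueAtTime",
  "exponentialRampToValueAtTime",
  "setTargetAtTime",
];

function recordAutomation(param, log) {
  for (const name of AUTOMATION_METHODS) {
    const original = param[name];
    param[name] = function (...args) {
      log.push({ method: name, args });
      return original.apply(this, args);
    };
  }
  return param;
}

// A stub standing in for a live AudioParam, for demonstration only.
const stubParam = {
  setValueAtTime() {}, linearRampToValueAtTime() {},
  exponentialRampToValueAtTime() {}, setTargetAtTime() {},
};
const log = [];
recordAutomation(stubParam, log);
stubParam.setValueAtTime(0.5, 0);
stubParam.linearRampToValueAtTime(1, 2);
// log now holds the two automation events in call order.
```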
Flags: needinfo?(jsantell)
(In reply to Jordan Santell [:jsantell] [@jsantell] [PTO until September 4th] from comment #1)
> I was planning on hacking on this during the dev tools work week at the end
> of September :D My tentative strategy was to patch into the automation method
> calls and tracking them in the tools, and ultimately display something like
> in the docs:
> http://webaudio.github.io/web-audio-api/images/audioparam-automation1.png

Yeah, I think you should be able to intercept those methods...  Anyways, let me know if that doesn't work out.

> Naively, I was thinking we could easily compute the curves client side, but
> once there are multiple points in the automation, overlapping and mixing,
> might be more difficult.

You can definitely compute these curves client side.  The rules are pretty complicated and it took me a lot of effort to implement this properly, but it's totally doable (though not an afternoon project).

> Another issue might be, as `currentTime` is
> constantly increasing, even without new automation points, is everything
> just shifting leftwards, or should it only render up until that point, etc..

Each AudioParam effectively has a conceptual curve internally, and nothing shifts as time passes.  Obviously it can be modified at runtime through the AudioParam APIs as currentTime advances.

> Made a bug previously for this, and possibly listing automation points
> (types and time), with possibility for graph drawing. (bug 1007876)

Feel free to dupe this one against that bug if needed.

> Wouldn't it be more performant to draw the "nodes" in a line graph, and
> interpolate what kind of curve it should be? But again, that might be naive..

That is basically the algorithm, yes, but some types of events affect the curve before them and some affect the curve after them, so you basically need to look at both the next and previous event.  The spec should be clear on those rules; they're just complicated.

Note that the approach in the code in comment 0 is a bit different: we first find the previous and next events and then interpolate between them, without reusing any knowledge from the previous round, so we end up repeating the same computation for the same next/prev event pairs.  That is the potential improvement I mentioned.
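That range-oriented improvement can be sketched as follows. Assuming both the event list and the sample times are sorted ascending, a single cursor into the event list means each prev/next pair is located only once instead of being re-searched per sample (`fillRange` is a hypothetical helper, not code from the patch, and only linear ramps are interpolated here):

```javascript
// Fill values for a sorted array of sample times in one pass over the
// sorted event list, advancing a shared cursor instead of re-scanning.
function fillRange(events, defaultValue, times) {
  const out = [];
  let i = 0; // cursor into the sorted event list
  for (const t of times) {
    // Advance the cursor past every event at or before t.
    while (i < events.length && events[i].time <= t) {
      i++;
    }
    const prev = i > 0 ? events[i - 1] : null;
    const next = i < events.length ? events[i] : null;
    if (!prev) {
      // Before the first event: the param's default value.
      out.push(defaultValue);
    } else if (next && next.type === "linearRampToValueAtTime") {
      const f = (t - prev.time) / (next.time - prev.time);
      out.push(prev.value + (next.value - prev.value) * f);
    } else {
      out.push(prev.value);
    }
  }
  return out;
}
```

Because the cursor only moves forward, filling N samples against M events is O(N + M) rather than O(N × M).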
Look into `dom/media/webaudio/AudioEventTimeline.h`, and see about pulling this into a JS module for the client side.

Potential complexities:
* events added on curves
* order of adding events

Look at tests in dom/media/webaudio/compiledtest/TestAudioEventTimeline.cpp
Duplicate of this bug: 1007876
Summary: Add a visualizer for AudioParam automation timeline → Expose automation events on the server for the web audio editor
Assignee: nobody → jsantell
Attached patch 1056458-automation-server.patch (obsolete) — Splinter Review
Server-side components for rendering automation data. I need to play with CPOW stuff to see if I can get the tool's API for adding events to also trigger events exposed via the WebAudioActor.
Attached patch 1056458-automation-server.patch (obsolete) — Splinter Review
Fixed the issue where the actor methods (used in tests) were not spawning the call-watcher proxies, which also streamlines the code a bit. There's lots of content/e10s/scope hacking, so maybe there's a better solution for this as well.

This is using a module hacked on over the work week: https://github.com/jsantell/web-audio-automation-timeline

It should be bundled up sufficiently, and it seems like a cleaner import than our other modules, but obviously I'm looking for thoughts here.
Attachment #8533518 - Attachment is obsolete: true
Attachment #8534039 - Flags: review?(vporof)
Status: NEW → ASSIGNED
I forgot to mention: the automation timeline module is a JS reimplementation of our C++ web audio implementation (`dom/media/webaudio/AudioEventTimeline.h`), which isn't exposed anywhere and is recalculated on every frame tick. The tests have been ported over in the module as well.
Comment on attachment 8534039 [details] [diff] [review]
1056458-automation-server.patch

Review of attachment 8534039 [details] [diff] [review]:
-----------------------------------------------------------------

LGTM, r+ if automation-timeline.js is reviewed and also passed secreview (since it's an external lib).

::: toolkit/devtools/server/actors/utils/automation-timeline.js
@@ +1,5 @@
> +/**
> + * web-audio-automation-timeline - 1.0.0
> + * https://github.com/jsantell/web-audio-automation-timeline
> + * MIT License, copyright (c) 2014 Jordan Santell
> + */

Is this reviewed anywhere? File a separate bug for landing this.
Attachment #8534039 - Flags: review?(vporof) → review+
Depends on: 1112875
Attached patch 1056458-automation-server.patch (obsolete) — Splinter Review
Removed automation-timeline from this patch -- created bug 1112875 for that. Will run through try once that lands.
Attachment #8534039 - Attachment is obsolete: true
Attachment #8538161 - Flags: review+
https://hg.mozilla.org/mozilla-central/rev/6bf93610034b
Status: ASSIGNED → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Whiteboard: [fixed-in-fx-team]
Target Milestone: --- → Firefox 37
Product: Firefox → DevTools
Product: DevTools → DevTools Graveyard