User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.28 (KHTML, like Gecko) Chrome/26.0.1396.1 Safari/537.28

Expected results:

Firefox should implement the Web MIDI API (http://webaudio.github.com/web-midi-api/). This W3C specification-in-progress enables the web platform to gain access to MIDI input controllers, such as keyboard, drum and DJ controllers. It also enables access to external and internal synthesizers, such as the built-in software synthesizers on Windows and Mac OS.

MIDI connections are extremely common in musical applications, and have become increasingly popular on mobile and tablet devices as well (MIDI is supported by iOS via CoreMIDI, and the iOS App Store has >400 MIDI applications; this has become so popular that many MIDI keyboard controllers are available with iPad slots).

Support for Web MIDI, along with Web Audio, enables many music production scenarios, from notation entry and playback a la Noteflight.com to live synthesizers (e.g. http://webaudiodemos.appspot.com/midi-synth/index.html, as well as numerous iOS synthesizers from names like Roland, Yamaha and Korg), as well as providing programming interfaces to hardware synthesizers (e.g. Roland's JP Synth Editor for iPad, used with their Jupiter-50 and Jupiter-80 synthesizers).
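As a sketch of what pages would gain: MIDI input arrives as raw status/data bytes (in `MIDIMessageEvent.data`), which an application decodes itself. A minimal, hypothetical illustration of that decoding (plain JS; the helper name and the commented-out wiring are not part of the spec):

```javascript
// Decode a raw MIDI channel-voice message (as delivered in
// MIDIMessageEvent.data) into a small descriptive object.
// Hypothetical helper for illustration only.
function decodeMidiMessage(bytes) {
  const status = bytes[0];
  const type = status & 0xf0;    // high nibble: message type
  const channel = status & 0x0f; // low nibble: MIDI channel (0-15)
  switch (type) {
    case 0x90: // note on (velocity 0 is conventionally a note off)
      return bytes[2] === 0
        ? { kind: "noteoff", channel, note: bytes[1], velocity: 0 }
        : { kind: "noteon", channel, note: bytes[1], velocity: bytes[2] };
    case 0x80: // note off
      return { kind: "noteoff", channel, note: bytes[1], velocity: bytes[2] };
    case 0xb0: // control change
      return { kind: "cc", channel, controller: bytes[1], value: bytes[2] };
    default:
      return { kind: "other", channel, raw: Array.from(bytes) };
  }
}

// Example: note-on for middle C, full velocity, channel 0
console.log(decodeMidiMessage([0x90, 60, 127]));

// In a page, this would be wired up roughly as:
//   navigator.requestMIDIAccess().then(access => {
//     for (const input of access.inputs.values())
//       input.onmidimessage = e => console.log(decodeMidiMessage(e.data));
//   });
```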
This is much lower priority than Web Audio (and many other things) IMHO.
Chris, thanks for your suggestion. Marking this as a possible future enhancement.
To me this sounds like a perfect target for maybe an intern to take on? Either way, I'm obviously in favor of this. I can also see the benefit of this API in things such as low-end mobile games (when you pay for transfer, MIDI starts sounding surprisingly good).
I'd personally prefer the intern to work on Web Audio first. It is of much higher priority.
I just wanted to voice official support for how important this feature is. The ability to add MIDI capability to web applications would open up massive new avenues for online music creation, collaboration, entertainment and education. As great as Web Audio is, the inability to flexibly control it is a big drawback. We've already started to build support for our hardware (the gTar) in Web MIDI through the use of the polyfill / Jazz-Soft MIDI plugin. The ability to build and deploy music-related applications coupled with Web MIDI is extremely exciting. If needed, we would be willing to provide support however possible and within our means! Thanks, Idan
(In reply to comment #5) > If needed we would be willing to provide support however possible and within > our means! Are you interested in submitting patches to implement Web MIDI in Gecko?
I'd definitely agree that this API should be implemented in the not too distant future. While it's great that Web Audio is coming along, having this API in parallel would make a world of difference in terms of professional audio on the web. Having gone the long route implementing a musical "keyboard" on the computer keyboard, I can honestly say I wish this was around when we made Jam With Chrome, and I'd love to use it in a lot of coming projects too.
This API is really important, as MIDI is the universal standard for real-time I/O controllers of many kinds (not just music performance; also lighting, mixing, and much more). Supporting MIDI is also the obvious complement to Web Audio API support, allowing the browser to support a full spectrum of applications with real-time musical input from the user.
As the software lead at a MIDI controller manufacturer, we have already charted out new ways of reaching customers, and of customers reaching each other, that use MIDI in the browser. The potential is overwhelming! There is also a great cultural benefit in preserving and distributing non-fixed media artworks, such as algorithmic composition, many (if not most) of which are composed with MIDI. There is no reason that the only device that can control a browser is a keyboard, and there are millions of devices out there that speak MIDI. Beyond music applications, the industrial applications for show control would explode. Every major OS has seen fit to include MIDI natively - as the browser becomes the default "operating system" for screens around the world, this established and important hardware interface needs to be part of that growth.
We all agree this should be implemented. The question is who is going to do it and whether other things that are _also_ important should be dropped to implement this. Again, if people are willing to step up and help implement that would probably help a lot in terms of finding people to work on this.
I wanted to cast my vote of support for getting the MIDI API implemented. Although many seem not to understand the importance of MIDI and its status as a long-standing standard for real-time I/O control, I personally cannot imagine us achieving the dream of the "browser being the platform" without both low-level audio AND MIDI support. If every OS on the market found the need to add MIDI functionality to their product, so should browser developers. From lighting and show control, to music education, to gaming, to controlling industrial machinery, MIDI has seemingly limitless applications. If you ponder MIDI's potential in the home automation market alone, it is mind-boggling.

Having said that, I will now tell you that I work for one of the biggest audio manufacturers in the world. Obviously, MIDI plays a huge part in most things we do... but that is probably not enough to have the API put in a browser. It should be in the browser because of the potential for developers to make applications that reach out of the browser and interact with the "real" world. Again, the applications are almost limitless.

My company is a month away from shipping the follow-up to one of the biggest-selling products in our history, with expected initial sales of thousands and thousands of units a month. This product will ship with a PC/Mac desktop editor/librarian app. We have spent a considerable amount of time and resources developing this app. Before the product even ships, we are already beginning the process of porting the app over to the browser, utilizing the polyfill / Jazz-Soft MIDI plugin. If history repeats itself, that would be potentially 500,000 users of the web app. Strong stuff.

In closing, I am not excited about this because of who I work for. I am excited about this because I spent 15 years as a web application developer and I am passionate about the potential of the platform. Scott Mire, Peavey Electronics
Guys, this is not the correct place to advocate for this. Nobody has said that we _don't_ want to support Web MIDI. It's just that there are a lot of more important things for us to work on.
Responding to a feature request by advocating lower priority, and then telling the community that they're out of place for advocating higher priority, is unsupportable. This absolutely is the place to advocate. +1 for Web MIDI. It's a higher priority than Web Audio for lots of people. Web MIDI would allow the genesis of a new class of web-based applications that leverage the vast ecosystem of MIDI-enabled technologies. Boris has the correct view that it's a matter of resourcing, and I get what Ehsan is saying. I want to be constructive here, so I'll see if I can drum up support from our OttawaJS meetup. We have a lot of good devs including a couple of Mozilla interns.
I'll give this a shot, though it won't be a full time initiative (I've still got lots of commitments on FxOS). I've done a bit of MIDI driver work (hi pete!) as well as WebAPI development (RIL and bluetooth for FxOS, gamepad oop still in development), so I'm capable of breaking lots of things. If there's anyone watching this that's interested in helping out (platform specific MIDI would be great, I've got some experience on mac and linux but haven't worked on MIDI code in windows), please let me know by commenting here. First task is probably dividing down into subtasks, getting WebAPI matching the current spec state and fleshing out HAL needs per platform.
Kyle, I started looking into the rtmidi open source cross platform midi classes. Didn't get as far as writing an xpcom component but I'm still interested and would like to help.
Things have changed enough that there probably won't be XPCOM involved (see the recent post on dev.platform about no more XPCOM for web-facing objects). Off the top of my head, I'm guessing we'll do all of the platform work back in the HAL, have an IPDL protocol in front of that for IPC, then the DOM side will be WebIDL backed by whatever language is easiest and fulfills needs (been a while since I've touched DOM bindings so I'm not sure if we can deal with events in JS impls yet). rtmidi looks neat, I'll check it out, though I doubt we'll be able to use it. Chrome's also got a Web MIDI implementation already, though it needs to be pref'd on. Not sure if that landed in WebKit before the fork or went with Blink, but it's something to look at. Gonna try to get a branch running off my gecko-dev repo ASAP, will update the bug once that's done. Will also break out the above tasks into blocking bugs for this so it'll be easier to figure out what's needed.
Events in JS impls work fine, both in terms of subclassing Event and subclassing EventTarget.
You will definitely want the delivery of scheduled outgoing messages and timestamping of incoming messages to not be done in JS, as those need to be tightly scheduled*; but the actual delivery of messages ends up being events or JS method anyway, so not critically bad. * i.e. the WebMIDIAPIShim polyfill can never be good enough, because it has to use setTimeout to deliver scheduled-in-the-future send() messages. However, the underlying NPAPI control is delivering messages with system-level timestamps, so the input part is okay in terms of timing.
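To make the timing constraint concrete: when a page calls `output.send(data, timestamp)`, the implementation has to turn a page-relative DOMHighResTimeStamp into a non-negative delay for the host scheduler, and both that conversion and the actual wait must happen off the JS thread (which is exactly what a setTimeout-based polyfill cannot guarantee). A minimal sketch of just the delay computation, as a hypothetical helper:

```javascript
// Translate a Web MIDI send() timestamp into a delay for the host
// scheduler. A timestamp in the past, or an omitted timestamp, means
// "send immediately". Hypothetical helper for illustration; in a real
// implementation this decision lives on the native side, not in JS.
function sendDelayMillis(timestamp, now) {
  if (timestamp === undefined) return 0; // no timestamp: send now
  return Math.max(0, timestamp - now);   // never a negative delay
}

console.log(sendDelayMillis(1500.0, 1000.0)); // due 500ms in the future
console.log(sendDelayMillis(800.0, 1000.0));  // already due: 0
```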
Yeah, that's the idea. The way it looks in my head currently is: OS API <-> HAL (C++) <-> IPDL <-> DOM API (JS) Everything that has to do with timing and communications to the world will happen in the HAL, as putting that before the JS/IPDL would be questionable in terms of scheduling. Doing the DOM work to get things out to content is mostly to keep things cleaner, though if it for some reasons causes massive issues it can always be rethought.
Right. The part you'll have to do yourself in that case is scheduling on Windows; the Windows MIDI API doesn't have a "send this message at xxx time" method, or a system-level timestamp on incoming messages. (CoreMIDI on OSX does; it has timestamps on MIDIPackets.) That will have to not be blocked on JS execution or GC, so it will need to happen on the left side of IPDL.
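Since the Windows MIDI API offers no send-at-time call, the backend there would need to keep its own timestamp-ordered outgoing queue, drained by a native timer. A rough sketch of that queue's ordering logic (a hypothetical structure for illustration, not the actual Gecko code, which would live in C++ on the far side of IPDL):

```javascript
// Sketch of a timestamp-ordered outgoing MIDI queue, as a Windows
// backend would need: messages are held until due, then delivered in
// timestamp order. Hypothetical structure for illustration only.
class MidiSendQueue {
  constructor() {
    this.pending = [];
  }
  push(bytes, timestamp) {
    this.pending.push({ bytes, timestamp });
    // keep the earliest-due message at the front
    this.pending.sort((a, b) => a.timestamp - b.timestamp);
  }
  // Return all messages due at or before `now`, in timestamp order.
  // A native timer thread would call this and hand the results to
  // midiOutShortMsg/midiOutLongMsg.
  popDue(now) {
    const due = [];
    while (this.pending.length && this.pending[0].timestamp <= now)
      due.push(this.pending.shift());
    return due;
  }
}

const q = new MidiSendQueue();
q.push([0x90, 60, 100], 30); // pushed out of order on purpose
q.push([0x80, 60, 0], 10);
console.log(q.popDue(15)); // only the timestamp-10 message is due
```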
If we want to ever use that in workers (and I think it's likely) we can't use js. Easy decision ;)
Well, there you go then. Access to Web MIDI from Workers is fairly high priority in the issues list - https://github.com/WebAudio/web-midi-api/issues/99.
Ok, not a prob, glad this got hashed out now then. Not like there's much happening up there anyways, was just trying to make the boilerplate simpler. :)
And so it begins. https://github.com/qdot/gecko-dev/tree/836897-webmidi Also posted intent to implement on dev-platform and dev-webapi. So we'll see how that goes. Right now I'm just getting WebIDL and DOM boilerplate in. Then it's on to IPDL (the protocol should be pretty simple), then a HAL template that'll send over test data so I can run mochitests. Once the HAL template is done, we can start creating platform specific versions to actually make things work. Going to break this out into multiple bugs, one for WebIDL and DOM implementation, and one for each platform for the HAL. Assuming this actually keeps momentum for more than like, today, I'll go ahead and schedule a sec review, though I imagine due to priority we'll sit on the bottom of the pile for a while.
A new draft was published last week: http://webaudio.github.io/web-midi-api/
Branch at https://github.com/qdot/gecko-dev/tree/836897-webmidi Is now up to date and back in motion. Right now, working on all of the DOM/IPC plumbing. Goal is to get that done with some sort of simple shim layer under it for testing/completeness sake, then starting work on the platform specific bits.
Just finished the permissions dialog implementation for this, filing for sec-review early since this is going to be an API used on desktop that hits hardware.
Now supported by Google Chrome http://arstechnica.com/information-technology/2015/05/google-chrome-gains-midi-support-enables-web-based-synths-and-daws/
Yup, we've been talking to the Chrome people, super happy to see them finally get it shipped. Things are moving along on this side, bug 1123516 (which is a requirement for the WebIDL needs for WebMIDI) is almost done, at which point hopefully there will be more motion over here.
So, since I haven't updated this bug in forever... Assuming anyone really wants to follow code, my dev branch is at https://github.com/qdot/gecko-hg/tree/836897-webmidi now. I've been working on this full time for a while now, and am almost done with the gecko specific parts. Most of the WebAPI is implemented (both in-process and e10s'd), mainly down to filling out unit tests and making sure we comply with the spec. Once that's done, I'll be putting the patch up for review (and apologizing in advance to reviewers, it's gonna be a big one. :/ ), and starting work on the platform specific portions.
Really excited to see this. Ping me if there's anything I can do to help in testing, etc.
(In reply to Chris Wilson from comment #32) > Really excited to see this. Ping me if there's anything I can do to help in > testing, etc. Chris, are there web platform tests?
I am writing the platform tests, but they are not ready to submit yet. Could you use the tests I am still working on? https://github.com/ryoyakawai/web-platform-tests/tree/submission/webmidi/webmidi
So, as some might've seen on twitter, I've got hardware interaction working on OS X. This includes device enumeration, and message sending/receiving. That said, this is still what I'd classify as a pre-alpha state, because there are quite a few hacks that need to be cleaned up before I can get this into reviews. The current todo list, off the top of my head:

- DOM
-- Packet scheduler (right now we ignore timestamps and just throw stuff over)
-- Clear() handling to spec
-- Fix port information handling/passing
-- Cleanup and comment fixing
-- Permissions string l10n
- Mac
-- Device notification handling (having CFRunLoop issues)
-- General cleanup
-- Figuring out licensing (I adapted this off of chromium's midi_manager_mac object)
- Windows
-- Not started
- Linux
-- Not started
- Android
-- Not started, nor high priority. If you're an android dev that wants to do this, please feel free to take the bug!

I've turned this bug into a metabug to track all of the MIDI work, so it can land separately and we can get testing happening ASAP.
This is awesome! I've been looking forward to this for some time. I am working on automated WebVR input latency testing (possibly to be used with Talos) and can see how the same is important for WebMIDI. The concept is to implement a puppet that can be scripted to fire events with precise timing and measure response time. Perhaps the same system could be used for WebMIDI as well.
Any update/target date on WebMIDI support? Talked to a number of manufacturers at NAMM last week who'd like to see it. :)
(In reply to Chris Wilson from comment #37) > Any update/target date on WebMIDI support? Talked to a number of > manufacturers at NAMM last week who'd like to see it. :) So far the project is Kyle's - I think there are a bunch of MIDI nerds in Mozilla ( myself included! ) that are excited about it and Kyle is making progress but it's unlikely to attract additional resources unless through community contribution.
So here's where things are:

- Backend DOM Stuff: Like 97% done. Was gonna try to get the timestamp stuff cleaned up, got stalled on that in November, then got yanked onto some other stuff. I'm going to try to get it rebased, slightly more commented, and into review in the next couple of weeks, barring any major interruptions. I suspect reviews are going to take a while though, it's a big ol' chunk of code.
- OSX Platform Specifics: Basically still working like it always did. With some of the work happening on moving the Gamepad API to our PBackground IPC mechanism, they solved one of the issues I'd run into with device hotplugging, so we need to integrate that.
- Linux/Windows Platform Specifics: Unfortunately no real movement here yet. I've talked to Adam Goode from the Blink team about this a bit, and he said he'd at least possibly be available to answer questions. We'll probably use the chromium implementation for reference, much as we did for the OS X backend.
You give me too much credit! I've not done much for Chromium/Blink outside of Linux WebMIDI. But happy to help as much as I can.
An update for anyone watching this, the DOM implementation of WebMIDI is now in for review on Bug 1201590. I would suspect this is going to take a few (or more) review rounds, and this doesn't involve anything platform specific, but it's a step in a direction!
Any news about that API?
Kiiiiiiiinda? Right now I'm busy on WebAuthn implementation, which was higher priority. However, it shares some aspects of this bug, in that the hardware access code will be rust, and it has a somewhat similar IPC setup. The architectural questions I have to answer to finish WebAuthn will at least give us a path for parts of WebMIDI implementation. That said, I still don't know when I'm gonna have time to work on it. :(
Now that version 52 has stripped out all NPAPI plugins, including mozplugger, this is more urgently needed. Firefox currently has no support for MIDI at all :(
We should be aware that MIDI support means two different things. This issue is about the Web MIDI API, which is an API to attach MIDI devices to the browser. It does however NOT cover playback of Standard MIDI files in any way. With mozplugger now dead, the playback of MIDI files has finally gone. Web sites may use http://www.midijs.net as an intermediate replacement to play MIDI files from their pages. However, a different issue should be opened (or does one already exist?) to support playback of MIDI files via <audio>.
After reading the description: >it also enables access to external and internal synthesizers, such as the built-in software synthesizers on Windows and Mac OS I was under the impression the WebMIDI API could be used to pipe MIDI data to an external synthesizer. Do you mean there'd be no interface, internal to firefox, to connect MIDI files embedded in webpages to the Web MIDI API? If firefox is going to support this API, it doesn't seem so far-fetched that a way could be found to do that.
>external synthesizer

It occurs to me that these terms are problematic; I don't mean a separate piece of hardware. This is how most programs using MIDI for audio work: one usually configures, within the program, a driver or port to send the MIDI data to, and then that driver, or something listening on that port, handles it from there. Windows and macOS ship standardized software synthesizers; on Linux there are several available, but timidity++ is probably the best target for a default. MIDI is not rendered as audio data inside the program with MIDI music; it is exported to the driver or port and rendered into audio outside of the program. I think this could be achieved by the Web MIDI API in Firefox, if there were an interface to connect embedded MIDI files to the Web MIDI API and Firefox had cross-platform configuration to pipe that data out to a synthesizer.
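If such an interface existed, the bridge would mostly be timestamp arithmetic: each Standard MIDI File event's delta time (in ticks) becomes an absolute timestamp handed to `output.send()`. A rough sketch under a fixed-tempo assumption (the function name and event shape are hypothetical):

```javascript
// Sketch of piping embedded Standard MIDI File data to a Web MIDI
// output: convert each event's delta time (ticks) into an absolute
// DOMHighResTimeStamp suitable for output.send(bytes, timestamp).
// Assumes a single fixed tempo; hypothetical helper for illustration.
function scheduleSmfEvents(events, ticksPerQuarter, microsPerQuarter, startTime) {
  const msPerTick = microsPerQuarter / ticksPerQuarter / 1000;
  let ticks = 0;
  return events.map(ev => {
    ticks += ev.deltaTicks; // delta times accumulate into absolute ticks
    return { bytes: ev.bytes, timestamp: startTime + ticks * msPerTick };
  });
}

// 120 BPM (500000 µs per quarter note), 480 ticks per quarter note:
const scheduled = scheduleSmfEvents(
  [ { deltaTicks: 0,   bytes: [0x90, 60, 100] },  // note on at t=0
    { deltaTicks: 480, bytes: [0x80, 60, 0] } ],  // note off one beat later
  480, 500000, 1000.0);
console.log(scheduled[1].timestamp); // ≈ 1500 (one beat = 500ms after start)

// In a page, the last step would be roughly:
//   for (const e of scheduled) output.send(e.bytes, e.timestamp);
```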