Jam on your MIDI keyboard in Angular
The Web MIDI API is an interesting tool. Even though it has been around for a while now, it is still only supported by Chrome. But that's not going to stop us from creating a playable synthesizer in Angular. It is time we took the Web Audio API to the next level!

Previously, we spoke about the declarative use of the Web Audio API in Angular. Programming music is fun and all, but how about actually playing it? There is a standard called MIDI: a messaging protocol for data exchange between electronic instruments, developed back in the 80s. And Chrome supports it natively. This means that if you have a synthesizer or a MIDI keyboard, you can hook it up to your computer and read what's played right in the browser. You can even control other hardware from your machine. Let's learn how to do it the Angular way.

There's not a lot of documentation about it other than the spec. You request access to MIDI devices from navigator and you receive all MIDI inputs and outputs in a Promise. Those inputs and outputs (also called ports) act like event targets. Communication is performed through MIDIMessageEvents, which carry Uint8Array data. Each message is 3 bytes at most. The first one is called the status byte. Each value has a particular role, such as a key press or a pitch bend. The second and third integers are called data bytes. In the case of a key press, the second byte tells us which key is pressed and the third one is the velocity (how loud the note is played). The full spec is available on the official MIDI website.

In Angular we handle events with Observables, so the first step in adopting the Web MIDI API is to convert it to RxJS. To subscribe to events, we first need to get the MIDIAccess object to reach all the inputs. As mentioned before, we request it from navigator and we get a Promise in response. Thankfully, RxJS works with Promises too. We can create an injection token using NAVIGATOR from the @ng-web-apis/common package, so we don't use global objects directly. Now that we have it, we can subscribe to all MIDI events.
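To make the byte layout described above concrete, here is a small framework-free sketch that decodes a note message. The `parseNoteMessage` helper and its return shape are my own illustration, not part of any library:

```typescript
// Decode a 3-byte MIDI note message into something readable.
// Status bytes 128–143 are noteOff, 144–159 are noteOn; the low
// nibble of the status byte (status % 16) is the channel (0–15).
interface NoteMessage {
  type: 'noteOn' | 'noteOff' | 'other';
  channel: number;
  note: number;
  velocity: number;
}

function parseNoteMessage(data: Uint8Array): NoteMessage {
  const [status, note = 0, velocity = 0] = data;
  const channel = status % 16;

  if (status >= 128 && status <= 143) {
    return {type: 'noteOff', channel, note, velocity};
  }

  if (status >= 144 && status <= 159) {
    // Some devices send noteOn with velocity 0 instead of noteOff
    return velocity === 0
      ? {type: 'noteOff', channel, note, velocity}
      : {type: 'noteOn', channel, note, velocity};
  }

  return {type: 'other', channel, note, velocity};
}

// 144 = noteOn on channel 0, key 60 (middle C), velocity 100
console.log(parseNoteMessage(new Uint8Array([144, 60, 100])));
```

In the real stream, `data` would come straight from a `MIDIMessageEvent`.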
We can create such an Observable in two ways. Since in this case there is not much setup required, a token will suffice; with a bit of extra code to handle Promise rejection, it gives us a subscription to all events. We can also extract a particular MIDI port from MIDIAccess in case we want to, say, send an outgoing message. Another token and a prepared provider let us do this with ease.

To work with our stream, we need to add some custom operators. After all, we shouldn't have to analyze raw event data every time to understand what we're dealing with. Operators can be roughly broken into two categories: monotype and mapping. With the first group, we can filter the stream down to events of interest, for example, only listening to played notes or volume sliders. The second group alters the elements for us, like dropping the rest of the event and delivering only the data array.

One such monotype operator lets us listen to messages only from a given channel (out of 16). The status byte is organized in groups of 16: 128–143 are noteOff messages for each of the 16 channels, 144–159 are noteOn, and so on. So if we divide the status byte by 16 and take the remainder, we end up with that message's channel. If we only care about played notes, we can write a similar operator, and then chain such operators to get the stream we need.

Time to put all this to work! With a little help from our Web Audio API library discussed in my previous article, we can create a nice-sounding synth with just a few directives. Then we need to feed it the played notes from the stream we've assembled, using the last code example as a starting point.
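The filtering logic behind those two operators boils down to a couple of predicates on the data array. Here is a framework-free sketch; the names `isFromChannel` and `isNoteMessage` are illustrative, not the library's actual API, and with RxJS each would be wrapped in a `filter` operator on the event stream:

```typescript
// A message belongs to a channel if its status byte modulo 16
// equals that channel (status bytes come in groups of 16).
function isFromChannel(channel: number): (data: Uint8Array) => boolean {
  return data => data[0] % 16 === channel;
}

// Played notes: noteOff (128–143) and noteOn (144–159) messages
function isNoteMessage(data: Uint8Array): boolean {
  return data[0] >= 128 && data[0] <= 159;
}

const messages = [
  new Uint8Array([144, 60, 100]), // noteOn, channel 0
  new Uint8Array([145, 62, 90]),  // noteOn, channel 1
  new Uint8Array([176, 7, 127]),  // control change (volume), channel 0
];

// Chaining the filters, as an RxJS pipe would:
const channelOneNotes = messages
  .filter(isFromChannel(1))
  .filter(isNoteMessage);

console.log(channelOneNotes.length); // 1
```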
To make the synthesizer polyphonic, we need to keep track of all played notes, so we will add scan to our chain. To alter the volume of held keys, and to avoid cutting the sound off abruptly when we release them, we will create a proper ADSR pipe (the previous article had a simplified version). With this pipe we can put together a fine synth right in the template. We iterate over the accumulated notes with the built-in keyvalue pipe, tracking items by the played key. Then we have two oscillators playing those frequencies and, at the end, a reverberation effect with ConvolverNode. It's a pretty basic setup and not a lot of code, but it gives us a playable instrument with rich sound. You can go ahead and give it a try in our interactive demo below.

In Angular, we are used to working with events through RxJS, and the Web MIDI API is not much different from regular events. With some architectural decisions and tokens, we managed to use MIDI in an Angular app. The solution we created is available as the @ng-web-apis/midi open-source package. It focuses mostly on receiving events, so if you see something missing, like a helper function or another operator, feel free to open an issue.

This library is part of a bigger project called Web APIs for Angular: an initiative to create lightweight, high-quality wrappers of native APIs for idiomatic use with Angular. So if you want to try the Payment Request API or need a declarative Intersection Observer, you are very welcome to browse all our releases so far.
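To recap the bookkeeping behind the polyphony: the scan step described earlier can be modeled as a pure reducer over incoming note messages. This is a framework-free sketch under my own naming (`trackNotes`, a Map from note number to velocity); in RxJS it would be passed to the scan operator:

```typescript
// Accumulate currently held notes: noteOn adds a key, noteOff removes it.
// The resulting Map (note number -> velocity) is what the template
// iterates over with the keyvalue pipe.
function trackNotes(
  notes: Map<number, number>,
  data: Uint8Array,
): Map<number, number> {
  const [status, note = 0, velocity = 0] = data;
  const next = new Map(notes);

  if (status >= 144 && status <= 159 && velocity > 0) {
    next.set(note, velocity); // noteOn
  } else if (
    (status >= 128 && status <= 143) ||
    (status >= 144 && status <= 159 && velocity === 0)
  ) {
    next.delete(note); // noteOff, including noteOn with zero velocity
  }

  return next;
}

// In RxJS this would be: scan(trackNotes, new Map<number, number>())
const played = [
  new Uint8Array([144, 60, 100]), // press middle C
  new Uint8Array([144, 64, 80]),  // press E
  new Uint8Array([128, 60, 0]),   // release middle C
].reduce(trackNotes, new Map<number, number>());

console.log([...played.keys()]); // [64]
```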