Ever seen one of those pianos that plays on its own? No? Well, if you haven't seen one, it's able to play on its own because of a roll of paper inside, which looks like this:
This one was on display when I visited the House on the Rock in 2016. The perforations in the roll determine which pneumatic components inside the piano activate each key (as well as other parts, like the pedals).
I'm not sure whether it was by coincidence or by design, but MIDI track views look very similar. A MIDI track view is a user interface designed for music producers, composers, and other musicians working with MIDI, and it is often how MIDI data is displayed in digital audio workstations.
It works the same way the piano roll does, just rotated 90 degrees: each note is displayed as a rectangle, placed by pitch on the y-axis and by the time it occurs on the x-axis (the mechanical roll above uses its vertical axis for time instead).
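In code, that mapping is simple: a note's pitch picks its row, and its start time and duration pick its horizontal position and width. A minimal sketch of the idea, where the `Note` type and the scaling constants are hypothetical (not AudioKit's actual API):

```swift
import CoreGraphics

// Hypothetical note model: pitch as a MIDI note number (0–127),
// start and duration measured in beats.
struct Note {
    let pitch: Int
    let start: Double
    let duration: Double
}

// Map a note to the rectangle a track view would draw for it.
// `beatWidth` is points per beat; `rowHeight` is points per semitone.
func rect(for note: Note,
          beatWidth: CGFloat = 40,
          rowHeight: CGFloat = 6) -> CGRect {
    CGRect(
        x: CGFloat(note.start) * beatWidth,        // time → x
        y: CGFloat(127 - note.pitch) * rowHeight,  // pitch → y (high notes on top)
        width: CGFloat(note.duration) * beatWidth,
        height: rowHeight
    )
}
```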
Back in late 2019 and early 2020, I was working on some tutorials using AudioKit,
and I wanted to make one about MIDI track views. However, I couldn't find a MIDI track view anywhere in the AudioKit codebase. That's
when I realized I was going to be the one to make it, which ultimately led to my first major contribution. I pushed the first version, implemented in UIKit, to the repo in late June 2020: https://github.com/AudioKit/AudioKit/commit/d1bb87f5b51908c23b0ef1e9e4813f8d36c459f3
I was also proud I managed to get it working on Mac Catalyst around the same time.
However, this release was not without its flaws. One was that the tempo was not yet correctly synced to the MIDI track playback. Another was that rendering was inefficient: a separate rectangle was drawn for every note.
Near the end of 2020, the UI components of AudioKit were moved into a new repository, AudioKitUI. Many of these components were being rewritten in SwiftUI, and it was suggested that I reimplement my existing version as a pure SwiftUI (no UIKit) MIDITrackView using Canvas.
I decided to use the .drawingGroup view modifier, which composites all of the separate views rendered within a SwiftUI view into a single offscreen image, similar to what Canvas does. This made the implementation much more performant because the view no longer had to manage a separate layer for each note's rectangle.
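Here's a simplified sketch of that idea (not the actual MIDITrackView source; the note type and scaling constants are made up for illustration): every note is still described as a SwiftUI `Rectangle`, but `.drawingGroup()` flattens the whole stack into one offscreen, Metal-rendered image:

```swift
import SwiftUI

// Hypothetical note model; pitch 0–127, times in beats.
struct TrackNote: Identifiable {
    let id = UUID()
    let pitch: Int
    let start, duration: Double
}

struct TrackView: View {
    let notes: [TrackNote]
    let beatWidth: CGFloat = 40   // points per beat
    let rowHeight: CGFloat = 6    // points per semitone

    var body: some View {
        ZStack(alignment: .topLeading) {
            ForEach(notes) { note in
                Rectangle()
                    .frame(width: CGFloat(note.duration) * beatWidth,
                           height: rowHeight)
                    .offset(x: CGFloat(note.start) * beatWidth,
                            y: CGFloat(127 - note.pitch) * rowHeight)
            }
        }
        // Composites all the note views into a single offscreen image
        // instead of keeping a separate layer per note.
        .drawingGroup()
    }
}
```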
Additionally, I decoupled the tempo from the view by making the view's playback position a separate parameter that can be passed into the view model (e.g., by a sequencer that publishes the current track position). This made things more reliable: the track bases its position on the passed-in sequencer position, which already accounts for the track tempo and any other MIDI automation events.
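The decoupling can be sketched like this (again, the type names here are hypothetical, not AudioKit's API): the sequencer owns tempo and publishes a position in beats, and the view simply maps whatever position it receives to an offset:

```swift
import SwiftUI
import Combine

// Hypothetical sequencer wrapper: it publishes the current track
// position in beats. Tempo and MIDI tempo/automation events are
// applied inside the sequencer, so the view never needs to know them.
final class SequencerModel: ObservableObject {
    @Published var playbackPosition: Double = 0  // in beats
}

struct PlayheadOverlay: View {
    @ObservedObject var sequencer: SequencerModel
    let beatWidth: CGFloat = 40  // points per beat

    var body: some View {
        Rectangle()
            .frame(width: 2)
            // The view only maps the published position to an x offset;
            // it has no notion of tempo itself.
            .offset(x: CGFloat(sequencer.playbackPosition) * beatWidth)
    }
}
```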
The result was a fluid experience across platforms.
Now, let's try to break it with some ridiculous MIDI files that would probably set any real player piano on fire (as happened here in a video from Mark Rober)!
The frame rate on these examples is hard to look at, but they taught me about the importance of scalability and its impact on user experience. The old MIDI track view would never have been able to play these tracks (let alone load them) because it would have been too busy trying to draw whatever rectangles it could. There's still lots of room for improvement; I could probably make it better with some sort of interpolation between zoom levels. The current code can be found here: https://github.com/AudioKit/MIDITrackView