Typebeat, checkpoint 0.1.0

Kofi Gumbs

I occasionally make music. And like many a programmer-turned-music-maker, my musical process often devolves into writing code to optimize various parts of my workflow. Typebeat is my latest contribution to that tradition. It’s a virtual groovebox that’s entirely keyboard-operated.

Edit (2022-03-19): I decided to make the source code public as well. Enjoy! github.com/kofigumbs/typebeat

You can play with a pre-alpha demo at typebeat.kofi.sexy. I think it would be fun to release a more substantial version someday, but that version would look much different than today’s. Before making that detour, I wanted to hit pause and reflect on how the project came to this point.

Typebeat draws inspiration from two computing environments that make me feel productive and creative. The first is live-coding, the performance art of writing code to generate media. I love watching live-coding shows because the performers typically insist on showing their work. The code is as much a part of the performance as the resulting visuals and audio. Live-coders use their input device as their instrument by creating programs that the computer then interprets. I think of Typebeat as starting with that idea, keyboard as instrument, and making the analogy more direct. Press N to play a drum kick. Press K to play a clap. Type NYKY to play a “boots and cats” house beat.
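Typebeat’s actual implementation isn’t shown here, but the core idea is easy to sketch. Below is a minimal, hypothetical version of that direct key-to-sample mapping, written in TypeScript against the browser’s Web Audio API; the key bindings and sample file names are invented to match the example above.

```typescript
// A minimal sketch of the keyboard-as-instrument idea (not Typebeat's code).
// Each key maps directly to one sample; pressing the key plays it immediately.
const context = new AudioContext();

// Hypothetical bindings and file names, mirroring the "boots and cats" example.
const keyToSample: Record<string, string> = {
  n: "kick.wav",
  y: "hat.wav",
  k: "clap.wav",
};

const buffers = new Map<string, AudioBuffer>();

async function loadSamples(): Promise<void> {
  for (const [key, url] of Object.entries(keyToSample)) {
    const response = await fetch(url);
    const data = await response.arrayBuffer();
    buffers.set(key, await context.decodeAudioData(data));
  }
}

document.addEventListener("keydown", (event) => {
  const buffer = buffers.get(event.key);
  if (!buffer) return;
  // Browsers suspend audio until a user gesture; the first keypress counts.
  if (context.state === "suspended") context.resume();
  // One-shot playback: buffer sources are single-use, so create one per trigger.
  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.start();
});

loadSamples();
```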

Of course, this sort of direct mapping doesn’t scale: there are more musical commands than there are keys on a keyboard. So Typebeat also takes inspiration from vi and incorporates a modal interface. In Audition Mode, Typebeat works as I mentioned above, with each key triggering a different sound. But in Note Mode, Typebeat transposes the active sound, letting you access notes in a 2-octave range. Send Mode lets you route your sound to different effect buses, and then Return Mode lets you configure what each of those effect buses actually does. As with vi, Typebeat is incredibly fast to navigate once learned.
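As a rough illustration of how this kind of modal dispatch can work (again a hypothetical sketch, not Typebeat’s source), a single keydown handler can defer to whichever mode is active:

```typescript
// A hypothetical sketch of modal key dispatch, in the spirit of Typebeat's
// Audition/Note/Send/Return modes. All names and key choices are invented.
type Mode = "audition" | "note" | "send" | "return";

let mode: Mode = "audition";

// Stub actions standing in for a real audio engine.
const engine = {
  trigger: (key: string) => console.log(`play the sample bound to ${key}`),
  transpose: (key: string) => console.log(`play the active sound at note ${key}`),
  send: (key: string) => console.log(`route the active sound to bus ${key}`),
  configure: (key: string) => console.log(`edit settings on bus ${key}`),
};

// The same physical key means something different in each mode.
const handlers: Record<Mode, (key: string) => void> = {
  audition: engine.trigger,
  note: engine.transpose,
  send: engine.send,
  return: engine.configure,
};

// Assume a few reserved keys switch modes; everything else is dispatched
// to the active mode's handler.
const modeKeys: Record<string, Mode> = {
  "1": "audition",
  "2": "note",
  "3": "send",
  "4": "return",
};

document.addEventListener("keydown", (event) => {
  const next = modeKeys[event.key];
  if (next) {
    mode = next;
  } else {
    handlers[mode](event.key);
  }
});
```

The appeal of this structure is that each mode stays small and independent while sharing one physical layout, which is much of what makes the vi analogy work.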

I’m calling this post a checkpoint and not a release since it’s not ready for general use. Focusing on the keyboard input device was a helpful design constraint, but I think I insisted on it too heavily. Sometimes, particularly when I don’t know exactly what I want to hear, playing with knobs and sliders helps me to discover new ideas. Nudging on-screen values with a keyboard does not produce that same effect for me. Similarly, some ideas are just difficult to fit onto a screen if I insist that everything must be viewed as if it were literally a keyboard. Moving forward, I’m planning to experiment more with analog input devices and make the workflow more focused.