Final project for COMP_SCI 497 (RTVF 376-0-20): Digital Musical Instrument Design | Northwestern University

Meet Headhunter: a head-tracking MIDI controller. It allows the user to control a MIDI-compatible instrument using… well, their head.
Headhunter utilizes the sensors already built into smartphones to track head movements, then translates that data into corresponding MIDI messages using Max. It is fully reprogrammable, meaning the user can easily customize Headhunter to map specific head wobbles to whatever MIDI parameter they want.
My inspiration for Headhunter comes from Keith Jarrett, one of my favorite jazz pianists. (Herbie Hancock also ranks high on my list, as hinted by the name Headhunter.)
Besides his constant and notorious moaning, Mr. Jarrett is known for his dramatic head movements while playing the piano.

These movements, often deemed distracting, inspired me to think: What if these “redundant” movements could actually control some aspect of the sound being played? Integrated into the musical performance, the head gymnastics become engaging for the audience, providing auditory feedback synchronized with the visuals. This also gives keyboard players significantly more freedom, enabling actions like bending notes or adjusting filters while both hands are busy playing chords, with no extra knobs or sliders required.
To bring this idea to life, I began with a proof of concept.
First, I researched head tracking and discovered OpenTrack, an open-source program that supports a variety of head-tracking inputs, including smartphones. This makes it accessible to musicians without the need to wear dedicated, sci-fi-looking headsets.
Next, I worked on converting the yaw, pitch, and roll data from OpenTrack into a format Max could recognize. I decided to use OpenTrack's built-in virtual joystick output, which encodes the head-tracking data as joystick axes that Max can then read.
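In Max, a joystick shows up through the hi (human interface) object, which reports each control as a two-element list of element ID and value. The sketch below shows how a js object might split those pairs into yaw, pitch, and roll streams; the element IDs are placeholders, since the actual IDs depend on how the vjoy device enumerates its axes on your system.

```javascript
// headhunter_axes.js -- sketch of a Max [js] object that splits the
// [hi] object's output (element-ID / value pairs) into yaw, pitch,
// and roll streams. The element IDs below are placeholders; check
// which IDs your vjoy device actually reports.

inlets = 1;
outlets = 3; // 0: yaw, 1: pitch, 2: roll

var YAW_ID = 0;   // assumed element ID for the yaw axis
var PITCH_ID = 1; // assumed element ID for the pitch axis
var ROLL_ID = 2;  // assumed element ID for the roll axis

function list(id, value) {
    // [hi] sends each update as a two-item list: element ID, value
    if (id === YAW_ID) {
        outlet(0, value);
    } else if (id === PITCH_ID) {
        outlet(1, value);
    } else if (id === ROLL_ID) {
        outlet(2, value);
    }
}
```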
The final step was figuring out how to properly map the three virtual joystick inputs to MIDI messages. These inputs are integers from 0-65535, whereas MIDI pitch-bend and continuous-control messages take values from 0-127. A simple linear mapping would not be practical: first, it is physically painful to turn one's head far enough to reach 0 or 65535, and second, if the tracking is too sensitive, it becomes impossible to stay in tune and accidental changes are inevitable. After extensive trial, error, and neck stretching, I created several dedicated JavaScript objects in Max to handle these mapping rules.
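The actual mapping objects ship with the patch below, but the core idea looks roughly like this: a dead zone around the joystick's center so small wobbles are ignored, and a deliberately narrow active range so a comfortable head turn already reaches the MIDI extremes. All constants here are illustrative, not the values used in the patch.

```javascript
// headhunter_map.js -- sketch of one mapping rule: a dead zone plus
// a limited active range, rescaled from the 0-65535 joystick range
// to 0-127 MIDI values. Constants are illustrative.

inlets = 1;
outlets = 1;

var CENTER = 32767;       // midpoint of the joystick range
var DEADZONE = 3000;      // wobbles within +/-3000 of center are ignored
var ACTIVE_RANGE = 12000; // full MIDI deflection well before the physical limit

function msg_int(raw) {
    var offset = raw - CENTER;

    // Inside the dead zone: pin to the neutral MIDI value.
    if (Math.abs(offset) < DEADZONE) {
        outlet(0, 64);
        return;
    }

    // Outside: rescale the remaining offset into 0-127, clamped, so a
    // comfortable head turn reaches the extremes without neck strain.
    var amount = Math.min((Math.abs(offset) - DEADZONE) / ACTIVE_RANGE, 1);
    var sign = offset > 0 ? 1 : -1;
    outlet(0, Math.round(63.5 + sign * amount * 63.5));
}
```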
Currently, changing hyperparameters in the mapping rule sets requires editing the JavaScript code directly. In the future, I aim to create user-friendly sliders or knobs in the Max patch for easier parameter tuning.
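One way to get there without rewriting the mapping logic: Max delivers any message whose selector matches a function name in a js object to that function, so sliders could drive handlers like these (names are hypothetical, added to the mapping sketch above):

```javascript
// Hypothetical additions to the mapping sketch above: Max routes a
// message like "deadzone 3000" to the function of the same name, so
// UI objects can retune the globals at runtime.

function deadzone(v) {
    DEADZONE = v; // global from the mapping sketch
}

function activerange(v) {
    ACTIVE_RANGE = v;
}
```

A slider connected through a [prepend deadzone] object would then update the dead zone live, which is essentially what the planned knobs would do.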
Additionally, combining all the separate software components into a single executable file would significantly improve the user experience.
Finally, I plan to replace the smartphone with a dedicated orientation sensor, making the headset lighter and eliminating the need to strap a phone to one’s forehead.
In this demo, I am playing a Rhodes Piano and an ARP Solina string ensemble. The volume of the ARP Solina is set to 0 at the beginning.
The yaw axis (shaking head “no”) is mapped to pitch bend, creating a vibrato/out-of-tune effect as I turn my head left and right.
The pitch axis (nodding “yes”) is mapped to the volume of the ARP Solina, so the synth string sound fades in and out as I look up and down.
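Wiring-wise, the demo routing amounts to something like the sketch below: the mapped 0-127 values feed a bendout object on the Rhodes channel and a CC 7 (volume) ctlout on the Solina channel. The channel and controller assignments are illustrative, not necessarily those of the actual patch.

```javascript
// Sketch of the demo routing inside a Max [js] object. Assumes the
// already-mapped 0-127 values arrive in two inlets; the channel
// numbers in the receiving objects are illustrative.

inlets = 2;  // 0: mapped yaw, 1: mapped pitch
outlets = 2; // 0: to [bendout 1], 1: to [ctlout 7 2]

function msg_int(v) {
    if (inlet === 0) {
        outlet(0, v); // yaw -> pitch bend (vibrato / detune on the Rhodes)
    } else {
        outlet(1, v); // pitch -> CC 7, fading the ARP Solina in and out
    }
}
```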
Connect the ctlout and bendout objects to the MIDI channel you created in Step 4. Select the vjoy device as your input, and click poll 10 to start the Max patch.

Download The Headhunter Max Patch and JavaScript Files