Final project for COMP_SCI 497 (RTVF 376-0-20): Digital Musical Instrument Design | Northwestern University
Headhunter is a head-tracking MIDI controller that lets users control a MIDI-compatible instrument with their head.
Headhunter uses the sensors in a smartphone to track head movements and translates the data into corresponding MIDI messages using Max. Users can easily customize Headhunter by remapping head movements to different MIDI messages.
My inspiration for Headhunter stems from Keith Jarrett, one of my favorite jazz pianists. (Herbie Hancock also ranks high on my list, as hinted by the name Headhunter.)
Besides his constant and notorious moaning, Mr. Jarrett is known for his dramatic head movements while playing the piano.
These movements, often deemed distracting, inspired me to think: What if these redundant movements could control some aspect of the sound being played? By integrating these movements into the musical performance, they could become engaging for the audience, providing auditory feedback synchronized with the visuals. This would also offer keyboard players more freedom, enabling actions like bending notes while playing two-hand chords and adjusting parameters without touching any knobs or sliders.
To bring this idea to life, I began with a proof of concept.
First, I researched head tracking and discovered OpenTrack, an open-source program. It supports various head-tracking inputs, including smartphones, making it accessible to musicians without the need for a dedicated head-tracking headset.
Next, I worked on converting the yaw, pitch, and roll data from OpenTrack into a format Max could recognize. I settled on OpenTrack's built-in virtual joystick output, which encodes the head-tracking information as a virtual joystick device that Max can then read.
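As a minimal sketch of how that stage might look on the Max side (an assumption on my part, not the exact patch): Max's hi object can poll the virtual joystick and send element/value pairs into a js object, which then routes each axis to its own outlet. The element ids below are placeholders and would need to be verified against the actual device, for example with a print object.

```javascript
// Sketch: routing hi-object (element id, value) pairs to per-axis outlets.
// The axis ids are hypothetical; check them against your vjoy device.
inlets = 1;
outlets = 3;

var YAW_ID = 0;   // hypothetical element id for the yaw axis
var PITCH_ID = 1; // hypothetical element id for the pitch axis
var ROLL_ID = 2;  // hypothetical element id for the roll axis

function list(id, value) {
    if (id === YAW_ID) outlet(0, value);        // yaw -> left outlet
    else if (id === PITCH_ID) outlet(1, value); // pitch -> middle outlet
    else if (id === ROLL_ID) outlet(2, value);  // roll -> right outlet
}
```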
The final step was to figure out how to properly map the three virtual joystick axes to MIDI messages. The joystick inputs are integers from 0 to 65535, while the pitch bend and continuous controller messages sent from Max take values from 0 to 127. A simple linear mapping wouldn't do: it is very hard to move one's head far enough to reach both 0 and 65535, and if the tracking is too sensitive, it becomes hard to stay in tune and accidental changes can occur. After extensive trial and error, I created several dedicated JavaScript objects in Max to handle these more complex mapping rules.
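The snippet below is a minimal sketch of the kind of mapping rule involved, not the code shipped with the patch: it adds a dead zone around the resting head position to suppress jitter, and clamps the output so the player never has to reach the axis extremes. The CENTER, DEAD_ZONE, and RANGE values are illustrative assumptions.

```javascript
// headmap.js -- illustrative mapping from a vjoy axis (0-65535)
// to a MIDI value (0-127). All parameter values are assumptions.
inlets = 1;
outlets = 1;

var CENTER = 32768;   // axis value at the resting head position
var DEAD_ZONE = 4000; // ignore small movements around center
var RANGE = 16000;    // movement span mapped onto the full MIDI range

function msg_int(v) {
    var offset = v - CENTER;
    // Inside the dead zone, hold the neutral MIDI value (64)
    // so tiny head movements don't detune the note.
    if (Math.abs(offset) < DEAD_ZONE) {
        outlet(0, 64);
        return;
    }
    // Map the remaining span linearly onto 0-127 and clamp,
    // so full deflection is reachable with a comfortable turn.
    var sign = offset > 0 ? 1 : -1;
    var scaled = sign * (Math.abs(offset) - DEAD_ZONE) / RANGE;
    var midi = Math.round(64 + scaled * 63);
    outlet(0, Math.max(0, Math.min(127, midi)));
}
```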
Currently, changing the parameters of the mapping rules requires editing the JavaScript code. In the future, I aim to add sliders or knobs to the Max patch for easier parameter tuning.
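One possible intermediate step (a sketch of an approach, not part of the current patch): Max's js object dispatches an incoming message to a function of the same name, so a hypothetical handler like the one below would let a message box, and eventually a slider feeding it, retune a parameter live.

```javascript
// Hypothetical handler: sending the message "deadzone 3000" to the
// js object would update the parameter without editing the file.
function deadzone(v) {
    DEAD_ZONE = v;
    post("dead zone set to " + DEAD_ZONE + "\n");
}
```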
Additionally, combining all the separate software into a single executable would make Headhunter significantly more user-friendly.
Finally, I plan to replace the smartphone with a dedicated orientation sensor, making the headset lighter and eliminating the need for a smartphone app.
In this demo, I am playing a Rhodes and an ARP Solina string ensemble. The volume of the ARP Solina is set to 0 at the beginning.
The yaw axis is mapped to pitch bend, creating a vibrato/out-of-tune effect as I turn my head left and right.
The pitch axis is mapped to the volume of the ARP Solina, so the synth string sound fades in and out as I look up and down.
Connect the ctlout and bendout objects to the MIDI channel you created in Step 4. Select vjoy device as your input, and click poll 10 to start the Max patch.

Download the Headhunter Max Patch and JavaScript Files