A project at the University of Oregon Cognitive Modeling and Eye Tracking Lab that connects an eye tracker to a multimedia performance environment to create computer music and interactive art based on eye movements.
Project Personnel: Anthony Hornof, Troy Rogers, Tim Halverson, Jeffrey Stolet, and Linda Sato. Dr. Stolet and Troy Rogers are affiliated with Future Music Oregon.
Watch a QuickTime movie of a performance. (Try to listen with good stereo separation, as with headphones.)
The above photo shows Anthony Hornof practicing EyeMusic v1.0 a few days before performing the piece at NIME 2007 (New Interfaces for Musical Expression) in New York City.
The above photo is a composite of two photos, showing Troy Rogers rehearsing EyeMusic v1.0 at SEAMUS 2006, and a screenshot of the video image that was projected behind him during the performance. Troy is standing at the front of a small auditorium, facing the audience, with his chin in a chinrest, staring intently at a multi-computer console (including an eye tracker) which is in front of him. A video display in front of Troy shows the audience a close-up video image of his eye. Projected on the wall behind him is a simplified graphic image of an eye, surrounded by a swarm of red dots. EyeMusic v1.0 was composed by Rogers, Hornof, and Halverson in 2005.
EyeMusic is a project that explores how eye movements can be sonified, using sound to convey where a person is looking, and how this sonification can be used in real time to create music. An eye tracking device (the LC Technologies Eyegaze Communication System) reports where the performer is looking on the computer screen, along with other parameters pertaining to the status of the eyes. The eye tracker sends these data in real time to a computer program (written in Max/MSP/Jitter). The program generates and modifies sounds and images based on these data.
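The mapping from gaze data to sound can be sketched in a few lines. This is purely illustrative, not the actual Max/MSP/Jitter patch: the screen dimensions, pitch range, and parameter choices here are assumptions, shown only to make the idea of sonifying gaze coordinates concrete.

```python
def gaze_to_sound(x, y, screen_w=1280, screen_h=1024):
    """Map one gaze sample (in screen pixels) to sound-control values.

    Hypothetical mapping: horizontal position selects a MIDI-style
    pitch across a two-octave range (C3..C5), and vertical position
    controls amplitude (top of screen = loudest).
    """
    # Clamp the sample to the screen, since trackers can report
    # slightly out-of-range coordinates.
    x = min(max(x, 0), screen_w)
    y = min(max(y, 0), screen_h)
    pitch = 48 + round((x / screen_w) * 24)   # 48 = MIDI C3
    amplitude = 1.0 - (y / screen_h)          # 1.0 at top, 0.0 at bottom
    return pitch, amplitude

# A short stream of (x, y) samples, as an eye tracker might report:
for sample in [(0, 0), (640, 512), (1280, 1024)]:
    print(gaze_to_sound(*sample))
```

In a live setting the same mapping would run on each incoming tracker sample, with the outputs driving a synthesizer rather than being printed.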
While the eye is, in ordinary human usage, an organ of perception, EyeMusic allows for it to be a manipulator as well. EyeMusic creates an unusual feedback loop. The performer may be motivated to look at a physical location either to process it visually (the usual motivation for an eye movement) or to create a sound (a new motivation). These two motivations can work together to achieve perceptual-motor harmony and also to create music along the way. The two motivations can also generate some conflict, though, as when the gaze must move close to an object without looking directly at it, to set up a specific sonic or visual effect. Through it all, EyeMusic explores how the eyes can be used to directly perform a musical composition.
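The "look near it, but not at it" conflict described above can be made concrete with a simple proximity rule. The thresholds and state names below are hypothetical, not taken from the piece; the sketch only illustrates how a sonic effect might be armed by a nearby gaze yet cancelled by a direct look.

```python
import math

def proximity_effect(gaze, target, near=100.0, direct=30.0):
    """Classify one gaze sample relative to an on-screen target.

    gaze, target: (x, y) positions in pixels.
    near:   distance within which the effect is set up ("armed").
    direct: distance below which the gaze counts as a direct look,
            which cancels the effect.
    """
    d = math.dist(gaze, target)
    if d < direct:
        return "cancelled"   # looked right at the target
    if d < near:
        return "armed"       # close by, but not a direct look
    return "idle"            # too far away to matter
```

Under this rule the performer must steer the gaze into the ring between the two thresholds, which is exactly the kind of deliberate, non-perceptual eye movement the paragraph describes.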
An overview of the EyeMusic system components. Arrows indicate the flow of data.
Hornof, A., Rogers, T., Stolet, J., & Halverson, T. (2008). Bringing to Life the Musical Properties of the Eyes. Department of CIS Technical Report 08-05, University of Oregon. Ten pages.
Hornof, A. J., Rogers, T., & Halverson, T. (2007). EyeMusic: Performing live music and multimedia compositions with eye movements. Poster presented at NIME 2007: Conference on New Interfaces for Musical Expression. In the proceedings, pp. 299-300.
Hornof, A., & Sato, L. (2004). EyeMusic: Making Music with the Eyes. Proceedings of the 2004 Conference on New Interfaces for Musical Expression (NIME04), Hamamatsu, Japan, June 3-5, 185-188.
This photo shows a still image from an early exploratory EyeMusic piece by Hornof, Stolet, Sato, and Halverson (2004). The video shows, both visually and sonically, how people move their eyes while reading. The article being read itself describes, in words, how people move their eyes while reading.
Please contact Anthony Hornof for more information on EyeMusic.