A new University of Southern California computer system lets a user
"drive" a piece of music, using a wheel and foot controls. The
Expression Synthesis Project (ESP) interface, devised by a team led by
Elaine Chew of the USC Viterbi School of Engineering, could be in the
hands of consumers within two years.
Top: Grad student Jie Liu takes Brahms' Hungarian Dance No. 5 for a spin. Below: Team ESP, from left: Elaine Chew, Alexandre François, Liu (at wheel), and Aaron Yang.
Chew presented ESP on May 28 at the New Interfaces for Musical
Expression (NIME) 2005 conference at the University of British Columbia
in Vancouver, Canada.
Chew is an assistant professor in the Viterbi School's Daniel J.
Epstein Department of Industrial and Systems Engineering and an
accomplished professional pianist who maintains a regular schedule of
concert appearances. She says ESP "allows everyone a chance to
experience what it's like to perform. It lets them appreciate the
decisions made by a musician in interpreting the music."
ESP "attempts to provide a driving interface for musical expression,"
according to Chew's published description. "The premise of ESP is that
driving serves as an effective metaphor for expressive music
performance. Not everyone can play an instrument but almost anyone can
drive a car. By using a familiar interface, ESP aims to provide a
compelling metaphor for expressive performance so as to make high-level
expressive decisions accessible to non-experts."
Created by Chew, research assistant professor Alexandre R.J. François
of the Viterbi School's department of computer science, and graduate
students Jie Liu and Aaron Yang, ESP starts with a piece of music that
has been converted from the printed score to the Musical Instrument
Digital Interface (MIDI) format. MIDI is the standard control language
for driving musical synthesizers and other devices. François' Software
Architecture for Immersipresence framework, and the corresponding
open-source middleware he has developed since 2001, are key enabling
elements in the design and implementation of the system.
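As a rough illustration of the idea (this is not the ESP codebase, and it uses a simplified stand-in for the actual MIDI format), a performance rendered from MIDI is ultimately a timed stream of note events whose timing and loudness can be rescaled on the fly:

```python
# Simplified stand-in for a MIDI event stream: each event is
# (delta_seconds, note_number, velocity). Real MIDI encodes delta
# times in ticks and velocities 0-127; this sketch keeps only the idea.
events = [
    (0.0, 67, 80),   # G
    (0.5, 70, 80),   # B-flat
    (0.5, 74, 90),   # D
]

def render(events, tempo_factor=1.0, volume_factor=1.0):
    """Rescale a note-event stream, as live pedal input might.

    tempo_factor > 1 plays faster (deltas shrink); volume_factor
    scales velocity, clamped to the MIDI range 0-127.
    """
    out = []
    for delta, note, velocity in events:
        out.append((delta / tempo_factor,
                    note,
                    min(127, max(0, round(velocity * volume_factor)))))
    return out

# Twice as fast, somewhat louder:
fast_and_loud = render(events, tempo_factor=2.0, volume_factor=1.2)
```

The point of the sketch is only that a MIDI-derived representation exposes timing and velocity as numbers, which is what makes real-time expressive control of the kind ESP offers possible.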
The score used as the test case in the development of ESP is the
Hungarian Dance No. 5 in G minor by Johannes Brahms. The piece was
selected because it contains numerous moments of extreme speed-ups and
slow-downs.
To guide the musical performance, Chew and her colleagues used
information from the score to create a "road" that corresponds to the
structure of the piece. This is necessary, says Chew, because crucial
cues from the score and its analysis, necessary for an informed
performance, are not captured in the MIDI file.
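One way to picture the road-building step (a guessed-at sketch of the idea, not the project's actual method): derive a curvature value for each stretch of the piece from how sharply the intended tempo changes, so that tight bends appear where the score calls for big slow-downs or speed-ups:

```python
def road_curvature(tempo_targets):
    """Map per-section tempo targets (in BPM) to road curvature.

    Hypothetical sketch: curvature is the relative tempo change between
    consecutive sections, so halving the tempo yields a sharp 0.5 bend
    while a steady tempo yields a straight (0.0) stretch.
    """
    curves = []
    for prev, cur in zip(tempo_targets, tempo_targets[1:]):
        curves.append(abs(cur - prev) / prev)
    return curves

# A tempo plan with a sudden slow-down in the middle:
print(road_curvature([120, 120, 60, 90]))  # → [0.0, 0.5, 0.5]
```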
The group is building tools to automate the process of creating such
roads, applying artificial intelligence techniques to the analysis of
the score. "Having the road build itself will be the most difficult
part," says François.
The road's turns suggest to the driver when to slow down and speed up.
However, the ultimate decision on what to do at each turn is entirely
in the driver's hands (or feet). The foot pedals control both the
tempo and the volume of the music. Additionally, buttons mounted on the
wheel (see photo) act as the equivalent of the pedals on the piano,
making the notes either sustain or cut off crisply.
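The control mapping described above might be sketched like this (a hypothetical illustration, not ESP's code): pedal positions set tempo and volume scaling, and a wheel button toggles whether note releases take effect immediately or are held, like a piano's damper pedal:

```python
class PerformanceState:
    """Hypothetical controller state for a driving-style interface."""

    def __init__(self):
        self.tempo_factor = 1.0    # set by one foot pedal
        self.volume_factor = 1.0   # set by the other foot pedal
        self.sustain = False       # wheel button, like a damper pedal

    def set_pedals(self, accel, volume):
        # Pedal travel 0.0-1.0 mapped to a 0.5x-1.5x scaling range
        # (an assumed mapping, chosen for the sketch).
        self.tempo_factor = 0.5 + accel
        self.volume_factor = 0.5 + volume

    def note_off_allowed(self):
        # With sustain engaged, releases are deferred so notes ring on;
        # otherwise they cut off crisply.
        return not self.sustain

state = PerformanceState()
state.set_pedals(accel=0.75, volume=0.25)
state.sustain = True
```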
Chew has carried on the ESP research at the Viterbi School's Integrated
Media Systems Center, where she is Research Director for Human
Performance Engineering. She is the winner of an Early Career Award
from the National Science Foundation (NSF), which funded the ESP
research.
She hopes ESP will open new doors into music for non-musicians, a
chance "to try making and evaluating musical decisions themselves, to
see what it's like to perform."
Chew's presentation can be seen at: http://www-rcf.usc.edu/~echew/papers/NIME2005