
Engineering a Mantra

Illumin editor attends Viterbi professor's immersive sound staging of piano duet
Ilya Golosker
February 22, 2008 — Two pianos faced each other on stage, equidistant from three speakers mounted on the front wall. The auditorium's soft black walls camouflaged some of the acoustic equipment, but as I looked closely I could see more speakers surrounding the audience. After three days of wiring and tuning, nothing was there by accident.
Immersive sound inventor Chris Kyriakakis staged and wired the Feb. 13 Disney Hall performance of Mantra

Microphones inhabited the body of each piano, and a control table with computers, modulators, and other hardware kept three more engineers busy even before the performance started. Each pianist was surrounded on three sides by a set of crotales (small tuned cymbals), with a wood block perched on top of the piano and a MIDI controller with sliders and knobs within reach.

It was February 13, and I was at a Disney Hall performance of Karlheinz Stockhausen’s Mantra, staged by Viterbi School electrical engineering professor Chris Kyriakakis using the immersive audio technology he developed at USC. I came as part of several busloads of USC students, a diverse mix of ages and majors filling out the crowd at Redcat (the Roy & Edna Disney/CalArts Theater).

I did not know what to expect.

Stockhausen (1928-2007) wrote the duet for two pianos, each equipped with a synthesizer and two speakers. He commissioned a special ring modulator to transform the sound made by each piano, with the modulating frequency controlled by a single knob to change the character of the sound throughout the 90-minute piece; his goal was to bring the audience “inside the music.”
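The device Stockhausen specified was a ring modulator, and the idea behind it is simple enough to sketch: the piano signal is multiplied, sample by sample, by a sine wave whose frequency the pianist controls. The code below is a minimal illustration of that principle, not a reconstruction of the actual hardware used at the performance.

```python
import math

def ring_modulate(signal, mod_freq_hz, sample_rate=44100):
    """Multiply an input signal by a sine oscillator (ring modulation).

    When mod_freq_hz changes (Stockhausen's single knob), the timbre
    shifts: each input frequency f is replaced by the sum and
    difference tones f + mod_freq and f - mod_freq.
    """
    return [s * math.sin(2 * math.pi * mod_freq_hz * n / sample_rate)
            for n, s in enumerate(signal)]

# One second of a 440 Hz "piano" tone, ring-modulated at 100 Hz:
# the result carries energy at 340 Hz and 540 Hz instead of 440 Hz.
tone = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
out = ring_modulate(tone, 100.0)
```

Because the product of two sines is a sum of cosines at the sum and difference frequencies, even a pure tone comes out sounding metallic and bell-like, which is part of Mantra's characteristic sound world.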

The 10.2 surround sound technology used in the Redcat performance is a natural step toward realizing Stockhausen’s goal of spatial sound for his 1970 work. 10.2 uses 14 distinct channels, with 10 speakers and two subwoofers. Freed from the limited space on a film strip (which dictated the maximum number of channels in 5.1), the software that controls the movement of sound in 10.2 can localize it precisely and create a seamless envelopment around the audience.
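The article does not describe the panning algorithm the 10.2 software uses, but the basic building block for moving a source around a ring of speakers is well known: equal-power panning between adjacent channels. Here is a minimal sketch of that idea (an illustration, not the lab's actual implementation).

```python
import math

def equal_power_pan(position):
    """Gains for two adjacent speakers.

    position 0.0 = fully at speaker A, 1.0 = fully at speaker B.
    Equal-power panning keeps gA^2 + gB^2 = 1, so the perceived
    loudness stays constant as the source glides between speakers.
    """
    theta = position * math.pi / 2
    return math.cos(theta), math.sin(theta)

# Sweep a source from speaker A to speaker B in three steps;
# total signal power is constant at every position.
for pos in (0.0, 0.5, 1.0):
    gA, gB = equal_power_pan(pos)
    assert abs(gA**2 + gB**2 - 1.0) < 1e-9
```

Chaining this pairwise crossfade around all 10 speakers is what lets a sound appear to circle the room rather than jump from one loudspeaker to the next.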

Pianist Katherine Chi told the audience that the performance realized what Stockhausen himself was never able to execute. Without doubt, the late German composer would have enjoyed the show.

Stockhausen
Mantra's skeleton is a series of 13 notes: the 12 pitches of the conventional chromatic scale, ending on the same pitch it began on. Stockhausen, however, repeats this formula in various manifestations a total of 84 times, each section unique in style but familiar in tone to its neighbors. The two pianos alternate between seamless cooperation and jealous hostility, and seemingly everything in between.
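The structural constraint on the formula is tidy enough to state in code: 13 notes, the first 12 covering every pitch class of the chromatic scale exactly once, with the last note circling back to the first. The example row below is hypothetical, chosen only to satisfy the constraint; it is not Stockhausen's actual note series.

```python
def is_mantra_formula(notes):
    """Check the skeleton described above: 13 notes whose first 12
    cover all 12 chromatic pitch classes exactly once, and which end
    on the same pitch they began on.

    Notes are pitch classes 0-11 (0 = C, 1 = C#, ..., 11 = B).
    """
    return (len(notes) == 13
            and notes[0] == notes[-1]
            and sorted(notes[:12]) == list(range(12)))

# A hypothetical formula (NOT Stockhausen's actual row), starting
# and ending on A (pitch class 9):
example = [9, 11, 8, 7, 2, 0, 1, 3, 4, 5, 6, 10, 9]
assert is_mantra_formula(example)
```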

Pianist Katherine Chi
The sound was subtle at first, and I was surprised to find the pianos sounding like… pianos. But when I closed my eyes and took in the music, more and more subtlety came through. When a single note was played, staccato and forte, the initial impact was not as interesting as the aftereffect: the sound flowed through different frequencies and traveled between the channels and across the room, surrounding the audience like a standing wave or rolling like a marble on a concave floor. This is the synergy of art and technology, the audience inside the music.

While we sat watching and listening, Chi and her fellow pianist Hugh Hinton barely had a moment of peace. The formulaic style of Mantra left no room for improvisation, and each pianist alternated between playing the keyboard, striking the crotales and the wood block, and modulating the frequencies (also carefully dictated by Stockhausen).

I was particularly impressed by Chi and Hinton's ability to play while sometimes holding two (or even four) mallets ready to strike the crotales. At one point in the performance, both pianists stood up and started chanting, their voices captured and mutated. The modulated sounds from the pianos varied with each version of Mantra, from the quiet motorized whirr of a camera’s lens to the charging of a sci-fi laser gun. One of the final versions used Morse code to accompany the pianists.

I spent half the time watching the middle of the crowd, where a separate performance was going on: the three engineers monitoring all the channels, supervising the mixing, and changing frequencies on the fly.

Page from Mantra score
This extra effort showed the complexity and sophistication of 10.2 immersive sound, whose goal is to have the music move effortlessly around the audience. A programming language was adapted to control the modulation, and serious processing power was needed to compute and render the algorithms that so carefully control the sound and make it so lifelike.

Immersive sound is the result of a decade’s worth of research and development by Kyriakakis and Tomlinson Holman at the USC Immersive Audio Laboratory. Kyriakakis teaches in the Ming Hsieh Department of Electrical Engineering, and Holman is a professor at the School of Cinematic Arts who also has an electrical engineering appointment. Like Mantra itself, their partnership is a synthesis of art and engineering. Kyriakakis sees 10.2 as the next step in surround sound technology, eventually replacing the ubiquitous 5.1, tailoring sound to environments so every seat is the best in the house.

The crowd was definitely impressed by the technology as well as the performance, and a question-and-answer session followed. Everything came together at the right time for Mantra: a coordination of technology and art, and the cooperation of ambitious engineers and open-minded performers. As I rode back on the bus, I felt the privilege of having glimpsed what is to come, and how research and engineering travel from the laboratory to the stage.


By Ilya Golosker (M.E. 2009)
Associate Editor, Illumin Magazine