
Quantifying Human Behavior — One MoCap Data Point at a Time

Engineering and theatre faculty partner to develop a science of human behavior.

June 23, 2010 —

Two actors wrapped in motion sensors circle each other as engineering researchers stand at the perimeter of a USC Viterbi School of Engineering laboratory, taking notes.

It’s an unusual partnership between artists and engineers, and one the National Science Foundation (NSF) expects will yield more precise methods of modeling human behavior.

USC faculty Sharon Carnicke and Shri Narayanan prepare for the actors' next exercise in a laboratory at the USC Viterbi School of Engineering.
The NSF, under its Creative Information Technologies program, has awarded a three-year grant to faculty from the Viterbi School and the USC School of Theatre to study expressive human behavior through improvisation and motion capture technology.

“The Holy Grail is to be able to build technologies to mimic aspects of human behavior,” says Shri Narayanan, the project's principal investigator, the Andrew J. Viterbi Professor of Engineering, and professor of electrical engineering and computer science.

Armed with such models, scientists could build technologies to help autistic children, create advanced methods for recognizing human speech and visual behavior, and perhaps even quantify humor.

“The applications are limitless given the fundamental nature of the issue we’re addressing — understanding human behavior,” says Sharon Carnicke, a professor in the USC School of Theatre and Narayanan’s co-investigator on the project.

Viterbi Dean Yannis Yortsos notes that Narayanan and Carnicke’s partnership exemplifies the type of multidisciplinary alliance that yields results.

“Solving the world’s complex challenges will require a seamless blending of left-brain and right-brain skills,” says Yortsos. “The future impact of engineering and other disciplines depends on such partnerships.”

Theatre students Rose Leisner and Thomas Krottinger conduct a warmup exercise.

In the lab, Narayanan and Carnicke seek to collect digital representations of human emotion and behavior, one bit at a time. Working with acting students, they have collected hundreds of sequences for analysis and created a database they call the USC CreativeIT Database.

“It’s human data,” Narayanan explains. “What can we predict from these measurements? Can we develop a mathematical way of explaining patterns in human behavior?”

On one particular spring day, Carnicke supervises an exercise with two actors as Narayanan monitors a sophisticated motion capture (mocap) system, which collects data from tiny sensors embedded in the actors’ black spandex mocap suits.

“The resulting motion capture images make possible an intensely close analysis of what happens from moment to moment in the rehearsal hall,” says Carnicke. “It exposes the bones of the actors’ interactions.”

Carnicke has instructed her actors to improvise within specific constraints. She gives each exactly one action verb and one phrase they’re allowed to utter to achieve opposing objectives, using an improvisation technique called “active analysis,” invented by the Russian actor and director Konstantin Stanislavski.

Thomas Krottinger, outfitted in motion capture sensors.
Beyond these verbal constraints, the sky’s the limit on physical interaction and expression. By holding certain elements fixed, the researchers can cleanly measure the variables left free. And that’s where the real interest lies, says Carnicke.

“The words are fixed,” says Carnicke. “Everything else is based on manipulating expressive voice and body.”

Will Thomas reach for Rose’s arm? Will Rose eventually get frustrated and raise her voice? The conflict is easy to see, but how their interaction will play out is the real question.

As the actors move, the sensors record each movement. This carefully engineered data collection and annotation provide a method to quantify aspects of human communication and interaction.
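To make this concrete, here is a minimal sketch, in Python with NumPy, of the kind of feature such recordings could support: the frame-by-frame distance between two actors and the rate at which it changes. The marker layout, frame rate, and function names are illustrative assumptions, not details drawn from the USC CreativeIT Database.

import numpy as np

# Illustrative sketch only: assumes each actor's torso marker is a (T, 3)
# array of x/y/z positions sampled at a fixed frame rate. None of these
# names or parameters come from the actual USC CreativeIT Database.

FRAME_RATE_HZ = 120.0  # assumed mocap sampling rate

def interpersonal_distance(actor_a, actor_b):
    """Euclidean distance between two actors' torso markers, per frame."""
    return np.linalg.norm(actor_a - actor_b, axis=1)

def approach_velocity(distance, frame_rate=FRAME_RATE_HZ):
    """Rate of change of interpersonal distance (negative = approaching)."""
    return np.gradient(distance) * frame_rate

# Synthetic example: actor A walks toward a stationary actor B over 5 seconds.
t = np.arange(0, 5, 1 / FRAME_RATE_HZ)
actor_a = np.stack([3.0 - 0.5 * t, np.zeros_like(t), np.full_like(t, 1.2)], axis=1)
actor_b = np.stack([np.zeros_like(t), np.zeros_like(t), np.full_like(t, 1.2)], axis=1)

d = interpersonal_distance(actor_a, actor_b)
v = approach_velocity(d)
print(f"start {d[0]:.2f} m, end {d[-1]:.2f} m, mean closing speed {v.mean():.2f} m/s")

A negative mean velocity here would mark an approach, the kind of moment-to-moment physical choice the researchers annotate when analyzing an improvisation.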

The mocap technology is the same as that used on the sets of films such as Avatar and Shrek. But the purpose is different.

“Perhaps virtual humans or robots can eventually be designed to improvise on their own,” says Narayanan. “Not just deciding whether to do it, but also how to do it.”

In the first year, the professors used scenes from Shakespeare and Chekhov to draw out the actors’ improvisation. Next year, the researchers plan to use real-life scenarios, such as how people behave after waiting 90 minutes in line at the DMV.

Carnicke notes that her experience working with Narayanan has led to “real discoveries” about how active analysis works for actors.

“Putting right- and left-brain people together for practice-based research has offered avenues of experimentation that would not have come up had I stayed only within my discipline,” says Carnicke.

Viterbi School Ph.D. candidate Jeremy Chi-Chun Lee monitors the actors' digital representations on a computer screen.

Yet while the insights are real, Narayanan notes that quantitative understanding of human behavior is only a tool for scientific discovery. Using actors and acting techniques offers a solid baseline.

“But ultimately, what matters are not just the new insights they lead to about human behavior, but those that can be translated into useful applications,” says Narayanan. “These are ongoing research challenges.”

Potential applications span a number of domains that relate to behavior. They include addiction treatment, cognitive and behavioral therapy, customer care in business settings, and global security applications where socio-cultural behaviors come into play.

Carnicke will be presenting a keynote speech based on the project at the Australian Dramatic Studies Conference in Canberra on June 30, 2010.