
Intel Fellow Creating a System to Automatically Recognize Emotion

Emily Mower's research seeks to understand how machines can interpret emotion.
Jes Erberich
October 25, 2009 —


Intel Fellow Emily Mower's research evaluating human perception of synthetic character emotion may lead to robots with social intelligence.
Emily Mower, a graduate student in the USC Viterbi School's Ming Hsieh Department of Electrical Engineering, is pursuing research that may allow synthetic agents (robots and computer avatars) to understand human expressions of emotion.

Mower is an Intel Fellow, a recipient of one of the highly competitive fellowships awarded by Intel Corporation. She presented her most recent findings at the 10th Annual Conference of the International Speech Communication Association (Interspeech 2009), held in Brighton, UK.

Mower is pursuing a complex model that takes into account not only the emotional properties of the audio/visual material presented but also the range of potential variables in the human subject's own emotional state, attention level, context, and other factors.

Mower's latest paper, "Evaluating Evaluators: A Case Study in Understanding the Benefits and Pitfalls of Multi-Evaluator Modeling," addresses the difficulty of extracting meaningful conclusions from the exhaustive number of variables that can affect a subject's emotional classification of audio/visual input.

She concluded from her data that the most effective research process may be to average the reported evaluations from multiple subjects, rather than trying to predict the perception of individuals.
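To make the averaging strategy concrete, here is a minimal sketch in Python (not taken from the paper itself); the evaluator names and the 1-to-5 valence/activation rating scales are hypothetical, and only the idea of combining multiple evaluators' reports into a single averaged target reflects the paper's conclusion.

```python
# Hypothetical illustration: combine several evaluators' emotion ratings for one
# audio/visual clip by averaging, rather than modeling each evaluator separately.
# The evaluator names and the 1-5 valence/activation scales are made up.
from statistics import mean

ratings = {
    "evaluator_1": {"valence": 2.0, "activation": 4.0},
    "evaluator_2": {"valence": 3.0, "activation": 5.0},
    "evaluator_3": {"valence": 2.5, "activation": 4.5},
}

def average_ratings(ratings):
    """Average each emotion dimension across all evaluators."""
    dimensions = next(iter(ratings.values())).keys()
    return {d: mean(r[d] for r in ratings.values()) for d in dimensions}

print(average_ratings(ratings))
# {'valence': 2.5, 'activation': 4.5} -- a single averaged target, which is often
# easier to predict than any one evaluator's idiosyncratic ratings.
```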

Her work is a collaboration between two labs at USC: the Signal Analysis and Interpretation Lab (SAIL), directed by Shrikanth Narayanan, and the Interaction Lab, directed by Maja Matarić. Both Narayanan and Matarić are co-authors on the paper.

The Emotion Group, a subgroup of SAIL, focuses on research ranging from the analysis of affective speech and facial movement to the expression of emotion in text. The group is also developing models to explain the dynamics of the communication patterns between individuals.

The Interaction Lab focuses on the development of assistive interactive robots. Its director, Maja J. Matarić, who also co-directs the Center for Robotics and Embedded Systems (CRES), coined the term Socially Assistive Robots to refer to robots designed to work with patients who have cognitive or physical disabilities. These robots serve as mechanical therapists, verbally encouraging and physically assisting individuals as they exercise weakened muscles or practice social skills.

Mower's research sits at the intersection of the two labs. Her work seeks to further the understanding of affective communication by examining how people make affective assessments. This knowledge can then be applied to increase the social intelligence of both robots and synthetic characters.

Her thesis title is "Creating Human-Centric Expressive Interfaces: Linking Perceptual Evaluations to the Engineering Design of Synthetic Multimodal Communication." She was also a co-author of another emotion-recognition paper presented at Interspeech, "Emotion recognition using a hierarchical binary decision tree approach" (Chi-Chun Lee et al., 2009), which won the conference's Emotion Challenge.
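As a rough illustration of the hierarchical binary decision idea, the sketch below routes a clip through a cascade of yes/no decisions; the node order, the emotion classes, and the placeholder threshold rules are assumptions for illustration, not the trained classifiers used in the Lee et al. paper.

```python
# Hypothetical sketch of a hierarchical binary decision tree for emotion labels:
# each node makes one binary decision, and later nodes only see what earlier
# nodes pass on. The node order, classes, and threshold rules are illustrative,
# not those of the Lee et al. paper (which trains a classifier at each node).

def is_emotional(features):  # placeholder: neutral vs. emotional
    return features["activation"] > 0.5

def is_negative(features):   # placeholder: negative vs. positive valence
    return features["valence"] < 0.0

def is_angry(features):      # placeholder: anger vs. sadness
    return features["activation"] > 0.8

def classify_emotion(features):
    """Route one clip's features through the cascade of binary decisions."""
    if not is_emotional(features):
        return "neutral"
    if not is_negative(features):
        return "happy"
    return "angry" if is_angry(features) else "sad"

print(classify_emotion({"activation": 0.9, "valence": -0.4}))  # -> angry
```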

The Intel Fellowship covers tuition and a stipend, and includes a travel grant and a connection with an Intel technical leader working in the student's area of study. More information about the program and the other Fellows is available at http://blogs.intel.com/research/2009/09/and_the_winners_are.php

Before her Intel Fellowship, Mower received the National Science Foundation Graduate Research Fellowship (2004–2007) and the Herbert Kunzel Engineering Fellowship from USC (2007–2008).

Besides Intel, the NSF and the U.S. Army provided funding for the work presented in Brighton.