University of Southern California

Events Calendar




Events for May 04, 2023

  • PhD Defense - Su Lei

    Thu, May 04, 2023 @ 01:00 PM - 03:00 PM

    Thomas Lord Department of Computer Science

    University Calendar


    PhD Defense: Su Lei

    Committee: Jonathan Gratch (Chair), Laurent Itti, Shri Narayanan

    Abstract: In this dissertation, I advance automatic facial analysis methods and use them to yield fundamental insights into the source and function of facial expressions in face-to-face social interaction. Facial expressions play an essential role in shaping human social behavior. The ability to accurately recognize, interpret, and respond to emotional expressions is a hallmark of human social intelligence, and automating this ability is a key focus of computer science research. Machines that possess this skill could enhance the capabilities of human-machine interfaces, help diagnose social disorders, improve predictive models of human behavior, or serve as methodological tools in social science research. My dissertation focuses on this last application. Specifically, I examine two competing perspectives on the social meaning of facial expressions and show that automated methods can yield novel insights.

    In terms of technical innovation, I develop novel methods to interpret the meaning of facial expressions in terms of facial expressivity. Within computer science, facial expression analysis has been heavily influenced by the "basic emotion theory," which claims that expressions reflect the activation of a small number of discrete emotions (e.g., joy, hope, or fear). Thus, automatic emotion recognition methods seek to classify facial displays into these discrete categories to form insights into how an individual is interpreting a situation and what they will do next. However, more recent psychological findings have largely discredited this theory, highlighting that people show a wide range of idiosyncratic expressions in response to the same event. Motivated by this more recent research, I develop supervised machine learning models to automatically measure perceived expressivity from video data.

    In terms of theoretical innovation, I demonstrate how automatic expressivity recognition yields insight into alternative psychological theories on the nature of emotional expressions in social tasks by analyzing a large corpus of people engaged in the iterated prisoner's dilemma task. This is a canonical task used to test theories of social cognition and the function of facial expressions. First, I explore the appraisal perspective, which claims that expressions reflect an individual's appraisal of how actions within a social task relate to their goals. I find that by analyzing facial expressions produced by participants, a computer can reliably predict how actions in the task impact participants' appraisals (specifically, I predict whether the action was unexpected). Further, I show that automatic expressivity recognition dramatically improves the accuracy of these predictions over traditional emotion recognition. This lends support to the theory that expressions are, in a sense, directly caused by the social task.

    Second, I explore a contrasting perspective, interpersonal-dynamics theory, which argues that expressions are, in a sense, directly caused by the partner's expressions. This perspective emphasizes processes such as synchrony, mimicry, and contagion to explain moment-to-moment expressions. The appraisal perspective counters that any observed synchrony simply reflects a shared appraisal of social actions. I use automatic expressivity recognition to contrast these perspectives. Specifically, I analyze synchrony in two experimental conditions: a "still" condition where dyads see only a still image of their partner, and a "video" condition with real-time visual access to their partner's facial reactions. Using Dynamic Time Warping, I evaluate synchrony in both real and randomly paired dyads. Results reveal that synchrony exists even without visual cues, suggesting that shared appraisals contribute to synchrony, but that synchrony significantly increases when the partner is visible. This suggests that both perspectives must be integrated to best explain facial displays.

    In conclusion, both appraisal and interpersonal-dynamics perspectives reinforce the significance of emotional expressivity in interpreting facial displays and fostering social coordination in cooperative and competitive contexts. These insights offer valuable contributions to affective computing and the understanding of social interaction mechanisms. I also discuss potential limitations and future research directions for further exploring the complexities of social interactions.
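    The Dynamic Time Warping comparison described in the abstract can be illustrated with a short sketch. The Python below is not the dissertation's actual pipeline: it assumes each participant's facial expressivity has already been extracted as a one-dimensional per-frame time series (the traces here are random placeholder data), implements the classic dynamic-programming form of DTW, and contrasts real dyad pairings with randomly re-paired participants, mirroring the real-versus-random comparison above.

    import numpy as np

    def dtw_distance(a, b):
        """Dynamic Time Warping distance with absolute-difference cost."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # Extend the cheapest of the three admissible warping steps.
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    rng = np.random.default_rng(0)

    # Hypothetical expressivity traces: 20 dyads, 100 frames per participant.
    dyads = [(rng.standard_normal(100), rng.standard_normal(100)) for _ in range(20)]

    # Real pairings: lower DTW distance means higher synchrony.
    real = [dtw_distance(a, b) for a, b in dyads]

    # Random baseline: re-pair each participant with a shuffled partner.
    perm = rng.permutation(len(dyads))
    random_pairs = [dtw_distance(dyads[i][0], dyads[perm[i]][1]) for i in range(len(dyads))]

    print(f"mean DTW distance, real dyads:   {np.mean(real):.2f}")
    print(f"mean DTW distance, random pairs: {np.mean(random_pairs):.2f}")

    With real interaction data, the finding reported in the abstract corresponds to real-pairing distances being reliably lower than the random-pairing baseline, with the gap widening in the "video" condition.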

    Location: https://usc.zoom.us/j/6448851979?pwd=TThsRC96Vk9KZnVLV0RIc1g5NGVuQT09

    Audiences: Everyone Is Invited

    Contact: Asiroh Cham
