
On the Spectrum

Interactive tools capturing behavior data show promise in understanding autism
By: Alana Klein Prisco
November 14, 2012 —
Dr. Shrikanth Narayanan

The increasing prevalence of autism (one in 88 children has some form of it) and the growing uncertainty about what causes it have raised many questions about current methods of diagnosis. Shrikanth Narayanan, the Andrew J. Viterbi Professor of Engineering at the USC Viterbi School of Engineering and founder of the Signal Analysis and Interpretation Lab (SAIL) at USC, set out to answer some of these questions and discover new and reliable ways to help understand, diagnose and ultimately treat autism.

More broadly known as Autism Spectrum Disorders (ASD), this neurogenetic condition impairs one’s ability to communicate, respond to his or her surroundings and form relationships with others. Autism can be diagnosed as early as 18 months of age, and early intervention is key. But diagnosing this disorder isn’t always straightforward.

“It’s very easy for experts to diagnose autism when symptoms are presenting themselves, but there are a lot of challenges not just in determining whether a child is on the spectrum but also in stratifying their condition,” Narayanan says. Because it is a heterogeneous disorder (caused by multiple mutations from different origins) with great variability in the severity of symptoms from patient to patient, he says there isn’t a one-size-fits-all approach to understanding autism.

The standard practice for diagnosis currently relies heavily on observational methods. Narayanan and fellow researchers at the USC Viterbi School of Engineering are paving the path for a new quantitative approach to human behavior analysis that involves interactive technologies and analytics. “We’re trying to better understand human behavioral cues and the abstract aspects of human behavior that are central to how we diagnose and intervene,” Narayanan says.

He is referring to what is known as behavioral informatics, an emerging field that aims to quantify and interpret human interaction and communication through engineering and computing innovations. Specifically, Narayanan’s research focuses on behavioral signal processing, the use of technology and algorithms to quantitatively and objectively understand typical and atypical human behavior.

 

Electrodermal activity (EDA) is indicative of socio-cognitive load, while verbal response latency (VRL) is indicative of a child’s uncertainty, hesitation and stress levels. Behavioral signal processing shows that both child and parent EDA cues reflect underlying socio-cognitive levels, and that parents tend to synchronize with their children depending on the child’s ability to engage in the task.
[Ref. Theodora Chaspari, Chi-Chun Lee, and Shrikanth Narayanan. Interplay between verbal response latency and physiology of children with autism during ECA interactions. In Proceedings of InterSpeech, 2012.]
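As a rough illustration of the kind of analysis described in the caption above, the sketch below computes verbal response latencies from prompt and answer timestamps and a simple child-parent EDA synchrony score. The timestamps, EDA traces and 4 Hz sampling rate are hypothetical placeholders, not the study’s actual data or pipeline.

```python
# Illustrative sketch (not the researchers' code): verbal response latency
# (VRL) per question and a crude child-parent EDA synchrony measure.
import numpy as np

# Hypothetical per-question timestamps (seconds): when the ECA's prompt
# ends and when the child's spoken answer begins.
prompt_end   = np.array([12.4, 30.1, 55.8, 80.2])
answer_onset = np.array([13.0, 32.5, 56.9, 83.1])

# VRL: the gap between prompt offset and answer onset; longer gaps may
# signal hesitation or uncertainty.
vrl = answer_onset - prompt_end

# Hypothetical EDA traces (microsiemens) sampled at 4 Hz for one minute,
# for the child and the accompanying parent.
rng = np.random.default_rng(0)
child_eda  = 2.0 + 0.3 * rng.standard_normal(240)
parent_eda = 2.1 + 0.3 * rng.standard_normal(240)

# A simple synchrony summary: Pearson correlation between the two traces.
synchrony = np.corrcoef(child_eda, parent_eda)[0, 1]

print("VRL per question (s):", np.round(vrl, 2))
print("Child-parent EDA correlation:", round(synchrony, 3))
```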

“Behavioral signal processing promises to improve the efficiency and accuracy of human behavior analysis and modeling and create new tools for scientific discovery,” he says.

Through the use of technologies that pick up audio, visual and physiological signals, behavioral signal processing can provide a consistent and controlled evaluation of a child’s communicative and social-emotional abilities. It can also detect both verbal and non-verbal behavior, which is considered crucial for diagnosis.

“Children have trouble expressing their emotions so if we can access their inner affective state we can get better insight into their answers,” says Theodora Chaspari, a third-year doctoral student studying electrical engineering at USC’s Viterbi School of Engineering.

Technologies that can aid in diagnosis include embodied conversational agents (ECAs), which rely on human-computer interaction. ECAs are computer-generated characters that have many of the same properties that humans do. They are incorporated into what appears to the child as a computer game. While they may look like cartoon characters, they do more than entertain. They can actually elicit and analyze social interaction. For example, they can characterize a child’s vocal and facial expressions and “determine if a smile or laughter is a reaction to something joyful or whether these expressions are unintentional or awkward,” according to Narayanan.

EDA refers to changes measured at the surface of the skin that are caused by emotion, cognition or attention. It is captured through sensors worn on the child’s wrist and ankle, which pick up subtle changes in sweat. “We look to see if EDA levels go up when we ask questions that require a more elaborate answer or higher cognitive load,” says Chaspari.
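A minimal sketch of the comparison Chaspari describes, assuming hypothetical per-question EDA summaries rather than real sensor data:

```python
# Illustrative sketch (hypothetical numbers): compare mean EDA during
# questions that require an elaborate answer ("high load") against simple
# yes/no questions ("low load").
import numpy as np

# Hypothetical per-question mean EDA values (microsiemens), grouped by the
# cognitive demand of the question.
eda_low_load  = np.array([1.9, 2.0, 1.8, 2.1, 1.9])
eda_high_load = np.array([2.4, 2.6, 2.3, 2.7, 2.5])

# Difference of means, plus an effect-size style summary (Cohen's d).
diff = eda_high_load.mean() - eda_low_load.mean()
pooled_sd = np.sqrt((eda_high_load.var(ddof=1) + eda_low_load.var(ddof=1)) / 2)

print("Mean EDA rise for high-load questions (uS):", round(diff, 2))
print("Effect size (Cohen's d):", round(diff / pooled_sd, 2))
```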

“The goal is to get insight into behavior that we can’t get by observation only and to capture and decode what is hidden from us. That is the biggest challenge,” Narayanan says. He is quick to point out that behavioral informatics does not seek to replace clinical expertise, but to support it. “We collaborate with psychologists to enhance their decision making capabilities by offering details about underlying complex behavior,” he says.

For example, if an expert deems a patient’s speech atypical, behavioral informatics can offer quantitative details about voice quality, pitch and timing in speech that could explain this perceived atypicality. He says behavioral informatics can also analyze physiological signals, such as heart rate and EDA, to lend further insights into the psychological state of the child.
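For illustration, here is a short sketch of the kind of pitch and timing measures mentioned above, assuming the open-source librosa audio library and a hypothetical recording file; it is not the team’s actual toolchain.

```python
# Illustrative sketch: extract simple pitch and timing descriptors from a
# hypothetical recording "child_response.wav" using librosa.
import numpy as np
import librosa

y, sr = librosa.load("child_response.wav", sr=16000)

# Pitch (fundamental frequency) track via the pYIN estimator.
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
voiced_f0 = f0[voiced_flag]

# Timing: speech vs. pause segments found by an energy threshold.
speech_intervals = librosa.effects.split(y, top_db=30)  # sample indices
speech_time = sum(end - start for start, end in speech_intervals) / sr
total_time = len(y) / sr

print("Mean pitch (Hz):", round(float(np.nanmean(voiced_f0)), 1))
print("Pitch variability (Hz):", round(float(np.nanstd(voiced_f0)), 1))
print("Speaking-time fraction:", round(speech_time / total_time, 2))
```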

“Such collaboration is essential both to help define and contextualize the right set of research questions, and to be consumers of behavioral informatics,” he says. He adds that the analytical capabilities of behavioral informatics can allow for comparisons over time and the ability to track patients’ responses to specific treatments.

Narayanan and colleagues are busy spreading the word about these new approaches and tools. He recently led a national workshop on engineering and autism for researchers and engineers, hosted at USC with the support of the university’s Ming Hsieh Institute in Electrical Engineering. The event featured leading autism researchers, who spoke about advances in early detection and intervention, and engineering researchers, who highlighted advances in behavioral signal processing, computing and behavioral informatics.

“We are still in the very early stages with these methodologies, but their promise is very exciting,” Narayanan says.