Date of Award

Summer 2012

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Psychology

First Advisor

Diana L. Robins, Ph.D.

Second Advisor

Erin B. Tone, Ph.D.

Third Advisor

David J. Marcus, Ph.D.

Fourth Advisor

Robert D. Latzman, Ph.D.

Abstract

The accurate integration of audio-visual emotion cues is critical for social interactions and requires efficient processing of facial cues. Gaze behavior of typically developing (TD) individuals and individuals with autism spectrum disorders (ASD) was measured via eye-tracking during the perception of dynamic audio-visual emotion (DAVE) stimuli. This study provides information about the regions of the face sampled during an emotion perception task that is more complex than those used in previous studies, as it provides both bimodal (auditory and visual) and dynamic (biological motion) cues. Results indicated that the ASD group was less accurate at emotion detection and demonstrated a weaker visual-affective bias than TD individuals. Both groups displayed similar fixation patterns across regions during the perception of congruent audio-visual stimuli. However, between-group analyses revealed that fixation patterns across facial regions differed significantly when congruent and incongruent movies were analyzed together. In addition, fixation duration to critical regions (i.e., face, core, eyes) was negatively correlated with measures of ASD symptomatology and social impairment. Findings suggest weaknesses in the early integration of audio-visual information, automatic perception of emotion, and efficient detection of affective conflict in individuals with ASD. Implications for future research and social skills intervention programs are discussed.

DOI

https://doi.org/10.57709/3098238
