Date of Award
2009

Degree Name
Master of Arts (MA)
Diana L. Robins, PhD - Committee Chair
David A. Washburn, PhD - Committee Member
Erin B. Tone, PhD - Committee Member
The accurate integration of audio-visual emotion cues is critical for social interaction and requires efficient processing of facial cues. Gaze behavior of typically developing young adults was measured via eye tracking during the perception of dynamic audio-visual emotion (DAVE) stimuli. Participants identified basic emotions (angry, fearful, happy, neutral) and judged the congruence of facial expression and prosody. Incongruent videos produced longer reaction times, and emotion identification was consistent with the facial expression. Participants demonstrated a featural processing approach across all tasks, with a significant preference for the eyes. Hemispheric lateralization was indicated by preferential fixation on the left eye (happy, angry) or the right eye (fearful). Fixation patterns differed according to the facially expressed emotion; the pattern that emerged during fearful movies supports the importance of automatic threat processing. Finally, fixation patterns during the perception of incongruent movies varied according to task instructions.
McManus, Susan M., "Gaze Fixation during the Perception of Visual and Auditory Affective Cues." Thesis, Georgia State University, 2009.