Degree Name

Master of Arts (MA)



First Advisor

Diana L. Robins, PhD - Committee Chair

Second Advisor

David A. Washburn, PhD - Committee Member

Third Advisor

Erin B. Tone, PhD - Committee Member


Abstract

Accurate integration of audio-visual emotion cues is critical for social interaction and requires efficient processing of facial cues. Gaze behavior of typically developing young adults was measured via eye tracking during the perception of dynamic audio-visual emotion (DAVE) stimuli. Participants were able to identify basic emotions (angry, fearful, happy, neutral) and to determine the congruence of facial expression and prosody. Perception of incongruent videos resulted in increased reaction times and emotion identification consistent with the facial expression. Participants consistently demonstrated a featural processing approach across all tasks, with a significant preference for the eyes. Preferential fixation to the left eye (happy, angry) or right eye (fearful) indicated hemispheric lateralization. Fixation patterns differed according to the facially expressed emotion; the pattern that emerged during fearful videos supports the significance of automatic threat processing. Finally, fixation patterns during the perception of incongruent videos varied according to task instructions.


Included in

Psychology Commons