Audio-visual integration of emotional cues in song

William Forde Thompson*, Frank A. Russo, Lena Quinto

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    62 Citations (Scopus)

    Abstract

    We examined whether the facial expressions of performers influence the emotional connotations of sung materials, and whether attention is implicated in the audio-visual integration of affective cues. In Experiment 1, participants judged the emotional valence of audio-visual presentations of sung intervals. Performances were edited such that auditory and visual information conveyed congruent or incongruent affective connotations. In the single-task condition, participants judged the emotional connotation of sung intervals. In the dual-task condition, participants judged the emotional connotation of intervals while performing a secondary task. Judgements were influenced by both melodic cues and facial expressions, and these effects were undiminished by the secondary task. Experiment 2 involved identical conditions, but participants were instructed to base their judgements on auditory information alone. Again, facial expressions influenced judgements, and the effect was undiminished by the secondary task. The results suggest that visual aspects of music performance are automatically and preattentively registered and integrated with auditory cues.

    Original language: English
    Pages (from-to): 1457-1470
    Number of pages: 14
    Journal: Cognition and Emotion
    Volume: 22
    Issue number: 8
    DOIs
    Publication status: Published - 2008

