Audio-visual integration of emotional cues in song

William Forde Thompson, Frank A. Russo, Lena Quinto

Research output: Contribution to journal › Article

Abstract

We examined whether facial expressions of performers influence the emotional connotations of sung materials, and whether attention is implicated in audio-visual integration of affective cues. In Experiment 1, participants judged the emotional valence of audio-visual presentations of sung intervals. Performances were edited such that auditory and visual information conveyed congruent or incongruent affective connotations. In the single-task condition, participants judged the emotional connotation of sung intervals. In the dual-task condition, participants judged the emotional connotation of intervals while performing a secondary task. Judgements were influenced by melodic cues and facial expressions and the effects were undiminished by the secondary task. Experiment 2 involved identical conditions but participants were instructed to base judgements on auditory information alone. Again, facial expressions influenced judgements and the effect was undiminished by the secondary task. The results suggest that visual aspects of music performance are automatically and preattentively registered and integrated with auditory cues.

Language: English
Pages: 1457-1470
Number of pages: 14
Journal: Cognition and Emotion
Volume: 22
Issue number: 8
DOI: 10.1080/02699930701813974
State: Published - 2008


Cite this

Thompson, William Forde; Russo, Frank A.; Quinto, Lena. / Audio-visual integration of emotional cues in song. In: Cognition and Emotion. 2008; Vol. 22, No. 8, pp. 1457-1470.
@article{5e17bbc04dfa448f922311b5f4b59f14,
title = "Audio-visual integration of emotional cues in song",
author = "Thompson, {William Forde} and Russo, {Frank A.} and Lena Quinto",
year = "2008",
doi = "10.1080/02699930701813974",
language = "English",
volume = "22",
pages = "1457--1470",
journal = "Cognition and Emotion",
issn = "0269-9931",
publisher = "Taylor & Francis",
number = "8",
}

Audio-visual integration of emotional cues in song. / Thompson, William Forde; Russo, Frank A.; Quinto, Lena.

In: Cognition and Emotion, Vol. 22, No. 8, 2008, pp. 1457-1470.

TY - JOUR

T1 - Audio-visual integration of emotional cues in song

AU - Thompson, William Forde

AU - Russo, Frank A.

AU - Quinto, Lena

PY - 2008

Y1 - 2008

UR - http://www.scopus.com/inward/record.url?scp=56349116339&partnerID=8YFLogxK

U2 - 10.1080/02699930701813974

DO - 10.1080/02699930701813974

M3 - Article

VL - 22

SP - 1457

EP - 1470

JO - Cognition and Emotion

T2 - Cognition and Emotion

JF - Cognition and Emotion

SN - 0269-9931

IS - 8

ER -