Preattentive integration of visual and auditory dimensions of music

William Forde Thompson, Frank A. Russo, Lena Quinto

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › Research › peer-review

Abstract

We examined whether visual and auditory cues to affect in music are integrated preattentively, beyond conscious control. In Experiment 1, participants judged the affective valence of audio-visual recordings of sung intervals. Performers sang major and minor intervals. Each interval was synchronized with the facial expressions used to sing the same ‘happy’ or ‘sad’ interval (congruent condition) or a different interval (incongruent condition). Incongruent conditions involved audio and visual dimensions that conveyed conflicting affective connotations (e.g., positive audio, negative facial expression). In the single-task condition, participants judged the affective connotation of the audio-visual performances. In the dual-task condition, participants judged the affective connotation of the performances while performing a secondary task. If conscious attention were needed to integrate visual and auditory cues, then integration should be reduced in the dual-task condition. Participants were influenced by visual cues when making affective judgments, but the secondary task did not influence judgments, suggesting that attentional resources were not involved in audio-visual integration. In Experiment 2, the same paradigm was used, except that participants were instructed to base affect ratings on auditory information alone (i.e., to ignore facial expressions). Results corroborated those of Experiment 1, confirming that audio-visual integration of affective cues in music occurs preattentively.
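
The logic of the dual-task manipulation can be illustrated with a brief analysis sketch. This is not drawn from the paper: the rating scale, sample size, effect sizes, and all data below are hypothetical placeholders. It treats "integration" as the shift in affect ratings produced by incongruent facial expressions and asks whether that shift shrinks when attention is occupied by a secondary task.

# Illustrative sketch only (not the authors' analysis). Synthetic data are used
# purely to show how the congruence effect could be compared across task loads.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 24  # hypothetical sample size

rows = []
for p in range(n_participants):
    for task in ("single", "dual"):
        # Hypothetical mean affect ratings (1 = very sad ... 7 = very happy)
        # for intervals paired with congruent vs. incongruent facial expressions.
        congruent = rng.normal(5.5, 0.5)
        incongruent = rng.normal(4.5, 0.5)  # incongruent face pulls rating toward 'sad'
        rows.append({"participant": p, "task": task,
                     "congruence_effect": congruent - incongruent})

df = pd.DataFrame(rows)

# Integration index per task condition: mean congruent-minus-incongruent difference.
effects = df.pivot(index="participant", columns="task", values="congruence_effect")
print(effects.mean())

# If integration required attention, the effect should shrink under dual-task load;
# a paired comparison of the two indices tests that prediction.
t, pval = stats.ttest_rel(effects["single"], effects["dual"])
print(f"single vs. dual congruence effect: t = {t:.2f}, p = {pval:.3f}")

Under the preattentive account reported in the abstract, the congruence effect would be of comparable size in the single- and dual-task columns.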
Language: English
Title of host publication: Proceedings of the Second International Conference on Music and Gesture
Editors: Anthony Gritten, Elaine King
Place of Publication: Hull, UK
Publisher: GK Publishing
Pages: 217-221
Number of pages: 5
ISBN (Print): 0955332907
Publication status: Published - 2006
Externally published: Yes
Event: International Conference on Music and Gesture (2nd : 2006) - Manchester, UK
Duration: 20 Jul 2006 - 23 Jul 2006

Conference

Conference: International Conference on Music and Gesture (2nd : 2006)
City: Manchester, UK
Period: 20/07/06 - 23/07/06

Cite this

Thompson, W. F., Russo, F. A., & Quinto, L. (2006). Preattentive integration of visual and auditory dimensions of music. In A. Gritten & E. King (Eds.), Proceedings of the Second International Conference on Music and Gesture (pp. 217-221). Hull, UK: GK Publishing.