Detecting communicative intent in a computerised test of joint attention

Nathan Caruana*, Genevieve McArthur, Alexandra Woolgar, Jon Brock

*Corresponding author for this work

Research output: Contribution to journal › Article

8 Citations (Scopus)
11 Downloads (Pure)

Abstract

The successful navigation of social interactions depends on a range of cognitive faculties, including the ability to achieve joint attention with others to share information and experiences. We investigated the influence that intention monitoring processes have on gaze-following response times during joint attention. We employed a virtual reality task in which 16 healthy adults engaged in a collaborative game with a virtual partner to locate a target in a visual array. In the Search task, the virtual partner was programmed to engage in non-communicative gaze shifts in search of the target, establish eye contact, and then display a communicative gaze shift to guide the participant to the target. In the NoSearch task, the virtual partner simply established eye contact and then made a single communicative gaze shift towards the target (i.e., there were no non-communicative gaze shifts in search of the target). Thus, only the Search task required participants to monitor their partner's communicative intent before responding to joint attention bids. We found that gaze following was significantly slower in the Search task than the NoSearch task. However, the same effect on response times was not observed when participants completed non-social control versions of the Search and NoSearch tasks, in which the avatar's gaze was replaced by arrow cues. These data demonstrate that the intention monitoring processes involved in differentiating communicative and non-communicative gaze shifts during the Search task had a measurable influence on subsequent joint attention behaviour. The empirical and methodological implications of these findings for the fields of autism and social neuroscience are discussed.

Original language: English
Article number: e2899
Pages (from-to): 1-16
Number of pages: 16
Journal: PeerJ
Volume: 5
DOIs
Publication status: Published - 17 Jan 2017


Bibliographical note

Copyright the Author(s) 2017. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • Eye gaze
  • Eye-tracking
  • Joint attention
  • Mentalising
  • Social interaction
  • Virtual reality
