Human but not robotic gaze facilitates action prediction

Emmanuele Tidoni*, Henning Holle, Michele Scandola, Igor Schindler, Loron Hill, Emily S Cross

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review


    Abstract

    Do people ascribe intentions to humanoid robots as they would to humans or non-human-like animated objects? In six experiments, we compared people's ability to extract non-mentalistic (i.e., where an agent is looking) and mentalistic (i.e., what an agent is looking at; what an agent is going to do) information from gaze and directional cues performed by humans, human-like robots, and a non-human-like object. People were faster to infer the mental content of human agents than that of robotic agents. Furthermore, although the absence of differences in control conditions rules out the use of non-mentalizing strategies, the human-like appearance of non-human agents may engage mentalizing processes to solve the task. Overall, the results suggest that human-like robotic actions may be processed differently from both human and object behavior. These findings inform our understanding of how an object's physical features trigger mentalizing abilities, and of their relevance for human-robot interaction.

    Original language: English
    Article number: 104462
    Pages (from-to): 1-26
    Number of pages: 26
    Journal: iScience
    Volume: 25
    Issue number: 6
    DOIs
    Publication status: Published - 17 Jun 2022

    Bibliographical note

    Copyright the Author(s) 2022. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

    Keywords

    • cognitive neuroscience
    • robotics
    • research methodology social sciences

