Viewing and feeling touch modulates hand position for reaching

Regine Zopf*, Sandra Truong, Matthew Finkbeiner, Jason Friedman, Mark A. Williams

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    34 Citations (Scopus)

    Abstract

    Action requires knowledge of our body's location in space. Here we asked whether interactions with the external world prior to a reaching action influence how visual location information is used. We investigated whether the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body-location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm before participants performed a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and the experience of self-location.

    Original language: English
    Pages (from-to): 1287-1293
    Number of pages: 7
    Journal: Neuropsychologia
    Volume: 49
    Issue number: 5
    DOIs
    Publication status: Published - Apr 2011

    Keywords

    • parietal cortex
    • touch
    • action
    • human body
    • body location
    • rubber hand illusion

