I know your next move: action decisions in dyadic pick and place tasks

Diana Babajanyan, Gaurav Patil, Maurice Lamb, Rachel W. Kallen, Michael J. Richardson

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review


Abstract

Joint pick and place tasks occur in many interpersonal scenarios, such as when two people pick up and pass dishes. Previous studies have demonstrated that low-dimensional models can accurately capture the dynamics of pick and place motor behaviors in a controlled 2D environment. The current study models the dynamics of pick-up and pass decisions within a less restrictive virtual-reality-mediated 3D joint pick and place task. Findings indicate that reach-normalized distance measures, between participants and objects/targets, could accurately predict pick-up and pass decisions. Findings also reveal that participants took longer to pick up objects where division of labor boundaries were less obvious and tended to pass in locations maximizing the dyad’s efficiency. This study supports the notion that individuals are more likely to engage in interpersonal behavior when a task goal is perceived as difficult or unattainable (i.e., not afforded). Implications of findings for human-artificial agent interactions are discussed.
Original language: English
Title of host publication: CogSci2022
Subtitle of host publication: proceedings of the 44th Annual Conference of the Cognitive Science Society
Editors: Jennifer Culbertson, Andrew Perfors, Hugh Rabagliati, Veronica Ramenzoni
Place of Publication: Austin, Texas
Publisher: Cognitive Science Society
Pages: 563-570
Number of pages: 8
Publication status: Published - 2022
Event: Annual Meeting of the Cognitive Science Society (44th : 2022) - Toronto, Canada
Duration: 27 Jul 2022 - 30 Jul 2022

Publication series

Name: Proceedings of the Annual Meeting of the Cognitive Science Society
Volume: 44
ISSN (Electronic): 1069-7977

Conference

Conference: Annual Meeting of the Cognitive Science Society (44th : 2022)
Abbreviated title: CogSci 2022
Country/Territory: Canada
City: Toronto
Period: 27/07/22 - 30/07/22

Bibliographical note

Copyright the Author(s) 2022. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • affordances
  • joint action
  • pick and place tasks
  • decision making
  • virtual reality
