Temporal relation between speech and co-verbal iconic gestures in multimodal interface design

Jing Liu, Manolya Kavakli

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

A key issue in building multimodal interfaces that let users communicate with computer systems more naturally, particularly through speech and gestures, is how to relate speech to co-verbal gestures. This paper investigates how speech and iconic gestures relate to each other at the lexical level: we study the timing relationship between iconic hand gestures and the related words in speech, and we look for common hand gestures used in object descriptions. Our experimental results show that, at the lexical level, the overwhelming majority of hand gestures in object descriptions precede the related words in the users' speech by up to 2 seconds. The results also indicate that people use similar gestures when they describe the same parts of the objects.
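The reported 2-second lead of gestures over their related words suggests a simple fusion rule for a speech-and-gesture interface: pair a recognized gesture with the earliest related word that starts within roughly 2 seconds after the gesture onset. The Python sketch below only illustrates this idea; the event types, field names, and the align_gesture_to_word function are hypothetical and are not taken from the paper, while the 2-second threshold comes from the result reported above.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GestureEvent:
    label: str    # hypothetical gesture label, e.g. "circular-outline"
    onset: float  # gesture (stroke) onset time in seconds

@dataclass
class WordEvent:
    word: str     # recognized word from the speech stream
    onset: float  # word onset time in seconds

MAX_LEAD = 2.0    # gestures were observed to precede related words by up to 2 s

def align_gesture_to_word(gesture: GestureEvent,
                          words: List[WordEvent]) -> Optional[WordEvent]:
    # Return the earliest word whose onset falls within MAX_LEAD seconds
    # after the gesture onset; None if no word starts inside that window.
    candidates = [w for w in words if 0.0 <= w.onset - gesture.onset <= MAX_LEAD]
    return min(candidates, key=lambda w: w.onset) if candidates else None

# Usage: a gesture starting at t = 1.2 s is paired with "handle" at t = 2.0 s.
words = [WordEvent("handle", 2.0), WordEvent("is", 2.4), WordEvent("round", 4.0)]
print(align_gesture_to_word(GestureEvent("grasp-shape", 1.2), words))

In a real interface the candidate words would typically be restricted to content words produced by the speech recognizer, but the windowing logic stays the same.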
Original language: English
Title of host publication: GESPIN2011 Proceedings
Publisher: GESPIN Committee
Number of pages: 5
Publication status: Published - 2011
Event: Gesture and Speech in Interaction Conference - Bielefeld, Germany
Duration: 5 Sept 2011 - 7 Sept 2011

Conference

Conference: Gesture and Speech in Interaction Conference
City: Bielefeld, Germany
Period: 5/09/11 - 7/09/11

Keywords

  • speech
  • co-verbal iconic gesture
  • temporal relation
