Abstract
With the development of Multimodal Interfaces (MMI) in Human-Computer Interaction (HCI), there is increasing interest in applying this technology to multimodal web interaction. Multimodal web interfaces can provide end users with a natural, flexible and non-invasive interface that allows graphical, vocal and gestural interaction with the web. The integration of speech and gestures in an MMI framework is now a focus of research in this area. In order to combine speech and gestures in multimodal web interaction, it is essential to understand the correlations between speech and the associated gestures. This paper presents an empirical study of the correlations between speech and hand gestures from a cognitive perspective. The methodology used is video analysis, investigating the cognitive actions of speakers as they describe objects using speech and hand gestures. The speakers' cognitive actions are analyzed using a cognitive scheme and the protocol analysis method. Our initial findings suggest that speech is highly correlated with co-verbal hand gestures, both perceptually and semantically, regardless of the age, gender or background of the speakers, or the speed of speech and gesticulation.
| Original language | English |
| --- | --- |
| Title of host publication | WORLDCOMP 2013 |
| Subtitle of host publication | Proceedings of the 2013 World Congress in Computer Science, Computer Engineering, and Applied Computing |
| Place of publication | San Diego, CA |
| Publisher | WorldComp |
| Pages | 1-6 |
| Number of pages | 6 |
| Publication status | Published - 2013 |
| Event | World Congress in Computer Science, Computer Engineering, and Applied Computing, Las Vegas, NV, 22 Jul 2013 → 25 Jul 2013 |
Conference
| Conference | World Congress in Computer Science, Computer Engineering, and Applied Computing |
| --- | --- |
| City | Las Vegas, NV |
| Period | 22/07/13 → 25/07/13 |
Keywords
- multimodal web interaction
- speech
- co-verbal hand gestures
- cognitive actions