A Sensorized garment controlled virtual robotic wheelchair

Tauseef Gulrez, Alessandro Tognetti

Research output: Contribution to journal › Article › peer-review


Abstract

This paper presents the design and performance of a body-machine interface (BoMI) system in which a user controls a robotic 3D virtual wheelchair with signals derived from his/her shoulder and elbow movements. BoMI promotes the perspective that system users should no longer be operators of the engineering design but an embedded part of the functional design. This BoMI system provides real-time control of robotic devices based on user-specific dynamic body response signatures captured by a high-density 52-channel sensor shirt. The system not only gives access to the user's body signals, but also translates these signals from the user's body space to the virtual-reality device-control space. We have explored the efficiency of this BoMI system in a semi-cylindrical 3D virtual reality system. Experimental studies are conducted to demonstrate how this transformation of human body signals with multiple degrees of freedom controls a robotic wheelchair navigation task in a 3D virtual reality environment. We have also presented how machine learning can enhance the interface to adapt to the degrees of freedom of the human body by correcting the errors made by the user.
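The abstract describes translating the high-dimensional signals of the 52-channel sensor shirt into a low-dimensional device-control space. The paper's exact mapping is not given here, but a minimal sketch of one common BoMI approach, a PCA-derived linear projection from body signals to two wheelchair commands, might look like the Python below. The channel count is taken from the abstract; the function names, calibration procedure, and two-command (forward speed, turn rate) control space are illustrative assumptions, not the authors' method.

    import numpy as np

    # Assumed dimensions: 52 shirt channels (from the abstract) mapped to
    # 2 wheelchair commands (forward speed, turn rate) -- an assumption.
    N_CHANNELS = 52
    N_COMMANDS = 2

    def fit_body_to_control_map(calibration_data: np.ndarray):
        """Fit a linear map from body signals to control space via PCA.

        calibration_data: (n_samples, 52) array of sensor-shirt readings
        recorded while the user freely moves shoulders and elbows.
        Returns (mean, projection) so that commands = (x - mean) @ projection.
        """
        mean = calibration_data.mean(axis=0)
        centered = calibration_data - mean
        # Principal components capture the user's dominant movement directions.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        projection = vt[:N_COMMANDS].T  # (52, 2): top-2 components
        return mean, projection

    def body_to_wheelchair_command(sample: np.ndarray,
                                   mean: np.ndarray,
                                   projection: np.ndarray,
                                   gain: float = 1.0) -> np.ndarray:
        """Map one 52-channel sample to (forward_speed, turn_rate)."""
        return gain * (sample - mean) @ projection

    # Usage sketch: calibrate on (here, simulated) shirt data, then map a
    # new sample to a wheelchair command for the virtual environment.
    rng = np.random.default_rng(0)
    calibration = rng.normal(size=(500, N_CHANNELS))
    mean, proj = fit_body_to_control_map(calibration)
    command = body_to_wheelchair_command(rng.normal(size=N_CHANNELS), mean, proj)
    print("forward speed: %.3f, turn rate: %.3f" % (command[0], command[1]))

In such a scheme, the adaptive element mentioned in the abstract would correspond to re-estimating or correcting the projection over time as the user's errors are observed; how that is done in the paper is not specified here.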
Original language: English
Pages (from-to): 847-868
Number of pages: 22
Journal: Journal of Intelligent and Robotic Systems
Volume: 74
Issue number: 3-4
DOIs
Publication status: Published - Jun 2014

Keywords

  • Robotics
  • Virtual reality
  • Wearable sensors
