Precision position tracking in virtual reality environments using sensor networks

Tauseef Gulrez*, Manolya Kavakli

*Corresponding author for this work

Research output: Conference proceeding contribution (chapter in book/report/conference proceeding), peer-reviewed

11 Citations (Scopus)

Abstract

In an immersive, interactive virtual reality (VR) environment, a real human can be incorporated into a virtual 3D scene to navigate a robotic device within that scene. This has useful applications in rehabilitation. The non-destructive nature of VR makes it an ideal testbed for many applications and a prime candidate for use in rehabilitation robotics simulation. The key challenge is to accurately localise the movement of an object in reality and map it to the corresponding position in the 3D VR scene. To solve this localisation problem we have formed an online vision sensor network that tracks the object's real Euclidean position and feeds the information back to the VR scene. A precision position tracking (PPT) system has been installed to track the object. We have previously presented a solution to the sensor relevance establishment problem (in [10], [11]), in which the most relevant sensing action is selected from a group of sensors. In this paper we apply the same technique to the VR system. The problem can be broken down into two steps. In step one, the relevant sensor type is discovered using the IEEE 1451.4 Transducer Electronic Data Sheet (TEDS) description model. TEDS is used to discover the sensor types, their geographical locations, and additional information such as uncertainty measurement functions and the information fusion rules needed to fuse multi-sensor data. In step two, the most useful sensor information is selected using the Kullback-Leibler divergence (KLD) method. In this study we conduct two experiments that address the localisation problem. In the first experiment, a 3D VR environment is created using the real-time distributed robotics software Player/Stage/Gazebo, and a simulated PPT camera system is used to localise a simulated autonomous mobile robot within that environment.
In the second experiment, a real user is placed in a cave-like VR 3D environment and a real PPT camera system is used to localise the user's physical actions in reality. The physical actions of the real user are then used to control the robotic device in VR.
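The KLD-based selection in step two can be illustrated with a minimal sketch (not the authors' implementation; the sensor names, distributions, and helper functions here are hypothetical): each candidate sensor's predicted posterior belief over the target's position is compared against the current prior, and the sensor whose reading would shift the belief the most, i.e. the largest KL divergence from the prior, is selected as the most useful sensing action.

```python
import math

def kl_divergence(p, q):
    # D_KL(p || q) over discrete distributions;
    # assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def most_informative_sensor(prior, posteriors):
    # Pick the sensor whose predicted posterior diverges most from the
    # prior belief, i.e. the sensing action with the largest expected
    # information gain about the tracked object's position.
    return max(posteriors, key=lambda s: kl_divergence(posteriors[s], prior))

# Uniform prior belief over four candidate position cells (illustrative).
prior = [0.25, 0.25, 0.25, 0.25]

# Hypothetical per-sensor updated beliefs after a predicted observation.
posteriors = {
    "cam_left":  [0.30, 0.30, 0.20, 0.20],   # weakly informative
    "cam_front": [0.70, 0.10, 0.10, 0.10],   # strongly informative
}

print(most_informative_sensor(prior, posteriors))  # cam_front
```

Here `cam_front` wins because its posterior concentrates the belief sharply, giving a much larger divergence from the uniform prior than `cam_left`'s near-uniform update.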

Original language: English
Title of host publication: 2007 IEEE International Symposium on Industrial Electronics, ISIE 2007, Proceedings
Place of publication: Piscataway, N.J.
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1997-2003
Number of pages: 7
ISBN (Print): 1424407559, 9781424407552
Publication status: Published - 2007
Event: 2007 IEEE International Symposium on Industrial Electronics, ISIE 2007 - Caixanova - Vigo, Spain
Duration: 4 Jun 2007 - 7 Jun 2007

