Sensor fusion for autonomous indoor UAV navigation in confined spaces

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

In this paper, we address the challenge of navigating unknown indoor environments with autonomous aerial robots in confined spaces. The core of our system is the integration of key sensor technologies, including depth sensing from the ZED 2i camera, IMU data, and LiDAR measurements, facilitated by the Robot Operating System (ROS) and RTAB-Map. Through custom-designed experiments, we demonstrate the robustness and effectiveness of this approach. Our results show promising navigation accuracy, with errors as low as 0.4 m, and mapping quality characterized by a Root Mean Square Error (RMSE) of just 0.13 m. Notably, this performance is achieved while maintaining energy efficiency and balanced resource allocation, addressing a crucial concern in UAV applications. Flight tests further underscore the precision of our system in maintaining desired flight orientations, with an error rate of only 0.1%. This work represents a significant stride in the development of autonomous indoor UAV navigation systems, with potential applications in search and rescue, facility inspection, and environmental monitoring in GPS-denied indoor environments.
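The RMSE figure quoted in the abstract is a standard way to score an estimated trajectory or map against ground truth. As an illustrative sketch only (not the authors' evaluation code), the metric can be computed from paired 3-D positions like this; the function name and the toy trajectories are assumptions for the example:

```python
import math

def trajectory_rmse(estimated, ground_truth):
    """Root mean square error (meters) between paired 3-D positions.

    Each argument is a sequence of (x, y, z) tuples; positions are
    assumed to be already time-aligned and expressed in the same frame.
    """
    assert len(estimated) == len(ground_truth) and estimated
    squared_errors = [
        sum((e - g) ** 2 for e, g in zip(est, gt))  # squared Euclidean distance
        for est, gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Toy example: a constant 0.13 m offset along x yields an RMSE of 0.13 m.
est = [(0.13, 0.0, 0.0), (1.13, 0.0, 0.0), (2.13, 0.0, 0.0)]
gt = [(0.00, 0.0, 0.0), (1.00, 0.0, 0.0), (2.00, 0.0, 0.0)]
print(round(trajectory_rmse(est, gt), 2))  # → 0.13
```

In practice, SLAM evaluations also align the two trajectories (e.g. with a rigid-body transform) before computing the error; the sketch above assumes alignment has already been done.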

Original language: English
Title of host publication: 2023 16th International Conference on Sensing Technology (ICST)
Place of Publication: Piscataway, NJ
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 6
ISBN (Electronic): 9798350395341
ISBN (Print): 9798350395358
DOIs
Publication status: Published - 2023
Event: 16th International Conference on Sensing Technology, ICST 2023 - Hyderabad, India
Duration: 17 Dec 2023 - 20 Dec 2023

Conference

Conference: 16th International Conference on Sensing Technology, ICST 2023
Country/Territory: India
City: Hyderabad
Period: 17/12/23 - 20/12/23
