Multi-sensor fusion for efficient and robust UAV state estimation

Mahammad Irfan, Sagar Dalai, Kanishk Vishwakarma, Petar Trslic, James Riordan, Gerard Dooly

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

State estimation for Unmanned Aerial Vehicles (UAVs) is a fundamental aspect of a wide range of applications, including robot navigation, autonomous driving, virtual reality, and augmented reality (AR). The proposed research emphasizes the vital role of robust state estimation in ensuring the safe navigation of autonomous UAVs. In this paper, we develop an optimization-based odometry state estimation framework that is compatible with multiple sensor setups. We evaluate the system on an in-house integrated UAV platform outfitted with multiple sensors, including stereo cameras, an IMU, LiDAR sensors, and GPS-RTK for ground-truth comparison. The algorithm delivers robust and consistent UAV state estimation under various conditions, including illumination changes, feature-less or structure-less environments, and degraded Global Positioning System (GPS) signals or total signal loss, where single-sensor SLAM typically fails. The experimental findings demonstrate that the proposed method outperforms current state-of-the-art techniques.
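
The abstract itself contains no code, but as a minimal illustrative sketch of the idea behind optimization-based multi-sensor fusion, the snippet below fuses independent position estimates from several sensors by inverse-covariance weighting, which is the closed-form solution of a one-shot weighted least-squares problem. The `fuse_estimates` helper and all sensor readings are hypothetical and are not taken from the paper; the paper's framework addresses full odometry state estimation over time rather than a single position fix.

```python
# Illustrative sketch only (not the paper's implementation): fuse independent
# 3-D position estimates by inverse-covariance weighting, i.e. the closed-form
# solution of the weighted least-squares problem  min_x sum_i ||x - z_i||^2_{W_i}.
import numpy as np

def fuse_estimates(estimates, covariances):
    """Fuse independent 3-D position estimates via inverse-covariance weighting."""
    info = np.zeros((3, 3))          # accumulated information (weight) matrix
    info_vec = np.zeros(3)           # accumulated information vector
    for z, cov in zip(estimates, covariances):
        w = np.linalg.inv(cov)       # information matrix of this sensor reading
        info += w
        info_vec += w @ z
    fused_cov = np.linalg.inv(info)  # covariance of the fused estimate
    return fused_cov @ info_vec, fused_cov

# Hypothetical readings from visual odometry, LiDAR odometry, and GPS,
# each with its own uncertainty; the values are illustrative only.
vo    = (np.array([1.02, 0.98, 5.01]), np.diag([0.04, 0.04, 0.09]))
lidar = (np.array([1.00, 1.01, 4.97]), np.diag([0.01, 0.01, 0.02]))
gps   = (np.array([0.95, 1.05, 5.10]), np.diag([0.25, 0.25, 0.50]))

fused, fused_cov = fuse_estimates(
    [vo[0], lidar[0], gps[0]], [vo[1], lidar[1], gps[1]]
)
print("fused position:", fused)
```

In this toy setting the lower-uncertainty LiDAR estimate dominates the fused result, while the noisier GPS reading contributes little; optimization-based frameworks generalise the same weighting principle across whole trajectories and sensor modalities.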
Original language: English
Title of host publication: 2024 12th International Conference on Control, Mechatronics and Automation (ICCMA)
Publisher: IEEE
Pages: 35-40
Number of pages: 6
ISBN (Electronic): 9798331517519
ISBN (Print): 9798331517526, 9798331517502
DOIs
Publication status: Published - 20 Jan 2025

Publication series

Name: IEEE Conference Proceedings
Publisher: IEEE
ISSN (Print): 2837-5114
ISSN (Electronic): 2837-5149

Keywords

  • robotics
  • state-estimation
  • UAV
  • odometry
  • sensor fusion
  • SLAM
  • ROS
