Safe localization and navigation for UAV-based inspection in confined spaces

TROVÒ, GIACOMO
2021/2022

Abstract

In this thesis, a multi-sensor Visual Inertial Odometry (VIO) fusion algorithm is implemented to provide accurate and robust absolute localization of an Unmanned Aerial Vehicle (UAV) in a confined space. The work is part of a larger project called Inspectrone, which aims to automate inspections inside the ballast tanks of ships using UAVs. In this environment GPS cannot provide an accurate position, because the ship's hull shields the signal coming from the satellites. The developed solution therefore uses cameras to provide a relative positioning reference for the drone. Different Extended Kalman Filter configurations are investigated and implemented in ROS (Robot Operating System), and it is examined how to merge data coming from multiple Intel RealSense T265 cameras. In addition, a new version of the quadcopter was built to test the sensor fusion algorithm and compare it with the current variant of the drone used in the Inspectrone project. The results show that the proposed filter is able to deal with faulty and malfunctioning sensors, keeping the drone aware of its position in space at all times. Moreover, the algorithm outperforms the relative positioning system employed in the initial version of the drone.
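As a rough illustration of the fusion idea described in the abstract (not the filter actually implemented in the thesis), the sketch below fuses position estimates from two independent odometry sources through sequential Kalman updates. The 6-state constant-velocity model, the noise values, and the measurement layout are assumptions made purely for this example; the full thesis uses an Extended Kalman Filter over the drone's complete pose.

```python
import numpy as np

# Minimal Kalman-style fusion of two independent VIO position estimates.
# All models and noise values here are illustrative assumptions, not the
# thesis' actual filter design.

class SimpleFusionFilter:
    def __init__(self):
        self.x = np.zeros(6)          # state: position (x, y, z) and velocity
        self.P = np.eye(6)            # state covariance
        self.Q = np.eye(6) * 0.01     # process noise (assumed)

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt    # constant-velocity motion model
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, R):
        """Fuse one 3-D position measurement z with covariance R."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])   # observe position only
        y = z - H @ self.x                             # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

# Each camera is handled as a separate measurement source: a faulty sensor
# can simply be skipped (or given a large R) without stopping the filter.
ekf = SimpleFusionFilter()
ekf.predict(dt=0.005)
ekf.update(z=np.array([1.0, 0.2, 0.5]), R=np.eye(3) * 0.02)  # camera 1
ekf.update(z=np.array([1.1, 0.2, 0.4]), R=np.eye(3) * 0.05)  # camera 2
print(ekf.x[:3])
```

Treating each camera as an independent update is what gives the fused estimate its robustness: if one source degrades, the remaining measurements still correct the prediction.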
UAV
Inspection
Localization
Safe navigation
Files in this item:
Giacomo_Trovò.pdf — open access — 57.59 MB — Adobe PDF

Use this identifier to cite or link to this item: https://hdl.handle.net/20.500.12608/31591