Surface inclination estimation for slope-aware quadrotor landing via RGB–ToF sensor fusion.
CALÀ LESINA, PATRICK
2024/2025
Abstract
This thesis presents the development of an RGB–ToF perception pipeline for real-time estimation of the inclination of the landing surface beneath a quadrotor. The system integrates a multi-zone Time-of-Flight (ToF) sensor, which provides metric depth perception, and a high-resolution RGB camera, which provides the photometric component of the scene. The two devices are rigidly mounted on a dedicated support, designed in CAD and fabricated via 3D printing, and underwent a complete calibration procedure: intrinsic calibration of the RGB camera and extrinsic calibration between the camera and the ToF sensor. Data acquisition and processing run on a Raspberry Pi 4, a platform with limited computational resources; for this reason, the entire pipeline, developed in ROS 2, was optimized for real-time operation at approximately 7 Hz. The ROS 2 node chain generates a dense XYZRGB point cloud, expressed in the reference frame of the RGB camera and enriched with the intensity of the rectified image. This point cloud is obtained through bilinear interpolation of the ToF measurements over a virtual 32×24 grid; this reduces the variance of the interpolated depth, making the metric component more stable than the raw point cloud produced by the ToF sensor alone. On top of the interpolated point cloud and the raw ToF point cloud, a plane-estimation system was developed, consisting of three nodes: two independent estimators and one correction estimator. The two main estimators compute the plane normal, the roll and pitch angles, and the sensor-to-pad distance. The estimator based on the interpolated point cloud overcomes the intrinsic limitations of the ToF sensor, enabling reliable estimation of the plane inclination up to a distance of approximately 2.3 m. The correction estimator adaptively combines the RGB-D and ToF estimates according to their statistical stability and the sensor-to-pad distance, dynamically selecting the most reliable source at each instant. The pipeline was experimentally validated with a 45×35 cm landing pad placed at different sensor-to-pad distances and under varying environmental conditions: low, medium, or high illumination, with and without background objects. The results confirm the reliability and robustness of the system, which continuously and accurately estimates the inclination of the surface beneath the drone while complying with the computational constraints imposed by the embedded hardware.
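To make the densification step concrete, the following is a minimal sketch (in Python with NumPy, not the thesis code) of how a coarse multi-zone ToF depth map could be bilinearly interpolated onto the virtual 32×24 grid and back-projected into an XYZ cloud using the RGB camera intrinsics. The 8×8 input resolution, the function names, the intrinsic values, and the omission of the ToF-to-camera extrinsic transform are assumptions for illustration only.

```python
# Hypothetical sketch of the depth-densification step (not the thesis code):
# bilinearly upsample a coarse ToF depth map (assumed 8x8 zones here) onto a
# virtual 32x24 grid, then back-project each cell to XYZ with camera intrinsics.
import numpy as np

def densify_depth(tof_depth, out_w=32, out_h=24):
    """Bilinear upsampling of a coarse depth map onto an out_h x out_w grid."""
    in_h, in_w = tof_depth.shape
    # Sample positions of the virtual grid expressed in input-grid coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    wx = (xs - x0)[None, :]; wy = (ys - y0)[:, None]
    top = tof_depth[y0[:, None], x0[None, :]] * (1 - wx) + tof_depth[y0[:, None], x1[None, :]] * wx
    bot = tof_depth[y1[:, None], x0[None, :]] * (1 - wx) + tof_depth[y1[:, None], x1[None, :]] * wx
    return top * (1 - wy) + bot * wy

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth map into an (H*W, 3) XYZ cloud in the camera frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Usage with a synthetic 8x8 frame and illustrative intrinsics scaled to the
# 32x24 grid (hypothetical values).
dense = densify_depth(np.full((8, 8), 1.5))
cloud = backproject(dense, fx=38.0, fy=38.0, cx=16.0, cy=12.0)
print(dense.shape, cloud.shape)  # (24, 32) (768, 3)
```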
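The plane-estimation step can likewise be illustrated with a least-squares fit: the normal of the best-fitting plane is the singular vector of the centered cloud associated with the smallest singular value, from which roll, pitch, and the sensor-to-pad distance follow. The sketch below is a hypothetical illustration under an assumed camera convention (Z forward, X right, Y down), not the estimator implemented in the thesis.

```python
# Hypothetical sketch (not the thesis code): fit a plane to a point cloud by
# SVD and derive the surface normal, roll/pitch angles, and plane distance.
import numpy as np

def fit_plane(points):
    """Fit a plane to an (N, 3) array of XYZ points in the camera frame.

    Returns the unit normal n and the offset d such that n . p + d = 0
    for points p on the plane.
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered cloud is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    # Orient the normal towards the sensor (camera looks along +Z).
    if normal[2] > 0:
        normal = -normal
    d = -normal.dot(centroid)
    return normal, d

def roll_pitch_from_normal(normal):
    """Roll/pitch of the surface relative to the camera optical axis.

    Assumed convention: camera Z forward, X right, Y down; a surface
    perpendicular to the optical axis gives roll = pitch = 0.
    """
    nx, ny, nz = normal
    roll = np.arctan2(ny, -nz)   # rotation about the camera X axis
    pitch = np.arctan2(nx, -nz)  # rotation about the camera Y axis
    return roll, pitch

# Example: a synthetic plane tilted by ~10 degrees at ~1.5 m with 5 mm noise.
rng = np.random.default_rng(0)
xy = rng.uniform(-0.2, 0.2, size=(500, 2))
z = 1.5 + np.tan(np.radians(10.0)) * xy[:, 0] + rng.normal(0, 0.005, 500)
n, d = fit_plane(np.column_stack([xy, z]))
roll, pitch = roll_pitch_from_normal(n)
print(np.degrees(roll), np.degrees(pitch), abs(d))  # ~0 deg, ~10 deg, ~1.48 m
```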
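Finally, the correction estimator is described as combining the RGB-D and ToF estimates according to their statistical stability and the sensor-to-pad distance. One plausible way to realise such a rule is an inverse-variance weight blended with a distance-dependent weight, as sketched below; the weighting scheme and the near/far thresholds are illustrative assumptions, not the values used in the thesis.

```python
# Hypothetical sketch of a correction rule: blend the RGB-D and raw-ToF
# inclination estimates by their recent variance ("statistical stability")
# and the measured sensor-to-pad distance, preferring ToF at short range and
# the interpolated RGB-D estimate farther away. Thresholds are illustrative.
import numpy as np

def fuse_estimates(angles_rgbd, angles_tof, var_rgbd, var_tof, distance,
                   near=0.8, far=1.6):
    """Return fused (roll, pitch) as a convex combination of the two sources.

    angles_*: (roll, pitch) tuples in radians
    var_*:    scalar variance of each source's recent estimates
    distance: current sensor-to-pad distance in metres
    """
    eps = 1e-9
    # Inverse-variance weighting captures the stability of each estimator.
    w_stab = (1.0 / (var_rgbd + eps)) / (1.0 / (var_rgbd + eps) + 1.0 / (var_tof + eps))
    # Distance term: 0 -> trust ToF (close range), 1 -> trust RGB-D (far range).
    w_dist = np.clip((distance - near) / (far - near), 0.0, 1.0)
    # Combine the two criteria; w is the weight given to the RGB-D estimate.
    w = 0.5 * (w_stab + w_dist)
    return tuple(w * np.asarray(angles_rgbd) + (1.0 - w) * np.asarray(angles_tof))

roll, pitch = fuse_estimates((0.05, 0.17), (0.04, 0.18),
                             var_rgbd=1e-4, var_tof=4e-4, distance=1.2)
print(np.degrees(roll), np.degrees(pitch))
```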
File: CalàLesina_Patrick.pdf | Open access | 6.45 MB | Adobe PDF
https://hdl.handle.net/20.500.12608/99742