Event-driven 3D reconstruction of small celestial bodies in a simulated laboratory environment

Zanin, Matteo
2024/2025

Abstract

This thesis investigates pose estimation algorithms for event cameras. Event cameras are innovative devices that detect motion faster than the frame rate of traditional video cameras, and computer vision algorithms use this new visual input to estimate trajectories in terrestrial and space applications. In particular, this work addresses pose estimation in a simulated deep-space navigation scenario, namely a spacecraft’s circular orbit around a comet. From the simulation, the code must detect features, track their evolution, and establish how the camera moves without human intervention. Two main procedures are implemented to achieve this goal, one based entirely on event data and the other combining traditional frames with event streams. Together with the results, a complete analysis of the algorithms is provided: starting from the raw data captured from the scene, this thesis describes how computer vision determines the relative motion between the camera and the features and how it represents the scene in three-dimensional coordinates.
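For readers unfamiliar with this kind of pipeline, a minimal Python sketch of an event-only pose estimation step might look as follows, assuming events arrive as (x, y, t, polarity) tuples, that the camera intrinsic matrix K is known, and that OpenCV is used for feature tracking and two-view geometry; the function names and parameter values are hypothetical and are not taken from the thesis.

import numpy as np
import cv2

def events_to_frame(events, width, height):
    # Accumulate (x, y, t, polarity) events into a normalized 8-bit count image.
    frame = np.zeros((height, width), dtype=np.float32)
    np.add.at(frame, (events[:, 1].astype(int), events[:, 0].astype(int)), 1.0)
    frame = cv2.normalize(frame, None, 0, 255, cv2.NORM_MINMAX)
    return frame.astype(np.uint8)

def relative_pose(frame_prev, frame_next, K):
    # Detect corners in the previous accumulation, track them with Lucas-Kanade,
    # and recover the relative rotation and unit-scale translation direction.
    pts_prev = cv2.goodFeaturesToTrack(frame_prev, maxCorners=300, qualityLevel=0.01, minDistance=7)
    pts_next, status, _ = cv2.calcOpticalFlowPyrLK(frame_prev, frame_next, pts_prev, None)
    good = status.ravel() == 1
    p1, p2 = pts_prev[good], pts_next[good]
    E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
    return R, t

In the hybrid procedure mentioned in the abstract, the same two-view geometry step could instead be fed with features detected on conventional frames and tracked through the event stream; how the thesis actually combines the two modalities is described in the full text.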
Keywords: Event Cameras, Feature Tracking, Pose Estimation
Files in this item:
File: Zanin_Matteo.pdf (open access)
Size: 5.54 MB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/94282