Deep Learning-Optimized Monocular Navigation for Autonomous Rendezvous and Proximity Maneuvers in Small Satellite Missions / Lovaglio, Lucrezia; Stesina, Fabrizio. (2025), pp. 459-464. (Paper presented at the 2025 IEEE 12th International Workshop on Metrology for AeroSpace, held in Naples (Italy), 18-20 June 2025) [DOI: 10.1109/MetroAeroSpace64938.2025.11114522].

Deep Learning-Optimized Monocular Navigation for Autonomous Rendezvous and Proximity Maneuvers in Small Satellite Missions

Lovaglio, Lucrezia; Stesina, Fabrizio
2025

Abstract

Accurate estimation of the position and orientation of a spacecraft during proximity operations, such as rendezvous, docking, on-orbit servicing (OOS), and active debris removal (ADR), is critical to ensuring mission success and safety. Traditional visual navigation methods based on hand-engineered feature matching often struggle with robustness and generalization, while existing deep learning approaches face limitations due to heuristic hyperparameter tuning and limited training data. In this work, a novel convolutional neural network (CNN)-based architecture for monocular pose estimation of non-cooperative spacecraft is proposed, specifically designed to improve robustness across diverse operational scenarios. The model is trained on a high-fidelity synthetic dataset comprising approximately 25,000 images, simulating realistic proximity conditions with variations in lighting, background textures, and spacecraft geometries. To assess its performance, an extensive benchmarking study is conducted against representative state-of-the-art methods using standardized evaluation metrics and controlled test conditions. The results demonstrate the competitive performance of the proposed method and provide critical insights into the factors affecting pose estimation accuracy in realistic spaceborne applications.
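Since only the abstract is available here, the core technique it describes, regressing a 6-DoF pose (a translation vector plus an attitude quaternion) from a single monocular image with a CNN, can be illustrated with a minimal sketch. The network below is not the authors' architecture: the backbone layout, the head sizes, and the names PoseRegressor, translation_error, and orientation_error are illustrative assumptions. The two metric functions implement the Euclidean translation error and geodesic orientation error commonly used as standardized evaluation metrics in spacecraft pose estimation benchmarks.

# Minimal sketch of CNN-based monocular pose regression and the standard
# evaluation metrics; an illustrative assumption, NOT the paper's model.
import torch
import torch.nn as nn


class PoseRegressor(nn.Module):
    """Regresses a 3-D translation vector and a unit quaternion from one image."""

    def __init__(self):
        super().__init__()
        # Small convolutional backbone (assumed; any CNN backbone fits here).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.fc_t = nn.Linear(128, 3)  # translation head [m]
        self.fc_q = nn.Linear(128, 4)  # quaternion head (normalized below)

    def forward(self, x):
        feat = self.backbone(x)
        t = self.fc_t(feat)
        # Project raw output onto the unit sphere so q is a valid rotation.
        q = nn.functional.normalize(self.fc_q(feat), dim=-1)
        return t, q


def translation_error(t_est, t_gt):
    """Euclidean distance between estimated and true position [m]."""
    return torch.norm(t_est - t_gt, dim=-1)


def orientation_error(q_est, q_gt):
    """Geodesic angle between estimated and true attitude [rad]."""
    dot = torch.sum(q_est * q_gt, dim=-1).abs().clamp(max=1.0)
    return 2.0 * torch.acos(dot)


if __name__ == "__main__":
    model = PoseRegressor()
    img = torch.randn(1, 3, 224, 224)          # dummy monocular frame
    t_hat, q_hat = model(img)
    t_true = torch.tensor([[0.0, 0.0, 10.0]])  # hypothetical target 10 m ahead
    q_true = torch.tensor([[1.0, 0.0, 0.0, 0.0]])
    print("E_t [m]:  ", translation_error(t_hat, t_true).item())
    print("E_q [rad]:", orientation_error(q_hat, q_true).item())

The separate translation and orientation errors above are the usual ingredients of combined pose scores in spacecraft pose estimation challenges; how the paper weighs or aggregates them is not stated in the abstract.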
ISBN: 979-8-3315-0152-5
Files in this record:

IEEE_mfa_Loaglio_final.pdf
Access: open access
Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 1.58 MB
Format: Adobe PDF

Deep_Learning-Optimized_Monocular_Navigation_for_Autonomous_Rendezvous_and_Proximity_Maneuvers_in_Small_Satellite_Missions.pdf
Access: restricted access
Type: 2a. Post-print, publisher's version / Version of Record
License: Non-public - Private/restricted access
Size: 1.8 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/3002697