
PVSPE: A pyramid vision multitask transformer network for spacecraft pose estimation.

Authors :
Yang, Hong
Xiao, Xueming
Yao, Meibao
Xiong, Yonggang
Cui, Hutao
Fu, Yuegang
Source :
Advances in Space Research. Aug2024, Vol. 74 Issue 3, p1327-1342. 16p.
Publication Year :
2024

Abstract

• A transformer-based multitask network is designed for spacecraft pose estimation.
• Matrix Fisher and Gaussian distributions are utilized to model the pose uncertainty.
• A robust neural network-based vision navigation pipeline is created.

Spacecraft pose estimation (SPE) plays a vital role in the relative navigation system for on-orbit servicing and active debris removal. Deep learning-based methods have made great progress on object pose estimation. However, in challenging onboard SPE missions, most existing Convolutional Neural Network (CNN) methods fail to capture long-range visual attention, reducing accuracy and robustness. In this paper, we present an end-to-end multi-task pyramid vision transformer SPE network (PVSPE) consisting of two novel feature extraction modules: EnhancedPVT (EnPVT) and SlimGFPN. The EnPVT module is designed to combine global spatial and channel attention, while the SlimGFPN module fuses features more effectively. Matrix Fisher and multivariate Gaussian distributions are further employed to model the uncertainty of pose regression and increase its accuracy. Extensive experiments are carried out on the challenging SPEED+ and SHIRT datasets to validate performance on pose estimation and vision-based navigation, respectively. The results show that the proposed PVSPE model achieves high accuracy for SPE on the SPEED+ dataset even under varying scales and severe illumination, demonstrating its robustness and strong generalization. Leveraging the uncertainty model of PVSPE, the vision-based navigation pipeline, combined with Kalman filters, accurately estimates the satellite pose in challenging rendezvous scenarios on the SHIRT dataset, with degree-level attitude errors and centimeter-level translation accuracy at steady state. [ABSTRACT FROM AUTHOR]
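
The abstract does not give the exact parameterizations of the uncertainty model; as a sketch, the standard Matrix Fisher distribution on SO(3) for attitude and the multivariate Gaussian for translation take the forms

p(R \mid F) = \frac{1}{c(F)} \exp\!\big(\operatorname{tr}(F^{\top} R)\big), \qquad R \in SO(3),

p(t \mid \mu, \Sigma) = (2\pi)^{-3/2}\, \lvert \Sigma \rvert^{-1/2} \exp\!\Big(-\tfrac{1}{2}(t-\mu)^{\top} \Sigma^{-1} (t-\mu)\Big), \qquad t \in \mathbb{R}^{3},

where F is the 3x3 parameter matrix regressed for the rotation, c(F) is its normalizing constant, and \mu, \Sigma are the predicted translation mean and covariance; how PVSPE parameterizes these output heads is detailed in the full text.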
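The abstract states only that the network's pose and uncertainty outputs are fused with Kalman filters for vision-based navigation; the exact filter design is given in the full text. A minimal Python/NumPy sketch of a measurement update that weights a hypothetical network translation output by its predicted Gaussian covariance (all names and numbers below are illustrative assumptions, not the authors' pipeline):

import numpy as np

def kalman_update(x_pred, P_pred, z_meas, R_meas):
    # Standard linear Kalman measurement update with an identity measurement
    # model: the filter state (relative translation) is measured directly by
    # the network, and R_meas is the covariance from its Gaussian head.
    n = len(x_pred)
    S = P_pred + R_meas                      # innovation covariance (H = I)
    K = P_pred @ np.linalg.inv(S)            # Kalman gain
    x_upd = x_pred + K @ (z_meas - x_pred)   # corrected state
    P_upd = (np.eye(n) - K) @ P_pred         # corrected covariance
    return x_upd, P_upd

# Hypothetical numbers for illustration only (not from the paper):
x_pred = np.array([10.0, 0.5, -0.2])         # predicted relative position [m]
P_pred = np.diag([0.04, 0.04, 0.09])         # prediction covariance
z_meas = np.array([9.9, 0.55, -0.18])        # network translation output [m]
R_meas = np.diag([0.01, 0.01, 0.02])         # network-predicted covariance
x_upd, P_upd = kalman_update(x_pred, P_pred, z_meas, R_meas)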

Details

Language :
English
ISSN :
02731177
Volume :
74
Issue :
3
Database :
Academic Search Index
Journal :
Advances in Space Research
Publication Type :
Academic Journal
Accession number :
177907717
Full Text :
https://doi.org/10.1016/j.asr.2024.05.011