Segmentation-driven spacecraft pose estimation for vision-based relative navigation in space
Kajak, Karl Martin and Maddock, Christie and Frei, Heike and Schwenk, Kurt; (2021) Segmentation-driven spacecraft pose estimation for vision-based relative navigation in space. In: IAF Astrodynamics Symposium 2021. Proceedings of the International Astronautical Congress, IAC . International Astronautical Federation (IAF), ARE. ISBN 9781713843078
Abstract
Vision-based relative navigation technology is a key enabler of several areas of the space industry, such as on-orbit servicing, space debris removal, and formation flying. A particularly demanding scenario is navigating relative to a non-cooperative target that offers no navigational aids and cannot stabilize its attitude. The state of the art in vision-based relative navigation has previously relied on image processing and template matching techniques. Outside the space industry, however, object pose estimation is dominated by convolutional neural networks (CNNs), owing to their flexibility towards arbitrary pose estimation targets, their ability to exploit whatever target features are available, and their robustness to varied lighting conditions, target damage, occlusions, and other effects that can interfere with the image. The use of CNNs for visual relative navigation is still relatively unexplored in terms of how their unique advantages can best be exploited. This research integrates a state-of-the-art CNN-based pose estimation architecture into a relative navigation system. The system's navigation performance is benchmarked on realistic images gathered from the European Proximity Operations Simulator 2.0 (EPOS 2.0) robotic hardware-in-the-loop laboratory. A synthetic dataset is generated using Blender as the rendering engine, a segmentation-based 6D pose estimation CNN is trained on this dataset, and the resulting pose estimation performance is evaluated on real images gathered from the cameras of the EPOS 2.0 robotic close-range relative navigation laboratory. It is demonstrated that a CNN-based pose estimation pipeline trained on synthetic images can perform successfully in a close-range visual navigation setting on real camera images of a spacecraft, though with some limitations that must still be overcome before the system is ready for operation.
Furthermore, it does so with a symmetric target, which commonly poses difficulties for neural networks in a pose estimation setting.
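The record does not include code, but the final geometric step of a segmentation-driven pose pipeline can be illustrated. Once the CNN has predicted the 2D image locations of known 3D keypoints on the target, a Perspective-n-Point (PnP) solve recovers the 6D pose. Below is a minimal, self-contained numpy sketch of a Direct Linear Transform (DLT) based PnP, not the paper's actual implementation: the keypoints, camera intrinsics, and ground-truth pose in the demo are invented, and a real system would use a robust solver (e.g. RANSAC-based PnP) on noisy CNN predictions.

```python
import numpy as np

def dlt_pnp(pts3d, pts2d, K):
    """Recover camera pose (R, t) from >= 6 non-coplanar 2D-3D
    correspondences via the Direct Linear Transform (noise-free sketch)."""
    # Build the 2n x 12 homogeneous system A p = 0 for the 3x4 projection P.
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.array(rows)
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    P = Vt[-1].reshape(3, 4)
    # Strip the intrinsics; fix the unknown scale and sign of the solution.
    M = np.linalg.inv(K) @ P
    M /= np.linalg.norm(M[2, :3])       # third row of a rotation has unit norm
    if np.linalg.det(M[:, :3]) < 0:     # resolve the +/- ambiguity
        M = -M
    # Project the rotation part onto SO(3) to remove numerical drift.
    U, _, Vt = np.linalg.svd(M[:, :3])
    return U @ Vt, M[:, 3]

# Demo: 8 cube-corner "keypoints" standing in for the CNN's predictions.
pts3d = np.array([[x, y, z] for x in (-1, 1)
                            for y in (-1, 1)
                            for z in (-1, 1)], dtype=float)
K = np.array([[800.0, 0.0, 320.0],      # hypothetical pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
a = 0.3                                  # hypothetical ground-truth pose
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([0.1, -0.2, 6.0])
cam = (K @ (R_true @ pts3d.T + t_true[:, None])).T
pts2d = cam[:, :2] / cam[:, 2:]          # perspective division to pixels
R_est, t_est = dlt_pnp(pts3d, pts2d, K)
```

With exact correspondences the recovered pose matches the ground truth to numerical precision; the segmentation-driven formulation's contribution is upstream of this step, in producing reliable per-cell keypoint predictions even under occlusion.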
ORCID iDs
Kajak, Karl Martin; Maddock, Christie (ORCID: https://orcid.org/0000-0003-1079-4863); Frei, Heike; Schwenk, Kurt
Item type: Book Section
ID code: 78520
Dates: 30 April 2021 (Accepted); 25 October 2021 (Published)
Notes: Paper code IAC-21,C1,1,4,x66766
Subjects: Technology > Motor vehicles. Aeronautics. Astronautics
Department: Faculty of Engineering > Mechanical and Aerospace Engineering; Strategic Research Themes > Ocean, Air and Space
Depositing user: Pure Administrator
Date deposited: 11 Nov 2021 12:16
Last modified: 11 Nov 2024 15:32
URI: https://strathprints.strath.ac.uk/id/eprint/78520