Visual pose estimation system for autonomous rendezvous of spacecraft

Post, Mark A. and Yan, Xiu T. and Li, Junquan and Clark, Craig (2015) Visual pose estimation system for autonomous rendezvous of spacecraft. In: 13th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA 2015), 2015-05-11 - 2015-05-13.

Abstract

In this work, a tracker spacecraft equipped with a short-range vision system is tasked with visually identifying a target spacecraft and determining its relative angular and linear velocities using only visual information from onboard cameras. Focusing on methods that are feasible for implementation on relatively simple spacecraft hardware, we locate and track objects in three-dimensional space using conventional high-resolution cameras, saving cost and power compared to laser or infrared ranging systems. The target is identified by means of visual feature detection and tracking across rapid, successive frames, taking the perspective matrix of the camera system into account and building feature maps in three dimensions over time. Features detected in two-dimensional images are matched and triangulated to provide three-dimensional feature maps using structure-from-motion techniques. This methodology allows one, two, or more cameras with known baselines to be used for triangulation, with more images resulting in higher accuracy. Triangulated points are organized using orientation histogram descriptors and used to identify and track parts of the target spacecraft over time, which allows some estimation of the target spacecraft's motion even when parts of it are obscured or in shadow. The state variables with respect to the camera system are extracted as a relative rotation quaternion and relative translation vector for the target. Robust tracking of the state variables for the target spacecraft is accomplished by an embedded adaptive unscented Kalman filter. In addition to estimating the target quaternion from visual information, the adaptive filter can identify when tracking errors have occurred by monitoring the measurement residual. Significant variations in lighting can be tolerated as long as the movement of the satellite is consistent with the system model and illumination changes slowly enough for the state variables to be estimated periodically. Inertial measurements over short periods of time can then be used to determine the movement of both the tracker and target spacecraft. In addition, with a sufficient number of features tracked, the center of mass of the target can be located. This method is tested using laboratory images of spacecraft movement together with a simulated spacecraft motion model. Varying conditions are applied to demonstrate the effectiveness and limitations of the system for online estimation of the movement of a target spacecraft at close range.
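
As an illustration of the triangulation step described above, the following Python sketch (an assumption on our part, not code from the paper) matches features between two calibrated views with a known baseline and triangulates them into a three-dimensional feature map using OpenCV; ORB stands in here for whichever feature detector the authors use.

import numpy as np
import cv2

def triangulate_features(img_left, img_right, P_left, P_right):
    # P_left, P_right: 3x4 projection matrices with the camera
    # intrinsics and the known stereo baseline already folded in.
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)

    # Brute-force Hamming matching with a ratio test to discard
    # ambiguous correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good]).T  # 2xN
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good]).T  # 2xN

    # Linear triangulation returns 4xN homogeneous points; dividing
    # by the last row gives the Euclidean 3-D feature map.
    pts4d = cv2.triangulatePoints(P_left, P_right, pts1, pts2)
    return (pts4d[:3] / pts4d[3]).T  # Nx3

With a single camera, the same triangulation applies between successive frames once the inter-frame motion is estimated, which is the structure-from-motion case mentioned in the abstract.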
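
The exact form of the orientation histogram descriptors is not given in the abstract; one plausible form (purely our assumption) bins the directions of triangulated points around a cluster center, giving a rotation-sensitive signature by which a cluster can be re-identified between frames:

def orientation_histogram(points, center, n_bins=16):
    # Describe an Nx3 point cluster by histograms of the azimuth and
    # elevation angles of points relative to the cluster center.
    d = points - center
    d = d / (np.linalg.norm(d, axis=1, keepdims=True) + 1e-9)
    az = np.arctan2(d[:, 1], d[:, 0])        # azimuth in [-pi, pi]
    el = np.arcsin(np.clip(d[:, 2], -1, 1))  # elevation in [-pi/2, pi/2]
    h_az, _ = np.histogram(az, bins=n_bins, range=(-np.pi, np.pi))
    h_el, _ = np.histogram(el, bins=n_bins, range=(-np.pi / 2, np.pi / 2))
    h = np.concatenate([h_az, h_el]).astype(float)
    return h / (np.linalg.norm(h) + 1e-9)    # unit-norm descriptor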
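
For the adaptive unscented Kalman filter, the sketch below (again an assumption, using the filterpy library and a simplified constant-velocity translation state; the paper's filter also tracks the relative quaternion and adapts its noise statistics, which is omitted here) shows the kind of residual check that can flag tracking errors.

import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.1  # assumed frame interval in seconds

def fx(x, dt):
    # Constant-velocity motion model over state [px, py, pz, vx, vy, vz].
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)
    return F @ x

def hx(x):
    # The vision system measures relative position only.
    return x[:3]

points = MerweScaledSigmaPoints(n=6, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=3, dt=dt, fx=fx, hx=hx,
                            points=points)
ukf.R = np.eye(3) * 1e-2  # measurement noise covariance (assumed)
ukf.Q = np.eye(6) * 1e-4  # process noise covariance (assumed)

def track(z):
    # One predict/update cycle per triangulated measurement z (3-vector).
    ukf.predict()
    ukf.update(z)
    # Normalized innovation squared: a large value means the residual is
    # inconsistent with the motion model, i.e. a likely tracking error.
    nis = float(ukf.y @ np.linalg.solve(ukf.S, ukf.y))
    return ukf.x.copy(), nis > 7.81  # chi-square 95% bound for 3 DOF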

ORCID iDs

Post, Mark A. ORCID: https://orcid.org/0000-0002-1925-7039; Yan, Xiu T. ORCID: https://orcid.org/0000-0002-3798-7414; Li, Junquan; Clark, Craig