InFuse data fusion methodology for space robotics, awareness and machine learning

Post, Mark and Michalec, Romain and Bianco, Alessandro and Yan, Xiu and De Maio, Andrea and Labourey, Quentin and Lacroix, Simon and Gancet, Jeremi and Govindaraj, Shashank and Martinez-Gonzalez, Xavier and Domínguez, Raúl and Wehbe, Bilal and Fabich, Alexander and Souvannavong, Fabrice and Bissonnette, Vincent and Smíšek, Michal and Oumer, Nassir W. and Triebel, Rudolph and Márton, Zoltán-Csaba (2018) InFuse data fusion methodology for space robotics, awareness and machine learning. In: 69th International Astronautical Congress, 2018-10-01 - 2018-10-05, Messe Bremen Findorffstraße.

Text. Filename: Post_etal_IAC_2018_InFuse_data_fusion_methodology_for_space_robotics.pdf
Accepted Author Manuscript
License: Strathprints license 1.0


Abstract

Autonomous space vehicles such as orbital servicing satellites and planetary exploration rovers must be comprehensively aware of their environment in order to make appropriate decisions. Multi-sensor data fusion plays a vital role in providing these autonomous systems with sensory information of different types, from different locations, and at different times. The InFuse project, funded by the European Commission's Horizon 2020 Strategic Research Cluster in Space Robotics, provides the space community with an open-source Common Data Fusion Framework (CDFF) by which data from multiple sensors may be fused in a modular fashion. In this paper, we summarize the modular structure of this CDFF and show how it is used to process sensor data into data products for both planetary and orbital space robotic applications. Data from field testing with multiple sensors, including inertial measurement units, stereo vision, and scanning laser rangefinders, are first used to produce robust multi-layered environmental maps for path planning. This information is registered and fused within the CDFF to produce comprehensive three-dimensional maps of the environment. To further explore the potential of the CDFF, we illustrate several of its applications that have been evaluated for orbital and planetary use cases in environmental reconstruction, mapping, navigation, and visual tracking. Algorithms for map learning, outlier detection, localization, and object identification are available within the CDFF, and early results from their use in space analogue scenarios are presented. These applications show how the CDFF can provide a wide variety of data products for use by awareness and machine learning processes in space robots.
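The modular pipeline the abstract describes, in which independent processing nodes turn raw sensor samples into data products such as poses and maps, can be sketched as follows. This is an illustrative toy in Python only: the actual CDFF is not summarized here, and all class names (`FusionNode`, `PoseEstimator`, `MapBuilder`) and the dead-reckoning and occupancy logic are hypothetical stand-ins for real fusion nodes.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    """A timestamped measurement from one sensor."""
    sensor: str
    timestamp: float
    data: dict

class FusionNode:
    """Base class for a modular processing node: samples in, product out."""
    def process(self, samples: List[Sample]) -> dict:
        raise NotImplementedError

class PoseEstimator(FusionNode):
    """Integrates IMU velocity samples into a coarse 1-D pose
    (toy dead reckoning; a real node would fuse full 6-DoF state)."""
    def process(self, samples):
        x = 0.0
        for s in sorted(samples, key=lambda s: s.timestamp):
            x += s.data["vx"] * s.data["dt"]
        return {"pose_x": x}

class MapBuilder(FusionNode):
    """Registers range points into the pose frame and bins them
    into a 1-D occupancy layer keyed by grid cell."""
    def __init__(self, cell_size: float = 1.0):
        self.cell_size = cell_size

    def process(self, samples, pose_x: float = 0.0):
        cells: dict = {}
        for s in samples:
            for p in s.data["points"]:
                cell = int((p + pose_x) // self.cell_size)
                cells[cell] = cells.get(cell, 0) + 1
        return {"occupancy": cells}

def run_pipeline(samples: List[Sample]) -> dict:
    """Chain the nodes: estimate the pose first, then build the
    map layer in that estimated frame."""
    imu = [s for s in samples if s.sensor == "imu"]
    lidar = [s for s in samples if s.sensor == "lidar"]
    pose = PoseEstimator().process(imu)
    grid = MapBuilder().process(lidar, pose_x=pose["pose_x"])
    return {**pose, **grid}

samples = [
    Sample("imu", 0.0, {"vx": 1.0, "dt": 0.5}),
    Sample("imu", 0.5, {"vx": 1.0, "dt": 0.5}),
    Sample("lidar", 1.0, {"points": [0.2, 0.4, 2.3]}),
]
print(run_pipeline(samples))  # → {'pose_x': 1.0, 'occupancy': {1: 2, 3: 1}}
```

The point of the modular structure is that each node exposes only a `process` interface, so nodes can be swapped or recombined per mission without touching the rest of the pipeline, which is the flexibility the CDFF's modular design aims at.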