Unmanned aerial vehicle visual Simultaneous Localization and Mapping: a survey

Tian, Y and Yue, H and Yang, B and Ren, J (2022) Unmanned aerial vehicle visual Simultaneous Localization and Mapping: a survey. Journal of Physics: Conference Series, 2278. 012006. ISSN 1742-6596 (https://doi.org/10.1088/1742-6596/2278/1/012006)

Final Published Version. License: Creative Commons Attribution 3.0


Simultaneous Localization and Mapping (SLAM) has been widely applied in robotics and other vision applications, such as navigation and path planning for unmanned aerial vehicles (UAVs). UAV navigation can be regarded as the process of planning a route so that the robot reaches its target location safely and quickly. To complete a predetermined task, the drone must fully know its own state, including position, speed, heading, starting point, and target position. With the rapid development of computer vision technology, vision-based navigation has become a powerful tool for autonomous navigation. A visual sensor provides a wealth of online environmental information, offers high sensitivity and strong anti-interference ability, and is well suited to perceiving dynamic environments. Moreover, most visual sensors are passive, which keeps the sensing system itself from being detected. Compared with traditional sensors such as the Global Positioning System (GPS), laser ranging (lidar), and ultrasonic sensors, visual SLAM can obtain rich visual information such as color, texture, and depth. This paper surveys the development of techniques relevant to UAVs: visual SLAM, visual odometry, image stabilization, and image denoising. By analyzing existing developments, some future perspectives are briefly outlined.
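The geometric core of the visual odometry mentioned above is relative-pose estimation between two camera views. As a minimal, self-contained illustration (not taken from the paper), the sketch below uses the classical eight-point algorithm on synthetic, noise-free correspondences: two simulated camera poses observe random 3D landmarks, the essential matrix is recovered linearly, and the epipolar constraint is verified. All names and the simulated motion are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated UAV motion: camera 1 at the origin, camera 2 rotated about
# the y-axis by 0.1 rad and translated by t (assumed values).
theta = 0.1
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0,           1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.0])

# Random 3D landmarks in front of both cameras.
X1 = rng.uniform([-2, -2, 4], [2, 2, 8], size=(20, 3))
X2 = (R @ X1.T).T + t

# Normalized image coordinates (intrinsics = identity), homogeneous form.
x1 = X1 / X1[:, 2:3]
x2 = X2 / X2[:, 2:3]

# Eight-point algorithm: each correspondence gives one linear constraint
# on vec(E), since x2^T E x1 = kron(x2, x1) . vec(E) (row-major layout).
A = np.array([np.kron(p2, p1) for p1, p2 in zip(x1, x2)])
_, _, Vt = np.linalg.svd(A)
E = Vt[-1].reshape(3, 3)

# Enforce the rank-2 constraint that a valid essential matrix must satisfy.
U, S, Vt2 = np.linalg.svd(E)
E = U @ np.diag([1.0, 1.0, 0.0]) @ Vt2

# Epipolar residuals x2^T E x1 should vanish for exact correspondences.
residuals = np.abs([p2 @ E @ p1 for p1, p2 in zip(x1, x2)])
print("max epipolar residual:", float(residuals.max()))
```

In a real visual-odometry front end the correspondences would come from feature matching between consecutive frames, the data would be noisy, and the estimate would be wrapped in RANSAC before decomposing E into rotation and translation; this sketch only shows the linear core of that step.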