A novel visual attention method for target detection from SAR images

Gao, Fei and Liu, Aidong and Liu, Kai and Yang, Erfu and Hussain, Amir (2019) A novel visual attention method for target detection from SAR images. Chinese Journal of Aeronautics, 32 (8). pp. 1946-1958. ISSN 1000-9361 (https://doi.org/10.1016/j.cja.2019.03.021)

Text. Filename: Gao_etal_CJA_2019_A_novel_visual_attention_method_for_target_detection.pdf
Accepted Author Manuscript
License: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0



Synthetic Aperture Radar (SAR) imaging systems are widely used in both civil and military fields, owing to their all-weather, day-and-night operating capability and other advantages. However, because the volume of SAR image data is growing exponentially, novel automatic target detection and recognition technologies are needed. The visual attention mechanism enables humans to deal effectively with complex visual signals, and biologically inspired top-down attention models in particular have attracted considerable interest in recent years. This paper presents a visual attention model for SAR target detection comprising a bottom-up stage and a top-down process. In the bottom-up stage, the Itti model is improved to account for the differences between SAR and optical images. The top-down stage then exploits prior information to refine the detection. Extensive experiments on the benchmark Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset show that, compared with typical visual attention models and other popular detection methods, the proposed model is more accurate and more robust for SAR target detection across a range of Signal-to-Clutter Ratio (SCR) conditions and scenes. Moreover, results obtained using only the bottom-up stage are inferior to those of the full method, further demonstrating the effectiveness and rationality of the top-down strategy. In summary, the proposed visual attention method can serve as a potential benchmark resource for the SAR research community.
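The paper itself does not include code, but the bottom-up stage it builds on (an Itti-style center-surround saliency computation) can be illustrated with a minimal sketch. The sketch below is an assumption-laden simplification, not the authors' improved model: it computes only an intensity conspicuity map by taking absolute differences between a fine-scale ("center") Gaussian-blurred image and several coarse-scale ("surround") blurs, then normalizes the result. The function names (`gaussian_blur`, `center_surround_saliency`) and the choice of scales are illustrative, not from the paper.

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable 1-D Gaussian convolution applied along rows, then columns.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()
    blurred = np.apply_along_axis(lambda v: np.convolve(v, kernel, mode="same"), 0, img)
    blurred = np.apply_along_axis(lambda v: np.convolve(v, kernel, mode="same"), 1, blurred)
    return blurred

def center_surround_saliency(img, surround_sigmas=(2.0, 4.0, 8.0)):
    # Itti-style intensity conspicuity: |center - surround| summed over scales.
    # (Illustrative scales; the actual model uses dyadic Gaussian pyramids.)
    center = gaussian_blur(img, 1.0)
    saliency = sum(np.abs(center - gaussian_blur(img, s)) for s in surround_sigmas)
    # Normalize to [0, 1] so maps from different channels could be combined.
    saliency -= saliency.min()
    if saliency.max() > 0:
        saliency /= saliency.max()
    return saliency
```

A bright compact target on a dark background should dominate the resulting map, since its fine-scale response differs most from its coarse-scale surround; a top-down stage would then use prior knowledge (e.g. expected target size) to prune false alarms from this map.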