A novel semi-supervised convolutional neural network method for synthetic aperture radar image recognition

Yue, Zhenyu and Gao, Fei and Xiong, Qingxu and Wang, Jun and Huang, Teng and Yang, Erfu and Zhou, Huiyu (2019) A novel semi-supervised convolutional neural network method for synthetic aperture radar image recognition. Cognitive Computation. pp. 1-12. ISSN 1866-9964 (https://doi.org/10.1007/s12559-019-09639-x)


Abstract

Synthetic aperture radar (SAR) automatic target recognition (ATR) technology is one of the research hotspots in the field of image cognitive learning. Inspired by the human cognitive process, experts have designed convolutional neural network (CNN)-based SAR ATR methods. However, the performance of CNNs deteriorates significantly when labeled samples are insufficient. To effectively utilize unlabeled samples, we present a novel semi-supervised CNN method. In the training process of our method, the information contained in the unlabeled samples is integrated into the loss function of the CNN. Specifically, we first use the CNN to obtain class probabilities for the unlabeled samples. Thresholding is then applied to these class probabilities to improve the reliability of the unlabeled samples. Afterward, the optimized class probabilities are used to calculate the scatter matrices of the linear discriminant analysis (LDA) method. Finally, the loss function of the CNN is modified using these scatter matrices. We choose ten types of targets from the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset. The experimental results show that the recognition accuracy of our method is significantly higher than that of other semi-supervised methods, demonstrating that our method can effectively improve SAR ATR accuracy when labeled samples are insufficient.
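The pipeline described in the abstract (predict class probabilities for unlabeled samples, threshold them, compute probability-weighted LDA scatter matrices, and add an LDA-based term to the CNN loss) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the threshold value `tau`, the weight `lam`, the hardening of confident predictions to one-hot, and the specific trace-ratio form of the LDA term are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax over class logits."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def threshold_probs(probs, tau=0.8):
    """Thresholding step (assumed form): keep only unlabeled samples whose
    maximum class probability reaches tau, hardened to one-hot; zero out
    the rest so unreliable samples do not enter the scatter matrices."""
    out = np.zeros_like(probs)
    confident = probs.max(axis=1) >= tau
    out[confident, probs[confident].argmax(axis=1)] = 1.0
    return out

def lda_scatter(features, weights):
    """Probability-weighted within-class (Sw) and between-class (Sb)
    scatter matrices over CNN feature vectors."""
    n, d = features.shape
    k = weights.shape[1]
    mu = features.mean(axis=0)          # global mean
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in range(k):
        w = weights[:, c]
        nc = w.sum()
        if nc == 0:
            continue
        mu_c = (w[:, None] * features).sum(axis=0) / nc   # class mean
        diff = features - mu_c
        Sw += (w[:, None] * diff).T @ diff                # within-class
        db = (mu_c - mu)[:, None]
        Sb += nc * (db @ db.T)                            # between-class
    return Sw, Sb

def semi_supervised_loss(logits_lab, y_lab, feats_unlab, logits_unlab,
                         lam=0.1, tau=0.8):
    """Supervised cross-entropy plus an LDA regularizer on unlabeled data
    (minimize within-class scatter relative to between-class scatter)."""
    p = softmax(logits_lab)
    ce = -np.log(p[np.arange(len(y_lab)), y_lab] + 1e-12).mean()
    soft = threshold_probs(softmax(logits_unlab), tau)
    Sw, Sb = lda_scatter(feats_unlab, soft)
    lda_term = np.trace(Sw) / (np.trace(Sb) + 1e-12)
    return ce + lam * lda_term
```

In a full training loop this scalar would be backpropagated through the CNN's feature extractor (e.g. in PyTorch or TensorFlow); the NumPy version above only shows how the loss value is assembled from the two sample pools.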